WorldWideScience

Sample records for optimal experiment design

  1. Optimal design of isotope labeling experiments.

    Science.gov (United States)

    Yang, Hong; Mandy, Dominic E; Libourel, Igor G L

    2014-01-01

    Stable isotope labeling experiments (ILE) constitute a powerful methodology for estimating metabolic fluxes. An optimal label design for such an experiment is necessary to maximize the precision with which fluxes can be determined. But often, precision gained in the determination of one flux comes at the expense of the precision of other fluxes, and an appropriate label design therefore foremost depends on the question the investigator wants to address. One could liken an ILE to shadows that metabolism casts on products. Optimal label design is the placement of the lamp: creating clear shadows for some parts of metabolism and obscuring others. An optimal isotope label design is influenced by: (1) the network structure; (2) the true flux values; (3) the available label measurements; and (4) commercially available substrates. The first two aspects are dictated by nature and constrain any optimal design. The second two aspects are suitable design parameters. To create an optimal label design, an explicit optimization criterion needs to be formulated. This usually is a property of the flux covariance matrix, which can be augmented by weighting label substrate cost. An optimal design is found by using such a criterion as an objective function for an optimizer. This chapter uses a simple elementary metabolite units (EMU) representation of the TCA cycle to illustrate the process of experimental design of isotope labeled substrates.
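
    As a rough illustration of the kind of objective described above (a property of the flux covariance matrix augmented with substrate cost), the sketch below scores two hypothetical label designs with a cost-weighted D-type criterion. The sensitivity matrices, costs, and weighting are invented for illustration; this is not the chapter's EMU/TCA model.

```python
# Minimal sketch: rank candidate label designs by a D-type criterion on the
# flux covariance matrix, augmented with a substrate-cost penalty.
# Sensitivities and costs below are hypothetical.
import numpy as np

def score(S, cost, weight=0.1, sigma=0.01):
    """S: sensitivity matrix d(measurement)/d(flux) for one label design."""
    fim = S.T @ S / sigma**2              # Fisher information matrix
    cov = np.linalg.inv(fim)              # approximate flux covariance
    d_crit = np.log(np.linalg.det(cov))   # smaller = more precise fluxes
    return d_crit + weight * cost         # augment with substrate cost

# two hypothetical designs: rows = labelled measurements, columns = fluxes
designs = {
    "1-13C glucose": (np.array([[1.0, 0.2], [0.3, 0.9], [0.5, 0.1]]), 1.0),
    "U-13C glucose": (np.array([[0.8, 0.7], [0.6, 0.8], [0.2, 0.3]]), 4.0),
}
for name, (S, c) in designs.items():
    print(f"{name:15s} objective = {score(S, c):7.3f}")
print("selected design:", min(designs, key=lambda k: score(*designs[k])))
```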

  2. Cost Optimal System Identification Experiment Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    the experiment design are not based on obtained experimental data. Instead the decisions are based on the expected experimental data assumed to be obtained from the measurements, estimated based on prior information and engineering judgement. The design method provides a system identification experiment design...

  3. Optimized Experiment Design for Marine Systems Identification

    DEFF Research Database (Denmark)

    Blanke, M.; Knudsen, Morten

    1999-01-01

    Simulation of manoeuvring and design of motion controls for marine systems require non-linear mathematical models, which often have more than one hundred parameters. Model identification is hence an extremely difficult task. This paper discusses experiment design for marine systems identification...

  4. Assay optimization: a statistical design of experiments approach.

    Science.gov (United States)

    Altekar, Maneesha; Homon, Carol A; Kashem, Mohammed A; Mason, Steven W; Nelson, Richard M; Patnaude, Lori A; Yingling, Jeffrey; Taylor, Paul B

    2007-03-01

    With the transition from manual to robotic high-throughput screening (HTS) in the last several years, assay optimization has become a significant bottleneck. Recent advances in robotic liquid handling have made it feasible to reduce assay optimization timelines with the application of statistically designed experiments. When implemented, they can efficiently optimize assays by rapidly identifying significant factors, complex interactions, and nonlinear responses. This article focuses on the use of statistically designed experiments in assay optimization.

  5. Optimal Design of Experiments Subject to Correlated Errors

    OpenAIRE

    Pazman, Andrej; Müller, Werner

    2000-01-01

    In this paper we consider optimal design of experiments in the case of correlated observations, when no replications are possible. This situation is typical when observing a random process or random field with known covariance structure. We present a theorem which demonstrates that the computation of optimum exact designs corresponds to solving minimization problems in terms of design measures. (author's abstract)

  6. Global Optimization Problems in Optimal Design of Experiments in Regression Models

    NARCIS (Netherlands)

    Boer, E.P.J.; Hendrix, E.M.T.

    2000-01-01

    In this paper we show that optimal design of experiments, a specific topic in statistics, constitutes a challenging application field for global optimization. This paper shows how various structures in optimal design of experiments problems determine the structure of the corresponding challenging global optimization problems.

  7. Optimal experiment design revisited: fair, precise and minimal tomography

    CERN Document Server

    Nunn, J; Puentes, G; Lundeen, J S; Walmsley, I A

    2009-01-01

    Given an experimental set-up and a fixed number of measurements, how should one take data in order to optimally reconstruct the state of a quantum system? The problem of optimal experiment design (OED) for quantum state tomography was first broached by Kosut et al. [arXiv:quant-ph/0411093v1]. Here we provide efficient numerical algorithms for finding the optimal design, and analytic results for the case of 'minimal tomography'. We also introduce the average OED, which is independent of the state to be reconstructed, and the optimal design for tomography (ODT), which minimizes tomographic bias. We find that these two designs are generally similar. Monte-Carlo simulations confirm the utility of our results for qubits. Finally, we adapt our approach to deal with constrained techniques such as maximum likelihood estimation. We find that these are less amenable to optimization than cruder reconstruction methods, such as linear inversion.

  8. Application of optimal design methodologies in clinical pharmacology experiments.

    Science.gov (United States)

    Ogungbenro, Kayode; Dokoumetzidis, Aristides; Aarons, Leon

    2009-01-01

    Pharmacokinetic and pharmacodynamic data are often analysed by mixed-effects modelling techniques (also known as population analysis), which have become a standard tool in the pharmaceutical industry for drug development. The last 10 years have witnessed considerable interest in the application of experimental design theories to population pharmacokinetic and pharmacodynamic experiments. Design of population pharmacokinetic experiments involves selection and a careful balance of a number of design factors. Optimal design theory uses prior information about the model and parameter estimates to optimize a function of the Fisher information matrix to obtain the best combination of the design factors. This paper provides a review of the different approaches that have been described in the literature for optimal design of population pharmacokinetic and pharmacodynamic experiments. It describes options that are available and highlights some of the issues that could be of concern as regards practical application. It also discusses areas of application of optimal design theories in clinical pharmacology experiments. It is expected that as awareness about the benefits of this approach increases, more people will embrace it, ultimately leading to more efficient population pharmacokinetic and pharmacodynamic experiments and helping to reduce both cost and time during drug development.
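
    As a minimal illustration of the Fisher-information-based approach described above, the sketch below selects D-optimal sampling times for a one-compartment oral-absorption PK model by exhaustive search over candidate time triples. The model, parameter values, noise level, and candidate times are assumptions for illustration, not taken from the review.

```python
# Hedged sketch: D-optimal sampling times for a one-compartment PK model,
# using a Fisher information matrix built from finite-difference sensitivities.
import itertools
import numpy as np

def conc(t, ka, ke, V, dose=100.0):
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def fim(times, theta=(1.2, 0.15, 30.0), sigma=0.5, h=1e-5):
    """Fisher information for (ka, ke, V) at the given sampling times."""
    theta = np.asarray(theta, dtype=float)
    M = np.zeros((3, 3))
    for t in times:
        g = np.zeros(3)                       # sensitivity d conc / d theta_j
        for j in range(3):
            tp, tm = theta.copy(), theta.copy()
            tp[j] += h; tm[j] -= h
            g[j] = (conc(t, *tp) - conc(t, *tm)) / (2 * h)
        M += np.outer(g, g) / sigma**2
    return M

candidates = [0.25, 0.5, 1, 2, 4, 8, 12, 24]
best = max(itertools.combinations(candidates, 3),
           key=lambda ts: np.linalg.det(fim(ts)))
print("D-optimal sampling times (h):", best)
```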

  9. Optimal experiment design for identification of grey-box models

    DEFF Research Database (Denmark)

    Sadegh, Payman; Melgaard, Henrik; Madsen, Henrik

    1994-01-01

    Optimal experiment design is investigated for stochastic dynamic systems where the prior partial information about the system is given as a probability distribution function in the system parameters. The concept of information is related to entropy reduction in the system through Lindley's measure...... estimation results in a considerable reduction of the experimental length. Besides, it is established that the physical knowledge of the system enables us to design experiments, with the goal of maximizing information about the physical parameters of interest....

  10. Optimal Design of Shock Tube Experiments for Parameter Inference

    KAUST Repository

    Bisetti, Fabrizio

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial chaos (PC) surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP-based approach is used to estimate the expected information gain in the proposed experiments and to select the best experimental set-ups yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and for extending the experimental design methodology to cases where the control parameters are noisy.
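
    The quantity being optimized, the expected information gain (EIG), can be illustrated with a plain nested Monte Carlo estimator on a toy Arrhenius-like observable. This does not reproduce the paper's PC surrogate or MAP-based estimator; the prior, the model, and the noise level below are assumptions.

```python
# Hedged sketch: nested Monte Carlo estimate of expected information gain
# for ranking a few hypothetical designs (here a temperature-like variable).
import numpy as np

rng = np.random.default_rng(0)

def model(theta, design):
    """Toy Arrhenius-like observable; theta plays the role of an activation energy."""
    return np.exp(-theta / design)

def expected_information_gain(design, n_outer=300, n_inner=300, sigma=0.05):
    theta_out = rng.uniform(1.0, 3.0, n_outer)          # prior samples
    theta_in = rng.uniform(1.0, 3.0, n_inner)
    y = model(theta_out, design) + sigma * rng.standard_normal(n_outer)
    # log-likelihood of each simulated datum under the theta that generated it
    log_like = -0.5 * ((y - model(theta_out, design)) / sigma) ** 2
    # log marginal likelihood (evidence), averaged over inner prior samples
    diff = y[:, None] - model(theta_in[None, :], design)
    log_evid = np.log(np.mean(np.exp(-0.5 * (diff / sigma) ** 2), axis=1))
    return np.mean(log_like - log_evid)   # Gaussian constants cancel

for design in (0.5, 1.0, 2.0):
    print(f"design = {design:.1f}: EIG estimate = {expected_information_gain(design):.3f}")
```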

  11. Optimizing an experimental design for an electromagnetic experiment

    Science.gov (United States)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect that is gaining importance is the discussion of how to acquire suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of one given design via the objective function) and stochastic optimization methods like genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique for designing an experiment is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well-distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of the CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.
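
    As a simplified illustration of coupling a design-quality objective with a genetic algorithm (the study uses a multi-objective GA and a linearized CSEM objective), the sketch below runs a plain single-objective GA that picks receiver positions maximizing log det(JᵀJ) for a purely hypothetical sensitivity matrix J. All sizes, the random J, and the GA settings are assumptions.

```python
# Hedged sketch: single-objective genetic algorithm selecting k receivers
# out of a candidate set to maximize a linearized design metric.
import numpy as np

rng = np.random.default_rng(1)
n_candidates, n_params, k = 40, 6, 8
J_full = rng.standard_normal((n_candidates, n_params))   # hypothetical sensitivities

def fitness(idx):
    J = J_full[list(idx)]
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return logdet if sign > 0 else -np.inf

def crossover(a, b):
    pool = list(set(a) | set(b))
    return tuple(sorted(rng.choice(pool, size=k, replace=False)))

def mutate(idx, rate=0.2):
    idx = list(idx)
    if rng.random() < rate:
        swap_out = rng.integers(k)
        idx[swap_out] = rng.choice([c for c in range(n_candidates) if c not in idx])
    return tuple(sorted(idx))

pop = [tuple(sorted(rng.choice(n_candidates, size=k, replace=False)))
       for _ in range(30)]
for generation in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                                      # truncation selection
    children = []
    while len(children) < len(pop) - len(elite):
        i, j = rng.choice(len(elite), size=2, replace=False)
        children.append(mutate(crossover(elite[i], elite[j])))
    pop = elite + children

best = max(pop, key=fitness)
print("selected receiver indices:", [int(i) for i in best])
print("log det(J^T J):", round(fitness(best), 3))
```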

  12. Optimal Experiment Design for Thermal Characterization of Functionally Graded Materials

    Science.gov (United States)

    Cole, Kevin D.

    2003-01-01

    The purpose of the project was to investigate methods to accurately verify that designed materials meet thermal specifications. The project involved heat transfer calculations and optimization studies, and no laboratory experiments were performed. One part of the research involved the study of materials in which conduction heat transfer predominates. Results include techniques to choose among several experimental designs, and protocols for determining the optimum experimental conditions for determination of thermal properties. Metal foam materials were also studied, in which both conduction and radiation heat transfer are present. Results of this work include procedures to optimize the design of experiments to accurately measure both conductive and radiative thermal properties. Detailed results in the form of three journal papers have been appended to this report.

  13. Optimal experiment design for time-lapse traveltime tomography

    Energy Technology Data Exchange (ETDEWEB)

    Ajo-Franklin, J.B.

    2009-10-01

    Geophysical monitoring techniques offer the only noninvasive approach capable of assessing both the spatial and temporal dynamics of subsurface fluid processes. Increasingly, permanent sensor arrays in boreholes and on the ocean floor are being deployed to improve the repeatability and increase the temporal sampling of monitoring surveys. Because permanent arrays require a large up-front capital investment and are difficult (or impossible) to re-configure once installed, a premium is placed on selecting a geometry capable of imaging the desired target at minimum cost. We present a simple approach to optimizing downhole sensor configurations for monitoring experiments making use of differential seismic traveltimes. In our case, we use a design quality metric based on the accuracy of tomographic reconstructions for a suite of imaging targets. By not requiring an explicit singular value decomposition of the forward operator, evaluation of this objective function scales to problems with a large number of unknowns. We also restrict the design problem by recasting the array geometry into a low dimensional form more suitable for optimization at a reasonable computational cost. We test two search algorithms on the design problem: the Nelder-Mead downhill simplex method and the Multilevel Coordinate Search algorithm. The algorithm is tested for four crosswell acquisition scenarios relevant to continuous seismic monitoring, a two parameter array optimization, several scenarios involving four parameter length/offset optimizations, and a comparison of optimal multi-source designs. In the last case, we also examine trade-offs between source sparsity and the quality of tomographic reconstructions. One general observation is that asymmetric array lengths improve localized image quality in crosswell experiments with a small number of sources and a large number of receivers. Preliminary results also suggest that high-quality differential images can be generated using only a small

  14. Design and optimization of reverse-transcription quantitative PCR experiments.

    Science.gov (United States)

    Tichopad, Ales; Kitchen, Rob; Riedmaier, Irmgard; Becker, Christiane; Ståhlberg, Anders; Kubista, Mikael

    2009-10-01

    Quantitative PCR (qPCR) is a valuable technique for accurately and reliably profiling and quantifying gene expression. Typically, samples obtained from the organism of study have to be processed via several preparative steps before qPCR. We estimated the errors of sample withdrawal and extraction, reverse transcription (RT), and qPCR that are introduced into measurements of mRNA concentrations. We performed hierarchically arranged experiments with 3 animals, 3 samples, 3 RT reactions, and 3 qPCRs and quantified the expression of several genes in solid tissue, blood, cell culture, and single cells. A nested ANOVA design was used to model the experiments, and relative and absolute errors were calculated with this model for each processing level in the hierarchical design. We found that intersubject differences became easily confounded by sample heterogeneity for single cells and solid tissue. In cell cultures and blood, the noise from the RT and qPCR steps contributed substantially to the overall error because the sampling noise was less pronounced. We recommend the use of sample replicates preferentially to any other replicates when working with solid tissue, cell cultures, and single cells, and we recommend the use of RT replicates when working with blood. We show how an optimal sampling plan can be calculated for a limited budget.
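
    The budget-constrained sampling plan mentioned at the end can be illustrated with a small brute-force search: given assumed variance components and per-unit costs (not the paper's estimates), pick the numbers of samples, RT replicates, and qPCR replicates that minimize the variance of a subject mean within the budget.

```python
# Hedged sketch: replicate allocation for a nested sample -> RT -> qPCR design
# under a fixed budget. Variance components and costs are illustrative assumptions.
import itertools

var_sample, var_rt, var_qpcr = 0.30, 0.10, 0.05      # variance components
cost_sample, cost_rt, cost_qpcr = 10.0, 4.0, 1.5     # cost per unit
budget = 120.0

def variance(n_s, n_rt, n_q):
    return (var_sample / n_s
            + var_rt / (n_s * n_rt)
            + var_qpcr / (n_s * n_rt * n_q))

def cost(n_s, n_rt, n_q):
    return n_s * (cost_sample + n_rt * (cost_rt + n_q * cost_qpcr))

plans = [p for p in itertools.product(range(1, 9), repeat=3) if cost(*p) <= budget]
best = min(plans, key=lambda p: variance(*p))
print("samples, RT reps, qPCR reps:", best)
print("variance of subject mean:", round(variance(*best), 4), " cost:", cost(*best))
```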

  15. Optimal Experimental Design of Furan Shock Tube Kinetic Experiments

    KAUST Repository

    Kim, Daesang

    2015-01-07

    A Bayesian optimal experimental design methodology has been developed and applied to refine the rate coefficients of elementary reactions in Furan combustion. Furans are considered as potential renewable fuels. We focus on the Arrhenius rates of Furan + OH ↔ Furyl-2 + H2O and Furan + OH ↔ Furyl-3 + H2O, and rely on the OH consumption rate as the experimental observable. A polynomial chaos surrogate is first constructed using an adaptive pseudo-spectral projection algorithm. The PC surrogate is then exploited in conjunction with a fast estimation of the expected information gain in order to determine the optimal design in the space of initial temperatures and OH concentrations.

  16. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    Science.gov (United States)

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impressions. A total of 75 patients participated: 42 in Experiment 1 and 33 in Experiment 2. Twenty-seven representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can obtain an optimal color design for a counseling room.

  17. Optimized design and analysis of sparse-sampling FMRI experiments.

    Science.gov (United States)

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase

  18. Structural Optimization of a Force Balance Using a Computational Experiment Design

    Science.gov (United States)

    Parker, P. A.; DeLoach, R.

    2002-01-01

    This paper proposes a new approach to force balance structural optimization featuring a computational experiment design. Currently, this multi-dimensional design process requires the designer to perform a simplification by executing parameter studies on a small subset of design variables. This one-factor-at-a-time approach varies a single variable while holding all others at a constant level. Consequently, subtle interactions among the design variables, which can be exploited to achieve the design objectives, are undetected. The proposed method combines Modern Design of Experiments techniques to direct the exploration of the multi-dimensional design space, and a finite element analysis code to generate the experimental data. To efficiently search for an optimum combination of design variables and minimize the computational resources, a sequential design strategy was employed. Experimental results from the optimization of a non-traditional force balance measurement section are presented. An approach to overcome the unique problems associated with the simultaneous optimization of multiple response criteria is described. A quantitative single-point design procedure that reflects the designer's subjective impression of the relative importance of various design objectives, and a graphical multi-response optimization procedure that provides further insights into available tradeoffs among competing design objectives are illustrated. The proposed method enhances the intuition and experience of the designer by providing new perspectives on the relationships between the design variables and the competing design objectives, and provides a systematic foundation for advancements in structural design.

  19. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    Science.gov (United States)

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
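
    As a small illustration of the fractional factorial component mentioned above (the split-plot structure is not reproduced), the sketch below generates a 2^(5-1) half-fraction design for five two-level factors using the defining relation I = ABCDE, the standard resolution V construction.

```python
# Hedged sketch: build a 2^(5-1) half-fraction design in coded (-1, +1) units.
import itertools

runs = []
for a, b, c, d in itertools.product((-1, 1), repeat=4):
    e = a * b * c * d                 # generator E = ABCD (defining relation I = ABCDE)
    runs.append((a, b, c, d, e))

print("run   A  B  C  D  E")
for i, run in enumerate(runs, 1):
    print(f"{i:3d}  " + " ".join(f"{x:+d}" for x in run))
```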

  20. A unified modeling approach for physical experiment design and optimization in laser driven inertial confinement fusion

    Energy Technology Data Exchange (ETDEWEB)

    Li, Haiyan [Mechatronics Engineering School of Guangdong University of Technology, Guangzhou 510006 (China); Huang, Yunbao, E-mail: Huangyblhy@gmail.com [Mechatronics Engineering School of Guangdong University of Technology, Guangzhou 510006 (China); Jiang, Shaoen, E-mail: Jiangshn@vip.sina.com [Laser Fusion Research Center, China Academy of Engineering Physics, Mianyang 621900 (China); Jing, Longfei, E-mail: scmyking_2008@163.com [Laser Fusion Research Center, China Academy of Engineering Physics, Mianyang 621900 (China); Tianxuan, Huang; Ding, Yongkun [Laser Fusion Research Center, China Academy of Engineering Physics, Mianyang 621900 (China)

    2015-11-15

    Highlights: • A unified modeling approach for physical experiment design is presented. • Any laser facility can be flexibly defined and included with two scripts. • Complex targets and laser beams can be parametrically modeled for optimization. • Automatic mapping of laser beam energy facilitates target shape optimization. - Abstract: Physical experiment design and optimization is essential for laser-driven inertial confinement fusion due to the high cost of each shot. However, available codes can design and evaluate only a limited set of experiments with simple target structures or shapes on a few laser facilities, and targets are usually defined by programming, which makes complex-shape target design and optimization on arbitrary laser facilities difficult. A unified modeling approach for physical experiment design and optimization on any laser facility is presented in this paper. Its core ideas are: (1) any laser facility can be flexibly defined and included with two scripts, (2) complex-shape targets and laser beams can be parametrically modeled based on features, (3) an automatic scheme for mapping laser beam energy onto the discrete mesh elements of targets enables targets or laser beams to be optimized without any additional interactive modeling or programming, and (4) computation algorithms are additionally presented to efficiently evaluate radiation symmetry on the target. Finally, examples are presented to validate the significance of such a unified modeling approach for physical experiment design and optimization in laser driven inertial confinement fusion.

  1. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as an efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.
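
    A stripped-down version of the trade-off POBE addresses can be sketched as a grid search over the number of subjects and the scanning time under a fixed budget. The two-level variance model and the cost figures below are assumptions and are much simpler than the model implemented in POBE.

```python
# Hedged sketch: choose subjects N and scanning time T (minutes) minimizing
# the variance of a group-level effect estimate under a fixed budget.
var_between, var_within = 1.0, 20.0          # between-/within-subject variance (assumed)
cost_per_subject, cost_per_minute = 200.0, 10.0
budget = 12000.0

best = None
for n in range(4, 200):
    for t in range(5, 121, 5):
        if n * (cost_per_subject + cost_per_minute * t) > budget:
            continue
        var = var_between / n + var_within / (n * t)
        if best is None or var < best[0]:
            best = (var, n, t)

var, n, t = best
print(f"N = {n} subjects, T = {t} min scanning, effect variance = {var:.4f}")
```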

  2. Validation of scaffold design optimization in bone tissue engineering: finite element modeling versus designed experiments.

    Science.gov (United States)

    Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra

    2017-02-21

    This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression test. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of DE and COMSOL models (4.5-9.9 MPa measured moduli). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting a greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for

  3. Optimal Design of Experiments for Parametric Identification of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    Optimal design of experiments for parametric identification of civil engineering structures is investigated. Design of experiments for parametric identification of dynamic systems is usually done by minimizing a scalar measure, e.g. the determinant, the trace, etc., of an estimated parameter...

  4. Optimization design of the stratospheric airship's power system based on the methodology of orthogonal experiment

    Institute of Scientific and Technical Information of China (English)

    Jian LIU; Quan-bao WANG; Hai-tao ZHAO; Ji-an CHEN; Ye QIU; Deng-ping DUAN

    2013-01-01

    The optimization design of the power system is essential for stratospheric airships with paradoxical requirements of high reliability and low weight. The methodology of orthogonal experiment is presented to deal with the problem of the optimization design of the airship's power system. Mathematical models of the solar array, regenerative fuel cell, and power management subsystem (PMS) are presented. The basic theory of the method of orthogonal experiment is discussed, and the selection of factors and levels of the experiment and the choice of the evaluation function are also revealed. The proposed methodology is validated in the optimization design of the power system of the ZhiYuan-2 stratospheric airship. Results show that the optimal configuration is easily obtained through this methodology. Furthermore, the optimal configuration and three sub-optimal configurations are in the Pareto frontier of the design space. Sensitivity analyses for the weight and reliability of the airship's power system are presented.

  5. Cogging torque optimization in surface-mounted permanent-magnet motors by using design of experiment

    Energy Technology Data Exchange (ETDEWEB)

    Abbaszadeh, K., E-mail: Abbaszadeh@kntu.ac.ir [Department of Electrical Engineering, K.N. Toosi University of Technology, Tehran (Iran, Islamic Republic of); Rezaee Alam, F.; Saied, S.A. [Department of Electrical Engineering, K.N. Toosi University of Technology, Tehran (Iran, Islamic Republic of)

    2011-09-15

    Graphical abstract: Magnet segment arrangement in cross section view of one pole for PM machine. Highlights: • Magnet segmentation is an effective method for the cogging torque reduction. • We have used the magnet segmentation method based on the design of experiment. • We have used the RSM design of the design of experiment method. • We have solved optimization via surrogate models like polynomial regression. • A significant reduction of the cogging torque is obtained by using RSM. - Abstract: One of the important challenges in the design of PM electrical machines is to reduce the cogging torque. In this paper, in order to reduce the cogging torque, a new method for designing the motor magnets is introduced to optimize a six-pole BLDC motor by using the design of experiment (DOE) method. In this method the machine magnets consist of several identical segments which are shifted by a definite angle from each other. Design of experiment (DOE) methodology is used for a screening of the design space and for the generation of approximation models using response surface techniques. In this paper, optimization is solved via surrogate models, that is, through the construction of response surface models (RSM) like polynomial regression. The experiments were performed based on the response surface methodology (RSM), as a statistical design of experiment approach, in order to investigate the effect of parameters on the response variations. In this investigation, the optimal shifting angles (factors) were identified to minimize the cogging torque. A significant reduction of cogging torque can be achieved with this approach after only a few evaluations of the coupled FE model.

  6. Discriminating between rival biochemical network models: three approaches to optimal experiment design

    Directory of Open Access Journals (Sweden)

    August Elias

    2010-04-01

    Background: The success of molecular systems biology hinges on the ability to use computational models to design predictive experiments, and ultimately unravel underlying biological mechanisms. A problem commonly encountered in the computational modelling of biological networks is that alternative, structurally different models of similar complexity fit a set of experimental data equally well. In this case, more than one molecular mechanism can explain available data. In order to rule out the incorrect mechanisms, one needs to invalidate incorrect models. At this point, new experiments maximizing the difference between the measured values of alternative models should be proposed and conducted. Such experiments should be optimally designed to produce data that are most likely to invalidate incorrect model structures. Results: In this paper we develop methodologies for the optimal design of experiments with the aim of discriminating between different mathematical models of the same biological system. The first approach determines the 'best' initial condition that maximizes the L2 (energy) distance between the outputs of the rival models. In the second approach, we maximize the L2-distance of the outputs by designing the optimal external stimulus (input) profile of unit L2-norm. Our third method uses optimized structural changes (corresponding, for example, to parameter value changes reflecting gene knock-outs) to achieve the same goal. The numerical implementation of each method is considered in an example, signal processing in starving Dictyostelium amœbæ. Conclusions: Model-based design of experiments improves both the reliability and the efficiency of biochemical network model discrimination. This opens the way to model invalidation, which can be used to perfect our understanding of biochemical networks. Our general problem formulation together with the three proposed experiment design methods give the practitioner new tools for a systems
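
    The first approach (choosing the initial condition that maximizes the L2 distance between rival model outputs) can be sketched as follows for two invented toy ODE models; the models, bounds, and time horizon are illustrative assumptions, not the Dictyostelium case study.

```python
# Hedged sketch: pick the initial condition that best separates two rival models.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

t_eval = np.linspace(0.0, 10.0, 200)
dt = t_eval[1] - t_eval[0]

def model_a(t, x):        # rival hypothesis 1: first-order decay
    return [-0.5 * x[0]]

def model_b(t, x):        # rival hypothesis 2: saturating (Michaelis-Menten-like) decay
    return [-0.8 * x[0] / (1.0 + x[0])]

def l2_distance(x0):
    ya = solve_ivp(model_a, (0.0, 10.0), [x0], t_eval=t_eval).y[0]
    yb = solve_ivp(model_b, (0.0, 10.0), [x0], t_eval=t_eval).y[0]
    return float(np.sqrt(np.sum((ya - yb) ** 2) * dt))

# maximize the distance by minimizing its negative over the allowed initial range
res = minimize_scalar(lambda x0: -l2_distance(x0), bounds=(0.1, 5.0), method="bounded")
print(f"most discriminating initial condition: x0 = {res.x:.3f}")
print(f"L2 distance between model outputs: {l2_distance(res.x):.3f}")
```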

  7. A differentiable reformulation for E-optimal design of experiments in nonlinear dynamic biosystems.

    Science.gov (United States)

    Telen, Dries; Van Riet, Nick; Logist, Flip; Van Impe, Jan

    2015-06-01

    Informative experiments are highly valuable for estimating parameters in nonlinear dynamic bioprocesses. Techniques for optimal experiment design ensure the systematic design of such informative experiments. The E-criterion, which can be used as an objective function in optimal experiment design, requires the maximization of the smallest eigenvalue of the Fisher information matrix. However, one problem with the minimal eigenvalue function is that it can be nondifferentiable. In addition, no closed form expression exists for the computation of eigenvalues of a matrix larger than a 4 by 4 one. As eigenvalues are normally computed with iterative methods, state-of-the-art optimal control solvers are not able to exploit automatic differentiation to compute the derivatives with respect to the decision variables. In the current paper a reformulation strategy from the field of convex optimization is suggested to circumvent these difficulties. This reformulation requires the inclusion of a matrix inequality constraint involving positive semidefiniteness. In this paper, this positive semidefiniteness constraint is imposed via Sylvester's criterion. As a result the maximization of the minimum eigenvalue function can be formulated in standard optimal control solvers through the addition of nonlinear constraints. The presented methodology is successfully illustrated with a case study from the field of predictive microbiology.
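
    The reformulation idea can be sketched on a small E-optimal design-weight problem: instead of maximizing the nonsmooth minimum eigenvalue of M(w) directly, maximize t subject to M(w) - tI being positive semidefinite, imposed here through nonnegative leading principal minors (Sylvester's criterion) and solved with a generic SLSQP solver rather than an optimal control solver. The candidate design points and starting values are invented for illustration.

```python
# Hedged sketch: E-optimal design weights via the smooth "max t s.t. M(w) - tI >= 0"
# reformulation, with the semidefiniteness imposed through leading principal minors.
import numpy as np
from scipy.optimize import minimize

X = np.array([[1.0, -1.0], [1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # candidate points

def fim(w):
    return (X * w[:, None]).T @ X          # M(w) = sum_i w_i x_i x_i^T

def objective(z):                          # z = (w_1..w_n, t); maximize t
    return -z[-1]

def minors(z):                             # leading principal minors of M(w) - t*I
    w, t = z[:-1], z[-1]
    A = fim(w) - t * np.eye(2)
    return np.array([A[0, 0], np.linalg.det(A)])

n = X.shape[0]
cons = [{"type": "eq", "fun": lambda z: np.sum(z[:-1]) - 1.0},
        {"type": "ineq", "fun": minors}]
bounds = [(0.0, 1.0)] * n + [(0.0, None)]
z0 = np.r_[np.full(n, 1.0 / n), 0.1]
res = minimize(objective, z0, method="SLSQP", bounds=bounds, constraints=cons)

w_opt, t_opt = res.x[:-1], res.x[-1]
print("design weights:", np.round(w_opt, 3))
print("t (lower bound on lambda_min):", round(t_opt, 4),
      " actual lambda_min:", round(np.linalg.eigvalsh(fim(w_opt))[0], 4))
```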

  8. Real-time PCR probe optimization using design of experiments approach.

    Science.gov (United States)

    Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F

    2016-03-01

    Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.

  9. Accuracy optimization of high-speed AFM measurements using Design of Experiments

    DEFF Research Database (Denmark)

    Tosello, Guido; Marinello, F.; Hansen, Hans Nørgaard

    2010-01-01

    , the estimated dimensions of measured features. The definition of scan settings is based on a comprehensive optimization that targets maximization of information from collected data and minimization of measurement uncertainty and scan time. The Design of Experiments (DOE) technique is proposed and applied...

  10. Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.

    Science.gov (United States)

    Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana

    2015-03-11

    Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology systematically varies the signals to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of three ligand concentrations (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells.

  11. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  12. Reliability-Based Optimal Design of Experiment Plans for Offshore Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M. H.; Kroon, I. B.

    1993-01-01

    Design of cost optimal experiment plans on the basis of a preposterior analysis is discussed. In particular the planning of on-site response measurements on offshore structures in order to update probabilistic models for fatigue life estimation is addressed. Special emphasis is given to modelling...

  13. Design Optimization of PZT-Based Piezoelectric Cantilever Beam by Using Computational Experiments

    Science.gov (United States)

    Kim, Jihoon; Park, Sanghyun; Lim, Woochul; Jang, Junyong; Lee, Tae Hee; Hong, Seong Kwang; Song, Yewon; Sung, Tae Hyun

    2016-08-01

    Piezoelectric energy harvesting is gaining huge research interest since it provides high power density and has real-life applicability. However, investigating the mechanical-electrical coupling phenomenon remains challenging. Many researchers depend on physical experiments and case-by-case analysis to choose the best-performing devices that meet design objectives; this involves high design costs. This study aims to develop a practical model using computer simulations and to propose an optimized design for a lead zirconate titanate (PZT)-based piezoelectric cantilever beam, which is widely used in energy harvesting. In this study, commercial finite element (FE) software is used to predict the voltage generated from vibrations of the PZT-based piezoelectric cantilever beam. Because the initial FE model differs from physical experiments, the model is calibrated by multi-objective optimization to increase the accuracy of the predictions. We collect data from physical experiments using the cantilever beam and use these experimental results in the calibration process. Since dynamic FE analysis of the piezoelectric cantilever beam with a dense step size is considerably time-consuming, a surrogate model is employed for efficient optimization. Through the design optimization of the PZT-based piezoelectric cantilever beam, a high-performance piezoelectric device was developed. The sensitivity of the variables at the optimum design is analyzed to suggest a further improved device.

  14. Media milling process optimization for manufacture of drug nanoparticles using design of experiments (DOE).

    Science.gov (United States)

    Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj

    2015-01-01

    Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study to understand the effects of process variables in a bead milling process used for the manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed, and bead volume. Responses analyzed for evaluating these effects and interactions were milling time, particle size, and process yield. Process validation batches were executed using the optimum process conditions obtained from the software Design-Expert® to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). A desirability function was used to optimize the response variables, and the observed responses were in agreement with the predicted values. These results demonstrated the reliability of the selected model for the manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
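
    The design itself can be reproduced generically: the sketch below constructs a 3-factor face-centered central composite design in coded units (factorial corners, face-centre axial points, and centre replicates). The factor names follow the abstract, but run order, randomization, and the actual factor ranges are not addressed here.

```python
# Hedged sketch: face-centered central composite design (alpha = 1) in coded units.
import itertools

def face_centered_ccd(n_factors=3, n_center=3):
    corners = list(itertools.product((-1, 1), repeat=n_factors))   # 2^k factorial runs
    axial = []
    for i in range(n_factors):                                      # face-centre axial runs
        for level in (-1, 1):
            point = [0] * n_factors
            point[i] = level
            axial.append(tuple(point))
    centers = [(0,) * n_factors] * n_center                         # centre replicates
    return corners + axial + centers

design = face_centered_ccd()
print("run  motor_speed  pump_speed  bead_volume")
for i, row in enumerate(design, 1):
    print(f"{i:3d}      {row[0]:+d}          {row[1]:+d}          {row[2]:+d}")
```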

  15. Optimal Design of Passive Flow Control for a Boundary-Layer-Ingesting Offset Inlet Using Design-of-Experiments

    Science.gov (United States)

    Allan, Brian G.; Owens, Lewis R., Jr.; Lin, John C.

    2006-01-01

    This research investigates the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan face distortion levels and the first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors, which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth-order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at a constant value from the first DOE optimization, with the two vane angles-of-attack as design factors. This DOE only required a second-order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the engine

  16. Accuracy optimization of high-speed AFM measurements using Design of Experiments

    DEFF Research Database (Denmark)

    Tosello, Guido; Marinello, F.; Hansen, Hans Nørgaard

    2010-01-01

    , the estimated dimensions of measured features. The definition of scan settings is based on a comprehensive optimization that targets maximization of information from collected data and minimization of measurement uncertainty and scan time. The Design of Experiments (DOE) technique is proposed and applied......Atomic Force Microscopy (AFM) is being increasingly employed in industrial micro/nano manufacturing applications and integrated into production lines. In order to achieve reliable process and product control at high measuring speed, instrument optimization is needed. Quantitative AFM measurement...

  17. Optimizing drug delivery systems using systematic "design of experiments." Part I: fundamental aspects.

    Science.gov (United States)

    Singh, Bhupinder; Kumar, Rajiv; Ahuja, Naveen

    2005-01-01

    Design of an impeccable drug delivery product normally encompasses multiple objectives. For decades, this task has been attempted through trial and error, supplemented with the previous experience, knowledge, and wisdom of the formulator. Optimization of a pharmaceutical formulation or process using this traditional approach involves changing one variable at a time. Using this methodology, the solution of a specific problematic formulation characteristic can certainly be achieved, but attainment of the true optimal composition is never guaranteed. And for improvement in one characteristic, one has to trade off against degeneration in another. This customary approach of developing a drug product or process has proved to be not only uneconomical in terms of time, money, and effort, but also poorly suited to fixing errors, unpredictable, and at times even unsuccessful. On the other hand, modern formulation optimization approaches, employing systematic Design of Experiments (DoE), are extensively practiced in the development of diverse kinds of drug delivery devices to overcome such shortcomings. Such systematic approaches are far more advantageous, because they require fewer experiments to achieve an optimum formulation, make problem tracing and rectification easier, reveal drug/polymer interactions, simulate the product performance, and comprehend the process to assist in better formulation development and subsequent scale-up. Optimization techniques using DoE represent effective and cost-effective analytical tools to yield the "best solution" to a particular "problem." Through quantification of drug delivery systems, these approaches provide a depth of understanding as well as an ability to explore and defend ranges for formulation factors, where experimentation is completed before optimization is attempted. The key elements of a DoE optimization methodology encompass planning the study objectives, screening of influential variables, experimental designs

  18. Design optimization of RF lines in vacuum environment for the MITICA experiment

    Energy Technology Data Exchange (ETDEWEB)

    De Muri, Michela, E-mail: michela.demuri@igi.cnr.it [INFN-LNL, v.le dell’Università 2, I-35020 Legnaro, PD (Italy); Consorzio RFX, Corso Stati Uniti, 4, I-35127 Padova (Italy); Pavei, Mauro; Rossetto, Federico; Marcuzzi, Diego [Consorzio RFX, Corso Stati Uniti, 4, I-35127 Padova (Italy); Miorin, Enrico; Deambrosis, Silvia M. [IENI-CNR, Corso Stati Uniti, 4, I-35127 Padova (Italy)

    2016-02-15

    This contribution concerns the Radio Frequency (RF) transmission line of the Megavolt ITER Injector and Concept Advancement (MITICA) experiment. The original design considered 1 5/8″ copper coaxial lines, but thermal simulations under operating conditions showed steady-state temperatures of the lines that were not compatible with the prescriptions of the component manufacturer. Hence, an optimization of the design was necessary. Enhancing thermal radiation and increasing the conductor size were considered for design optimization: thermal analyses were carried out to calculate the temperature of the MITICA RF lines during operation, as a function of the emissivity value and of other geometrical parameters. Five coating products for increasing the conductor surface emissivity were tested, measuring the outgassing behavior of the selected products and the obtained emissivity values.

  19. Design optimization of RF lines in vacuum environment for the MITICA experiment

    Science.gov (United States)

    De Muri, Michela; Pavei, Mauro; Rossetto, Federico; Marcuzzi, Diego; Miorin, Enrico; Deambrosis, Silvia M.

    2016-02-01

    This contribution concerns the Radio Frequency (RF) transmission line of the Megavolt ITER Injector and Concept Advancement (MITICA) experiment. The original design considered 1 5/8″ copper coaxial lines, but thermal simulations under operating conditions showed steady-state temperatures of the lines that were not compatible with the prescriptions of the component manufacturer. Hence, an optimization of the design was necessary. Enhancing thermal radiation and increasing the conductor size were considered for design optimization: thermal analyses were carried out to calculate the temperature of the MITICA RF lines during operation, as a function of the emissivity value and of other geometrical parameters. Five coating products for increasing the conductor surface emissivity were tested, measuring the outgassing behavior of the selected products and the obtained emissivity values.

  20. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows one to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes into account parameter variabilities such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU

  1. Optimal Design for Informative Protocols in Xenograft Tumor Growth Inhibition Experiments in Mice.

    Science.gov (United States)

    Lestini, Giulia; Mentré, France; Magni, Paolo

    2016-09-01

    Tumor growth inhibition (TGI) models are increasingly used during preclinical drug development in oncology for the in vivo evaluation of antitumor effect. Tumor sizes are measured in xenografted mice, often only during and shortly after treatment, thus preventing correct identification of some TGI model parameters. Our aims were (i) to evaluate the importance of including measurements during tumor regrowth and (ii) to investigate the proportions of mice included in each arm. For these purposes, optimal design theory based on the Fisher information matrix implemented in PFIM4.0 was applied. Published xenograft experiments, involving different drugs, schedules, and cell lines, were used to help optimize experimental settings and parameters using the Simeoni TGI model. For each experiment, a two-arm design, i.e., control versus treatment, was optimized with or without the constraint of not sampling during tumor regrowth, i.e., "short" and "long" studies, respectively. In long studies, measurements could be taken up to 6 g of tumor weight, whereas in short studies the experiment was stopped 3 days after the end of treatment. Predicted relative standard errors were smaller in long studies than in corresponding short studies. Some optimal measurement times were located in the regrowth phase, highlighting the importance of continuing the experiment after the end of treatment. In the four-arm designs, the results showed that the proportions of control and treated mice can differ. To conclude, making measurements during tumor regrowth should become a general rule for informative preclinical studies in oncology, especially when a delayed drug effect is suspected.

  2. Efficient Optimization of Stimuli for Model-Based Design of Experiments to Resolve Dynamical Uncertainty.

    Science.gov (United States)

    Mdluli, Thembi; Buzzard, Gregery T; Rundell, Ann E

    2015-09-01

    This model-based design of experiments (MBDOE) method determines the magnitudes of the experimental stimuli to apply and the associated measurements that should be taken to optimally constrain the uncertain dynamics of a biological system under study. The ideal global solution for this experiment design problem is generally computationally intractable because of parametric uncertainties in the mathematical model of the biological system. Others have addressed this issue by limiting the solution to a local estimate of the model parameters. Here we present an approach that is independent of the local parameter constraint. This approach is made computationally efficient and tractable by the use of: (1) sparse grid interpolation that approximates the biological system dynamics, (2) representative parameters that uniformly represent the data-consistent dynamical space, and (3) probability weights of the represented experimentally distinguishable dynamics. Our approach identifies data-consistent representative parameters using sparse grid interpolants, constructs the optimal input sequence from a greedy search, and defines the associated optimal measurements using a scenario tree. We explore the optimality of this MBDOE algorithm using a 3-dimensional Hes1 model and a 19-dimensional T-cell receptor model. The 19-dimensional T-cell model also demonstrates the MBDOE algorithm's scalability to higher dimensions. In both cases, the dynamical uncertainty regions that bound the trajectories of the target system states were reduced by as much as 86% and 99%, respectively, after completing the designed experiments in silico. Our results suggest that for resolving dynamical uncertainty, the ability to design an input sequence paired with its associated measurements is particularly important when limited by the number of measurements.

  3. Optimization of Automotive Suspension System by Design of Experiments: A Nonderivative Method

    Directory of Open Access Journals (Sweden)

    Anirban C. Mitra

    2016-01-01

    Full Text Available A lot of health issues like low back pain, digestive disorders, and musculoskeletal disorders are caused as a result of the whole-body vibrations induced by automobiles. This paper is concerned with the enhancement and optimization of suspension performance by using factorial methods of Design of Experiments, a nonderivative method. It focuses on the optimization of ride comfort and on determining the parameters which significantly affect suspension behavior, as per the guidelines stated in the ISO 2631-1:1997 standard. A quarter-car test rig integrated with a LabVIEW-based data acquisition system was developed to understand the real-time behavior of a vehicle. In the pilot experiment, only three primary suspension parameters, that is, spring stiffness, damping, and sprung mass, were considered, and the full factorial method was implemented for the purpose of optimization. However, the regression analysis of the data obtained rendered a very low goodness of fit, which indicated that other parameters were likely to influence the response. Subsequently, the steering geometry angles (camber and toe) and tire pressure were included in the design. A fractional factorial method with six factors was implemented to optimize ride comfort. The resultant optimum combination was then verified on the test rig with high correlation.
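    As a minimal sketch of the factorial approach described above, the following code builds a coded two-level full factorial in the three pilot factors and fits a main-effects plus two-factor-interaction regression; the response values are hypothetical placeholders rather than the study's measurements.

```python
# Hedged sketch: two-level full factorial in three suspension factors (coded -1/+1)
# with a main-effects + two-factor-interaction least-squares fit.
import itertools
import numpy as np

factors = ["stiffness", "damping", "sprung_mass"]
runs = np.array(list(itertools.product([-1, 1], repeat=len(factors))), float)

# Hypothetical ride-comfort response (e.g., weighted RMS acceleration) per run.
y = np.array([1.30, 1.10, 1.22, 0.95, 1.41, 1.18, 1.35, 1.02])

# Build the model matrix: intercept, main effects, and two-factor interactions.
cols = [np.ones(len(runs))] + [runs[:, i] for i in range(3)]
names = ["intercept"] + factors
for i, j in itertools.combinations(range(3), 2):
    cols.append(runs[:, i] * runs[:, j])
    names.append(f"{factors[i]}*{factors[j]}")
X = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for n, c in sorted(zip(names, coef), key=lambda kv: -abs(kv[1])):
    print(f"{n:>22s}: {c:+.3f}")   # larger |coefficient| = more influential term
```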

  4. Optimization of a chondrogenic medium through the use of factorial design of experiments.

    Science.gov (United States)

    Enochson, Lars; Brittberg, Mats; Lindahl, Anders

    2012-12-01

    The standard culture system for in vitro cartilage research is based on cells in a three-dimensional micromass culture and a defined medium containing the key chondrogenic growth factor, transforming growth factor (TGF)-β1. The aim of this study was to optimize the medium for chondrocyte micromass culture. Human chondrocytes were cultured in different media formulations, designed with a factorial design of experiments (DoE) approach and based on the standard medium for redifferentiation. The significant factors for the redifferentiation of the chondrocytes were determined and optimized in a two-step process through the use of response surface methodology. TGF-β1, dexamethasone, and glucose were significant factors for differentiating the chondrocytes. Compared to the standard medium, TGF-β1 was increased by 30%, dexamethasone reduced by 50%, and glucose increased by 22%. The potency of the optimized medium was validated in a comparative study against the standard medium. The optimized medium resulted in micromass cultures with increased expression of genes important for the articular chondrocyte phenotype and with increased glycosaminoglycan/DNA content. By optimizing the standard medium with the efficient DoE method, a new medium that gave better redifferentiation of articular chondrocytes was obtained.

  5. Optimal design of a smart post-buckled beam actuator using bat algorithm: simulations and experiments

    Science.gov (United States)

    Mallick, Rajnish; Ganguli, Ranjan; Kumar, Ravi

    2017-05-01

    The optimized design of a smart post-buckled beam actuator (PBA) is performed in this study. A smart-material-based piezoceramic stack actuator is used as a prime mover to drive the buckled beam actuator. Piezoceramic actuators are high-force, small-displacement devices; they possess high energy density and high bandwidth. In this study, bench-top experiments are conducted to investigate the angular tip deflections due to the PBA. A new design of a linear-to-linear motion amplification device (LX-4) is developed to circumvent the small-displacement handicap of piezoceramic stack actuators. LX-4 enhances the piezoceramic actuator's mechanical leverage by a factor of four. The PBA model is based on dynamic elastic stability and is analyzed using the Mathieu-Hill equation. A formal optimization is carried out using a newly developed meta-heuristic nature-inspired algorithm, named the bat algorithm (BA). The BA utilizes the echolocation capability of bats. An optimized PBA in conjunction with LX-4 generates end rotations of the order of 15° at the output end. The optimized PBA design is lighter and induces large end rotations, which will be useful in the development of various mechanical and aerospace devices, such as helicopter trailing-edge flaps, micro and nano aerial vehicles and other robotic systems.

  6. Analysis of Photothermal Characterization of Layered Materials: Design of Optimal Experiments

    Science.gov (United States)

    Cole, Kevin D.

    2003-01-01

    In this paper numerical calculations are presented for the steady-periodic temperature in layered materials and functionally-graded materials to simulate photothermal methods for the measurement of thermal properties. No laboratory experiments were performed. The temperature is found from a new Green's function formulation which is particularly well-suited to machine calculation. The simulation method is verified by comparison with literature data for a layered material. The method is applied to a class of two-component functionally-graded materials, and results for temperature and sensitivity coefficients are presented. An optimality criterion, based on the sensitivity coefficients, is used for choosing the experimental conditions needed for photothermal measurements to determine the spatial distribution of thermal properties. This method for optimal experiment design is completely general and may be applied to any photothermal technique and to any functionally-graded material.

  7. Optimal Design and Model Validation for Combustion Experiments in a Shock Tube

    KAUST Repository

    Long, Quan

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate functions. The control parameters are the initial hydrogen concentration and the temperature. First, we build a polynomial-based surrogate model for the observable related to the reactions in the shock tube. Second, we use a novel MAP-based approach to estimate the expected information gain in the proposed experiments and select the best experimental set-ups corresponding to the optimal expected information gains. Third, we use synthetic data to carry out a virtual validation of our methodology.
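    For illustration only, the sketch below estimates expected information gain with a plain nested Monte Carlo estimator rather than the MAP-based approach mentioned in the abstract; the Arrhenius-like observable, the prior, and the noise level are all assumptions, not taken from the KAUST setup.

```python
# Hedged sketch: nested Monte Carlo estimate of expected information gain (EIG)
# for choosing a control temperature; model, prior, and noise are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def observable(log_A, Ea, T):
    """Toy ignition-delay-like observable: log tau = -log A + Ea / (R T)."""
    R = 8.314
    return -log_A + Ea / (R * T)

def expected_information_gain(T, n_outer=200, n_inner=200, sigma=0.05):
    # Prior over uncertain kinetic parameters (illustrative values).
    log_A = rng.normal(25.0, 1.0, size=n_outer)
    Ea = rng.normal(1.5e5, 1.0e4, size=n_outer)
    y = observable(log_A, Ea, T) + sigma * rng.standard_normal(n_outer)

    eig = 0.0
    for i in range(n_outer):
        # Log-likelihood under the parameter draw that generated y[i]
        # (the Gaussian normalization constant cancels against the evidence term).
        log_lik = -0.5 * ((y[i] - observable(log_A[i], Ea[i], T)) / sigma) ** 2
        # Log evidence estimated with fresh prior samples.
        lA = rng.normal(25.0, 1.0, size=n_inner)
        ea = rng.normal(1.5e5, 1.0e4, size=n_inner)
        lik = np.exp(-0.5 * ((y[i] - observable(lA, ea, T)) / sigma) ** 2)
        eig += log_lik - np.log(lik.mean() + 1e-300)
    return eig / n_outer

for T in (950.0, 1100.0, 1250.0):          # candidate shock-tube temperatures (K)
    print(T, round(expected_information_gain(T), 3))
```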

  8. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    Science.gov (United States)

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

    The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from the experimental design to the flux map, including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/

  9. Portfolio optimization using Mixture Design of Experiments. Scheduling trades within electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Francisco Alexandre de; Paiva, Anderson Paulo de; Lima, Jose Wanderley Marangon; Balestrassi, Pedro Paulo; Mendes, Rona Rinston Amaury [Federal Univ. of Itajuba, Minas Gerais (Brazil)

    2011-01-15

    Deregulation of the electricity sector has given rise to several approaches to defining optimal portfolios of energy contracts. Financial tools - requiring substantial adjustments - are usually used to determine risk and return. This article presents a novel approach to adjusting the conditional value at risk (CVaR) metric to the mix of contracts on the energy markets; the approach uses Mixture Design of Experiments (MDE). In this kind of experimental strategy, the design factors are treated as proportions in a mixture system considered quite adequate for treating portfolios in general. Instead of using traditional linear programming, the concept of desirability function is here used to combine the multi-response, nonlinear objective functions for mean with the variance of a specific portfolio obtained through MDE. The maximization of the desirability function is implied in the portfolio optimization, generating an efficient recruitment frontier. This approach offers three main contributions: it includes risk aversion in the optimization routine, it assesses interaction between contracts, and it lessens the computational effort required to solve the constrained nonlinear optimization problem. A case study based on the Brazilian energy market is used to illustrate the proposal. The numerical results verify the proposal's adequacy. (author)
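    The following sketch shows, in generic form, how individual desirability functions for a "larger-is-better" mean return and a "smaller-is-better" risk measure can be combined and maximized over a mixture of contracts; the response models, ranges, and targets are made-up placeholders, not the case study's fitted MDE models.

```python
# Hedged sketch: Derringer-style desirability functions combined over a grid of
# three-contract mixtures. All response surfaces and limits are hypothetical.
import numpy as np

def d_larger_is_better(y, low, high, weight=1.0):
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** weight

def d_smaller_is_better(y, low, high, weight=1.0):
    return np.clip((high - y) / (high - low), 0.0, 1.0) ** weight

def overall_desirability(mean_return, risk):
    d1 = d_larger_is_better(mean_return, low=0.02, high=0.10)
    d2 = d_smaller_is_better(risk, low=0.01, high=0.08)
    return np.sqrt(d1 * d2)            # geometric mean of the individual desirabilities

# Evaluate a grid of three-contract mixtures (proportions summing to 1).
best = None
for w1 in np.linspace(0, 1, 21):
    for w2 in np.linspace(0, 1 - w1, 21):
        w3 = 1 - w1 - w2
        mean_ret = 0.04 * w1 + 0.07 * w2 + 0.09 * w3                   # hypothetical model
        risk = 0.02 * w1 + 0.05 * w2 + 0.08 * w3 + 0.03 * w2 * w3      # hypothetical model
        D = overall_desirability(mean_ret, risk)
        if best is None or D > best[0]:
            best = (D, (round(w1, 2), round(w2, 2), round(w3, 2)))
print("max desirability %.3f at mixture %s" % best)
```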

  10. Cavitation optimization for a centrifugal pump impeller by using orthogonal design of experiment

    Science.gov (United States)

    Pei, Ji; Yin, Tingyun; Yuan, Shouqi; Wang, Wenjie; Wang, Jiabin

    2017-01-01

    Cavitation is one of the most important performance characteristics of centrifugal pumps. However, current optimization work on centrifugal pumps mostly focuses on hydraulic efficiency only, which may result in poor cavitation performance. Therefore, it is necessary to find an appropriate solution to improve cavitation performance with acceptable efficiency. In this paper, to improve the cavitation performance of a centrifugal pump with a vaned diffuser, the influence of impeller geometric parameters on the cavitation of the pump is investigated using an orthogonal design of experiment (DOE) based on computational fluid dynamics. The impeller inlet diameter D1, inlet incidence angle Δβ, and blade wrap angle φ are selected as the main impeller geometric parameters and an orthogonal L9(3*3) experiment is performed. Three-dimensional steady simulations for cavitation are conducted using a constant gas mass fraction model with second-order upwind discretization, and the predicted cavitation performance is validated by laboratory experiment. The optimization results are obtained by the range analysis method to improve cavitation performance without an obvious decrease in the efficiency of the centrifugal pump. The internal flow of the pump is analyzed in order to identify the flow behavior that can affect cavitation performance. The results show that D1 has the greatest influence on the pump cavitation and the final optimized impeller provides better flow distribution at the blade leading edge. The final optimized impeller achieves better cavitation and hydraulic performance, and the NPSHR decreases by 0.63 m compared with the original one. The presented work supplies a feasible route in engineering practice to optimize a centrifugal pump impeller for better cavitation performance.
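    A minimal sketch of the orthogonal-array range analysis described above is given below; the nine-run, three-level array is a standard construction, but the factor levels and NPSHr responses are hypothetical placeholders rather than the paper's CFD results.

```python
# Hedged sketch: L9-type orthogonal array (three factors, three levels) and
# range analysis. The NPSHr responses are hypothetical placeholders.
import numpy as np

L9 = np.array([  # nine runs, levels coded 1..3; every level pair occurs once per column pair
    [1, 1, 1], [1, 2, 2], [1, 3, 3],
    [2, 1, 2], [2, 2, 3], [2, 3, 1],
    [3, 1, 3], [3, 2, 1], [3, 3, 2],
])
factors = ["D1", "incidence", "wrap_angle"]
npshr = np.array([3.1, 2.8, 2.9, 2.6, 2.4, 2.7, 2.5, 2.3, 2.6])  # hypothetical (m)

for j, name in enumerate(factors):
    level_means = [npshr[L9[:, j] == lvl].mean() for lvl in (1, 2, 3)]
    spread = max(level_means) - min(level_means)      # range: larger = more influential
    best = int(np.argmin(level_means)) + 1            # smaller NPSHr is better
    print(f"{name:>10s}: level means {np.round(level_means, 3)}, "
          f"range {spread:.3f}, best level {best}")
```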

  11. Optimization of biomolecule separation by combining microscale filtration and design-of-experiment methods.

    Science.gov (United States)

    Kazemi, Amir S; Kawka, Karina; Latulippe, David R

    2016-10-01

    There is considerable interest in developing microscale (i.e., high-throughput) methods that enable multiple filtration experiments to be run in parallel with smaller sample amounts and thus reduce the overall required time and associated cost to run the filtration tests. Previous studies to date have focused on simply evaluating the filtration capacity, not the separation performance. In this work, the stirred-well filtration (SWF) method was used in combination with design-of-experiment (DOE) methods to optimize the separation performance for three binary mixtures of bio-molecules: protein-protein, protein-polysaccharide, and protein-DNA. Using the parallel based format of the SWF method, eight constant-flux ultrafiltration experiments were conducted at once to study the effects of stirring conditions, permeate flux, and/or solution conditions (pH, ionic strength). Four separate filtration tests were conducted for each combination of process variables; in total, over 100 separate tests were conducted. The sieving coefficient and selectivity results are presented to match the DOE design format and enable a greater understanding of the effects of the different process variables that were studied. The method described herein can be used to rapidly determine the optimal combination of process factors that give the best separation performance for a range of membrane-based separations applications and thus obviate the need to run a large number of traditional lab-scale tests. Biotechnol. Bioeng. 2016;113: 2131-2139. © 2016 Wiley Periodicals, Inc.

  12. Optimization of the media ingredients for cutinase production from Colletotrichum lindemuthianum using mixture design experiments.

    Science.gov (United States)

    Rispoli, Fred; Shah, Vishal

    2008-01-01

    Optimal concentrations of yeast extract, glucose and potassium phosphate in the fermentation medium have been identified for maximum cutinase production from the fungus Colletotrichum lindemuthianum. A simplex lattice experimental design for mixtures was used to identify concentration ranges that yield maximum production. Three sets of experiments were performed, all involving three ingredients. The sets of experiments differ in the number of concentration levels considered. In the first set we consider four concentration levels (i.e., 0%, 33%, 67%, 100%), and in the second and third sets we consider five and six levels, respectively. Results showed that the interactions among the nutrient ingredients are best captured when five- and six-level experiments are carried out. An algorithm has been proposed in this study to design the optimal medium. Various models were also developed to predict the enzyme production, and the cubic model obtained using the six-level experimental data performed best. The study also highlighted the synergistic interaction between yeast extract and glucose toward cutinase production.
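    As a small illustration of the simplex-lattice idea, the sketch below enumerates the {3, m} lattice points for three ingredients at four, five and six concentration levels; the points are generic coded proportions, not the study's actual concentrations.

```python
# Hedged sketch: enumerate {3, m} simplex-lattice mixture designs, i.e. all
# three-component mixtures whose proportions are multiples of 1/m and sum to 1.
from itertools import product

def simplex_lattice(n_components=3, m=3):
    pts = []
    for combo in product(range(m + 1), repeat=n_components):
        if sum(combo) == m:
            pts.append(tuple(round(c / m, 3) for c in combo))
    return pts

# Four levels (0, 1/3, 2/3, 1) corresponds to m = 3; five levels to m = 4, etc.
for m in (3, 4, 5):
    pts = simplex_lattice(3, m)
    print(f"m = {m}: {len(pts)} design points, e.g. {pts[:3]}")
```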

  13. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.

    Science.gov (United States)

    Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C

    2016-05-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.

  15. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Institute of Scientific and Technical Information of China (English)

    Huang Jiangtao; Gao Zhenghong; Zhou Zhu; Zhao Ke

    2015-01-01

    The experimental design method is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within the method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Then the supplementary sampling positions are detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of the hypersurface curvature so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor for interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced into the C criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on an airfoil/wing aerodynamic optimization design problem, which has a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples, but also effectively improves the prediction accuracy of the surrogate model, which gives it broad prospects for applications.
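    The sketch below shows a generic RBF-surrogate adaptive-sampling loop in one dimension, scoring candidate infill points by distance to existing samples (a crowding proxy) times a local curvature estimate; it is an assumed, simplified stand-in for the RCE criterion, and the objective function is a toy substitute for an expensive aerodynamic evaluation.

```python
# Hedged sketch of a generic RBF-surrogate adaptive-sampling loop
# (not the paper's RCE criterion).
import numpy as np

def true_fn(x):                       # stand-in for an expensive CFD evaluation
    return np.sin(3 * x) + 0.3 * x**2

def rbf_fit(X, y, eps=3.0):
    Phi = np.exp(-(eps * (X[:, None] - X[None, :]))**2)       # Gaussian RBF kernel
    return np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)   # interpolation weights

def rbf_eval(Xc, w, x, eps=3.0):
    return np.exp(-(eps * (x[:, None] - Xc[None, :]))**2) @ w

X = np.linspace(0.0, 2.0, 5)          # basic sparse samples
y = true_fn(X)
cand = np.linspace(0.0, 2.0, 201)     # candidate infill locations

for _ in range(6):                    # add 6 infill samples
    w = rbf_fit(X, y)
    s = rbf_eval(X, w, cand)
    curv = np.abs(np.gradient(np.gradient(s, cand), cand))        # curvature proxy
    dist = np.min(np.abs(cand[:, None] - X[None, :]), axis=1)     # crowding proxy
    x_new = cand[np.argmax(dist * (curv + 1e-12))]
    X, y = np.append(X, x_new), np.append(y, true_fn(x_new))

print(np.round(np.sort(X), 3))        # infill concentrates where curvature is high
```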

  16. Vehicle occupancy detection camera position optimization using design of experiments and standard image references

    Science.gov (United States)

    Paul, Peter; Hoover, Martin; Rabbani, Mojgan

    2013-03-01

    Camera positioning and orientation is important to applications in domains such as transportation, since the objects to be imaged vary greatly in shape and size. In a typical transportation application that requires capturing still images, inductive loops buried in the ground or laser trigger sensors are used to trigger the image capture system when a vehicle reaches the image capture zone. The camera in such a system is in a fixed position pointed at the roadway and at a fixed orientation. Thus the problem is to determine the optimal location and orientation of the camera when capturing images from a wide variety of vehicles. Methods from Design for Six Sigma, including identifying important parameters and noise sources and performing systematically designed experiments (DOE), can be used to determine an effective set of parameter settings for the camera position and orientation under these conditions. In the transportation application of high-occupancy-vehicle lane enforcement, the number of passengers in the vehicle is to be counted. Past work has described front-seat vehicle occupant counting using a camera mounted on an overhead gantry looking through the front windshield in order to capture images of vehicle occupants. However, viewing rear-seat passengers is more problematic due to obstructions including the vehicle body frame structures and seats. One approach is to view the rear seats through the side window. In this situation the problem of optimally positioning and orienting the camera to adequately capture the rear seats through the side window can be addressed through a designed experiment. In any automated traffic enforcement system it is necessary for humans to be able to review any automatically captured digital imagery in order to verify detected infractions. Thus, for defining an output to be optimized for the designed experiment, a human-defined standard image reference (SIR) was used to quantify the quality of the line-of-sight to the rear seats of the vehicle.

  17. Process optimization by use of design of experiments: Application for liposomalization of FK506.

    Science.gov (United States)

    Toyota, Hiroyasu; Asai, Tomohiro; Oku, Naoto

    2017-03-07

    Design of experiments (DoE) can accelerate the optimization of drug formulations, especially complex formulations such as drugs using delivery systems. Administration of FK506 encapsulated in liposomes (FK506 liposomes) is an effective approach to treat acute stroke in animal studies. To provide FK506 liposomes as a brain-protective agent, it is necessary to manufacture these liposomes with good reproducibility. The objective of this study was to confirm the usefulness of DoE for the process-optimization study of FK506 liposomes. The Box-Behnken design was used to evaluate the effect of the process parameters on the properties of FK506 liposomes. The results of multiple regression analysis showed that there was an interaction between the hydration temperature and the freeze-thaw cycle on both the particle size and the encapsulation efficiency. An increase in the PBS hydration volume resulted in an increase in encapsulation efficiency. The process parameters had no effect on the ζ-potential. The multiple regression equation showed good predictability of the particle size and the encapsulation efficiency. These results indicated that manufacturing conditions must be taken into consideration to prepare liposomes with desirable properties. DoE would thus be a promising approach to optimize the conditions for the manufacturing of liposomes.
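    The sketch below constructs a three-factor Box-Behnken design in coded units, using the three process parameters named in the abstract as illustrative factor labels; the coded levels would still have to be mapped to actual temperatures, cycle counts and volumes.

```python
# Hedged sketch: three-factor Box-Behnken design in coded (-1, 0, +1) units.
import itertools
import numpy as np

def box_behnken(k=3, center_runs=3):
    runs = []
    for i, j in itertools.combinations(range(k), 2):      # edge midpoints of the cube
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k] * center_runs                        # replicated center points
    return np.array(runs, float)

design = box_behnken(3)
factors = ["hydration_temp", "freeze_thaw_cycles", "PBS_volume"]  # illustrative labels
print(len(design), "runs")            # 12 edge-midpoint runs + 3 center points
for row in design[:5]:
    print(dict(zip(factors, row)))
```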

  18. Population Fisher information matrix and optimal design of discrete data responses in population pharmacodynamic experiments.

    Science.gov (United States)

    Ogungbenro, Kayode; Aarons, Leon

    2011-08-01

    In the recent years, interest in the application of experimental design theory to population pharmacokinetic (PK) and pharmacodynamic (PD) experiments has increased. The aim is to improve the efficiency and the precision with which parameters are estimated during data analysis and sometimes to increase the power and reduce the sample size required for hypothesis testing. The population Fisher information matrix (PFIM) has been described for uniresponse and multiresponse population PK experiments for design evaluation and optimisation. Despite these developments and availability of tools for optimal design of population PK and PD experiments much of the effort has been focused on repeated continuous variable measurements with less work being done on repeated discrete type measurements. Discrete data arise mainly in PDs e.g. ordinal, nominal, dichotomous or count measurements. This paper implements expressions for the PFIM for repeated ordinal, dichotomous and count measurements based on analysis by a mixed-effects modelling technique. Three simulation studies were used to investigate the performance of the expressions. Example 1 is based on repeated dichotomous measurements, Example 2 is based on repeated count measurements and Example 3 is based on repeated ordinal measurements. Data simulated in MATLAB were analysed using NONMEM (Laplace method) and the glmmML package in R (Laplace and adaptive Gauss-Hermite quadrature methods). The results obtained for Examples 1 and 2 showed good agreement between the relative standard errors obtained using the PFIM and simulations. The results obtained for Example 3 showed the importance of sampling at the most informative time points. Implementation of these expressions will provide the opportunity for efficient design of population PD experiments that involve discrete type data through design evaluation and optimisation.

  19. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology.

    Science.gov (United States)

    Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin

    2008-07-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.

  20. 'Unconventional' experiments in biology and medicine with optimized design based on quantum-like correlations.

    Science.gov (United States)

    Beauvais, Francis

    2017-02-01

    In previous articles, a description of 'unconventional' experiments (e.g. in vitro or clinical studies based on high dilutions, 'memory of water' or homeopathy) using quantum-like probability was proposed. Because the mathematical formulations of quantum logic are frequently an obstacle for physicians and biologists, a modified modeling that rests on classical probability is described in the present article. This modeling is inspired from a relational interpretation of quantum physics that applies not only to microscopic objects, but also to macroscopic structures, including experimental devices and observers. In this framework, any outcome of an experiment is not an absolute property of the observed system as usually considered but is expressed relatively to an observer. A team of interacting observers is thus described from an external view point based on two principles: the outcomes of experiments are expressed relatively to each observer and the observers agree on outcomes when they interact with each other. If probability fluctuations are also taken into account, correlations between 'expected' and observed outcomes emerge. Moreover, quantum-like correlations are predicted in experiments with local blind design but not with centralized blind design. No assumption on 'memory' or other physical modification of water is necessary in the present description although such hypotheses cannot be formally discarded. In conclusion, a simple modeling of 'unconventional' experiments based on classical probability is now available and its predictions can be tested. The underlying concepts are sufficiently intuitive to be spread into the homeopathy community and beyond. It is hoped that this modeling will encourage new studies with optimized designs for in vitro experiments and clinical trials. Copyright © 2017 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  1. Single and Multiresponse Adaptive Design of Experiments with Application to Design Optimization of Novel Heat Exchangers

    Science.gov (United States)

    2009-01-01

  2. Optimal fed-batch experiment design for estimation of Monod kinetics of Azospirillum brasilense: from theory to practice.

    Science.gov (United States)

    Cappuyns, Astrid M; Bernaerts, Kristel; Smets, Ilse Y; Ona, Ositadinma; Prinsen, Els; Vanderleyden, Jos; Van Impe, Jan F

    2007-01-01

    In this paper the problem of reliable and accurate parameter estimation for unstructured models is considered. It is illustrated how a theoretically optimal design can be successfully translated into a practically feasible, robust, and informative experiment. The well-known problem of estimating Monod kinetic parameters is used as a vehicle to illustrate our approach. As has been known for a long time, noisy batch measurements do not allow for unique and accurate estimation of the kinetic parameters of the Monod model. Techniques of optimal experiment design are, therefore, exploited to design informative experiments and to improve the parameter estimation accuracy. During the design process, practical feasibility has to be kept in mind. The designed experiments are easy to implement in practice and do not require additional monitoring equipment. Both the design and the experimental validation of informative fed-batch experiments are illustrated with a case study, namely, the growth of the nitrogen-fixing bacterium Azospirillum brasilense.

  3. Robust optimization of the output voltage of nanogenerators by statistical design of experiments

    KAUST Repository

    Song, Jinhui

    2010-09-01

    Nanogenerators were first demonstrated by deflecting aligned ZnO nanowires using a conductive atomic force microscopy (AFM) tip. The output of a nanogenerator is affected by three parameters: tip normal force, tip scanning speed, and tip abrasion. In this work, systematic experimental studies have been carried out to examine the combined effects of these three parameters on the output, using statistical design of experiments. A statistical model has been built to analyze the data and predict the optimal parameter settings. For an AFM tip of cone angle 70° coated with Pt, and ZnO nanowires with a diameter of 50 nm and lengths of 600 nm to 1 μm, the optimized parameters for the nanogenerator were found to be a normal force of 137 nN and scanning speed of 40 μm/s, rather than the conventional settings of 120 nN for the normal force and 30 μm/s for the scanning speed. A nanogenerator with the optimized settings has three times the average output voltage of one with the conventional settings. © 2010 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.

  4. Recent experience with multidisciplinary analysis and optimization in advanced aircraft design

    Science.gov (United States)

    Dollyhigh, Samuel M.; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    The task of modern aircraft design has always been complicated due to the number of intertwined technical factors from the various engineering disciplines. Furthermore, this complexity has been rapidly increased by the development of such technologies as aeroelastically tailored materials and structures, active control systems, integrated propulsion/airframe controls, thrust vectoring, and so on. Successful designs that achieve maximum advantage from these new technologies require a thorough understanding of the physical phenomena and the interactions among these phenomena. A study commissioned by the Aeronautical Sciences and Evaluation Board of the National Research Council has gone so far as to identify technology integration as a new discipline from which many future aeronautical advancements will arise. Regardless of whether one considers integration a new discipline or not, it is clear to all engineers involved in aircraft design and analysis that better methods are required. In the past, designers conducted parametric studies in which a relatively small number of principal characteristics were varied to determine the effect on design requirements, which were themselves often diverse and contradictory. Once a design was chosen, it then passed through the various engineering disciplines, whose principal task was to make the chosen design workable. Working in a limited design space, the discipline expert sometimes improved the concept, but more often than not, the result was a penalty to make the original concept workable. If an insurmountable problem was encountered, the process began over. Most design systems that attempt to account for disciplinary interactions have large empirical elements, and reliance on past experience is a poor guide to obtaining maximum utilization of new technologies. Further compounding the difficulty of design is that as the aeronautical sciences have matured, the discipline specialist's area of research has generally

  5. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations.

    Science.gov (United States)

    Duarte, Belmiro P M; Wong, Weng Kee; Oliveira, Nuno M C

    2016-02-15

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations, to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require that the design space be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D-, A- and E-optimal designs using design problems in biochemical engineering and show that the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D-optimal designs but is computationally less efficient than the SDP formulation. The efficiencies of the designs generated by the two methods are generally very close, and so we recommend the SDP formulation in practice.
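    For orientation, the sketch below computes D-optimal approximate design weights on a discretized design space with the classical multiplicative algorithm, applied to a simple quadratic regression model on [-1, 1]; this is an assumed textbook illustration, not the paper's SDP or NLP formulation.

```python
# Hedged sketch: multiplicative algorithm for D-optimal approximate design
# weights over a discretized design space, for the model f(x) = (1, x, x^2).
import numpy as np

grid = np.linspace(-1, 1, 41)                                 # discretized design space
F = np.column_stack([np.ones_like(grid), grid, grid**2])      # regression vectors f(x)

w = np.full(len(grid), 1.0 / len(grid))                       # start from uniform weights
for _ in range(500):
    M = F.T @ (w[:, None] * F)                                # information matrix M(w)
    d = np.einsum("ij,jk,ik->i", F, np.linalg.inv(M), F)      # variance function d(x, w)
    w *= d / F.shape[1]                                       # multiplicative update (p = #params)
    w /= w.sum()

top = np.sort(np.argsort(w)[-3:])
print("support:", np.round(grid[top], 2), "weights:", np.round(w[top], 3))
# The D-optimal design for quadratic regression on [-1, 1] concentrates weight
# near -1, 0 and +1, with about 1/3 at each point in the limit.
```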

  6. DESIGN AND DEVELOPMENT OF ONDANSETRON ORALLY DISINTEGRATING TABLETS AND ITS OPTIMIZATION USING DESIGN OF EXPERIMENT

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar Bhasin et al.

    2012-03-01

    Full Text Available Ondansetron is the first of a new class of drugs, the selective serotonin (5-hydroxytryptamine type 3) receptor antagonists, used as an antiemetic in cancer chemotherapy. Its orally disintegrating tablet has been developed by RP Scherer Corporation and Scherer DDS using freeze-dried technology for patients who find swallowing difficult. The aim of this study was to design a new orally disintegrating tablet that has high hardness and a fast disintegration rate using conventional tablet technology. Ondansetron ODT was prepared using traditional techniques, namely direct compression and wet granulation. As the blend exhibited poor flow in the direct compression process, the wet granulation process was selected. The bitter taste of ondansetron has been masked by the use of a sweetener (aspartame) and peppermint flavor. Quick disintegration has been achieved by the use of a surfactant in the granulating solvent and a superdisintegrant (crospovidone) in both the intragranular and extragranular parts. A design space has been created using different concentrations of the binders as well as the disintegrant with the help of DOE, and a robust formulation has been made. The in vitro release profiles of the formulations prepared by freeze drying and by wet granulation match. The formulation prepared by the wet granulation process has been found acceptable to volunteers in terms of taste, mouth feel and convenience of administration.

  7. Systematic optimization of human pluripotent stem cells media using Design of Experiments

    Science.gov (United States)

    Marinho, Paulo A.; Chailangkarn, Thanathom; Muotri, Alysson R.

    2015-05-01

    Human pluripotent stem cells (hPSC) are used to study the early stages of human development in vitro and, increasingly due to somatic cell reprogramming, cellular and molecular mechanisms of disease. Cell culture medium is a critical factor for hPSC to maintain pluripotency and self-renewal. Numerous defined culture media have been empirically developed but never systematically optimized for culturing hPSC. We applied design of experiments (DOE), a powerful statistical tool, to improve the medium formulation for hPSC. Using pluripotency and cell growth as read-outs, we determined the optimal concentration of both basic fibroblast growth factor (bFGF) and neuregulin-1 beta 1 (NRG1β1). The resulting formulation, named iDEAL, improved the maintenance and passage of hPSC in both normal and stressful conditions, and affected trimethylated histone 3 lysine 27 (H3K27me3) epigenetic status after genetic reprogramming. It also enhances efficient hPSC plating as single cells. Altogether, iDEAL potentially allows scalable and controllable hPSC culture routine in translational research. Our DOE strategy could also be applied to hPSC differentiation protocols, which often require numerous and complex cell culture media.

  8. Wear Performance Optimization of Electroless Ni-B Coating Using Taguchi Design of Experiments

    Directory of Open Access Journals (Sweden)

    S. K. DAS

    2010-12-01

    Full Text Available The present study outlines the use of Taguchi parameter design to minimize the wear of electroless Ni-B coating by optimizing the tribological testing parameters. The tests are carried out in a multi-tribotester, and three parameters, viz. load (L), speed (S) and time (T), are considered at three levels each. An L27 array is used to accommodate the three factors as well as their interaction effects. The Taguchi experiments gave the optimal combination of parameters as L1S2T1 (50 N for load, 60 rpm for speed and 5 minutes for time). Furthermore, a statistical analysis of variance reveals that both load and time have a significant influence on the wear behavior of the electroless coating. The interaction between load and speed and that between load and time also influence wear quite significantly. The coating is characterized using scanning electron microscopy, energy dispersive X-ray analysis and X-ray diffraction analysis. The wear mechanism is also studied and found to be abrasive in nature.
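    The sketch below computes Taguchi "smaller-the-better" signal-to-noise ratios and level means for a single factor (load); the wear values and replicate structure are hypothetical placeholders, not the study's L27 data.

```python
# Hedged sketch: smaller-the-better S/N ratios, S/N = -10 log10(mean(y^2)),
# and mean S/N per level of one factor. Wear values are hypothetical.
import numpy as np

# Each entry: (load level 1..3, replicate wear measurements in micrometres)
trials = [
    (1, [12.0, 11.5]), (1, [13.2, 12.8]), (1, [12.5, 12.9]),
    (2, [15.1, 14.6]), (2, [16.0, 15.4]), (2, [15.7, 16.2]),
    (3, [19.3, 18.7]), (3, [20.1, 19.6]), (3, [18.9, 19.4]),
]

def sn_smaller_is_better(values):
    y = np.asarray(values, float)
    return -10.0 * np.log10(np.mean(y**2))

sn = np.array([sn_smaller_is_better(v) for _, v in trials])
levels = np.array([lvl for lvl, _ in trials])
for lvl in (1, 2, 3):
    print(f"load level {lvl}: mean S/N = {sn[levels == lvl].mean():.2f} dB")
# The level with the highest mean S/N gives the least wear for this factor.
```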

  9. Surrogate models and optimal design of experiments for chemical kinetics applications

    KAUST Repository

    Bisetti, Fabrizio

    2015-01-07

    Kinetic models for reactive flow applications comprise hundreds of reactions describing the complex interactions among many chemical species. Detailed knowledge of the reaction parameters is a key component of the design cycle of next-generation combustion devices, which aim at improving conversion efficiency and reducing pollutant emissions. Shock tubes are a laboratory-scale experimental configuration which is widely used for the study of reaction rate parameters. Important uncertainties exist in the values of the thousands of parameters included in the most advanced kinetic models. This talk discusses the application of uncertainty quantification (UQ) methods to the analysis of shock tube data as well as the design of shock tube experiments. Attention is focused on a spectral framework in which uncertain inputs are parameterized in terms of canonical random variables, and quantities of interest (QoIs) are expressed in terms of a mean-square convergent series of orthogonal polynomials acting on these variables. We outline the implementation of a recent spectral collocation approach for determining the unknown coefficients of the expansion, namely a sparse, adaptive pseudo-spectral construction that enables us to obtain surrogates for the QoIs accurately and efficiently. We first discuss the utility of the resulting expressions in quantifying the sensitivity of QoIs to uncertain inputs, and in the Bayesian inference of key physical parameters from experimental measurements. We then discuss the application of these techniques to the analysis of shock-tube data and the optimal design of shock-tube experiments for two key reactions in combustion kinetics: the chain-branching reaction H + O2 ←→ OH + O and the reaction of furans with the hydroxyl radical OH.
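    As a toy illustration of the surrogate idea, the sketch below fits a one-dimensional Legendre-polynomial surrogate to a smooth quantity of interest by least squares over a canonical uniform random variable; the QoI, degree and sample size are assumptions, and the adaptive sparse pseudo-spectral machinery of the talk is not reproduced.

```python
# Hedged sketch: 1-D Legendre polynomial surrogate fitted by least squares.
import numpy as np

def qoi(xi):
    """Toy QoI, e.g. a log ignition delay as a smooth function of a scaled rate parameter."""
    return np.exp(-0.8 * xi) + 0.1 * xi**2

deg, n_samples = 6, 200
xi = np.random.default_rng(1).uniform(-1, 1, n_samples)      # canonical random variable
V = np.polynomial.legendre.legvander(xi, deg)                 # Legendre design matrix
coef, *_ = np.linalg.lstsq(V, qoi(xi), rcond=None)            # expansion coefficients

xt = np.linspace(-1, 1, 5)
approx = np.polynomial.legendre.legval(xt, coef)
print(np.round(approx - qoi(xt), 5))   # surrogate error at test points (near zero)
```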

  10. Real-time 2D spatially selective MRI experiments: Comparative analysis of optimal control design methods.

    Science.gov (United States)

    Maximov, Ivan I; Vinding, Mads S; Tse, Desmond H Y; Nielsen, Niels Chr; Shah, N Jon

    2015-05-01

    There is an increasing need for the development of advanced radio-frequency (RF) pulse techniques in modern magnetic resonance imaging (MRI) systems, driven by recent advancements in ultra-high magnetic field systems, new parallel transmit/receive coil designs, and accessible powerful computational facilities. 2D spatially selective RF pulses are an example of advanced pulses that have many applications of clinical relevance, e.g., reduced field-of-view imaging and MR spectroscopy. The 2D spatially selective RF pulses are mostly generated and optimised with numerical methods that can handle vast controls and multiple constraints. With this study we aim to demonstrate that numerical, optimal control (OC) algorithms are efficient for the design of 2D spatially selective MRI experiments when robustness towards, e.g., field inhomogeneity is in focus. We have chosen three popular OC algorithms: two gradient-based, concurrent methods using first- and second-order derivatives, respectively, and a third that belongs to the sequential, monotonically convergent family. We used two experimental models: a water phantom and an in vivo human head. Taking into consideration the challenging experimental setup, our analysis suggests the use of the sequential, monotonic approach and the second-order gradient-based approach when computational speed, experimental robustness, and image quality are key. All algorithms used in this work were implemented in the MATLAB environment and are freely available to the MRI community.

  11. Optimization of the mechanism from equipment for overcoming obstacles with design of experiments

    Directory of Open Access Journals (Sweden)

    Gheorghiță Vlad

    2017-01-01

    Full Text Available Data collection considers how the experimental factors, controlled and uncontrolled, fit together into a model that will meet the specific objectives of an experiment and satisfy the practical constraints of resources and time. Customizing any product to improve performance requires a reliable method to measure performance results, in order to accurately assess the effects that design changes have on them. This paper analyses equipment for overcoming obstacles. A comprehensive approach to modelling the equipment requires performance data related to the design parameters. Slipping, slanting, becoming unstable, sliding, overturning and blocking while climbing steps and stairs are common problems and may cause instability for the user of the equipment. The Taguchi method is used to optimize several parameters of the proposed mechanism and to ensure stability through horizontality of the equipment. This method provides an opportunity to reduce the required sample size and is implemented with an orthogonal experimental array, which enables the effects of multiple parameters to be assessed simultaneously.

  12. Augmented design and analysis of computer experiments: a novel tolerance embedded global optimization approach applied to SWIR hyperspectral illumination design.

    Science.gov (United States)

    Keresztes, Janos C; John Koshel, R; D'huys, Karlien; De Ketelaere, Bart; Audenaert, Jan; Goos, Peter; Saeys, Wouter

    2016-12-26

    A novel meta-heuristic approach for minimizing nonlinear constrained problems is proposed, which offers tolerance information during the search for the global optimum. The method is based on the concept of design and analysis of computer experiments combined with a novel two-phase design augmentation (DACEDA), which models the entire merit space using a Gaussian process, with iteratively increased resolution around the optimum. The algorithm is introduced through a series of case studies with increasing complexity for optimizing the uniformity of a short-wave infrared (SWIR) hyperspectral imaging (HSI) illumination system (IS). The method is first demonstrated for a two-dimensional problem consisting of the positioning of analytical isotropic point sources. The method is further applied to two-dimensional (2D) and five-dimensional (5D) SWIR HSI IS versions using close- and far-field measured source models applied within the non-sequential ray-tracing software FRED, including inherent stochastic noise. The proposed method is compared to other heuristic approaches such as simplex and simulated annealing (SA). It is shown that DACEDA converges towards a minimum with a 1% improvement compared to simplex and SA, and, more importantly, requires only half the number of simulations. Finally, a concurrent tolerance analysis is done within DACEDA for the five-dimensional case such that further simulations are not required.
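    The sketch below conveys the general flavour of Gaussian-process-based design augmentation: an initial space-filling sample is iteratively refined where the predicted merit is low and the predictive uncertainty is high. It is a generic one-dimensional loop built on scikit-learn, not the DACEDA algorithm, and the merit function is a toy stand-in for a ray-traced uniformity metric.

```python
# Hedged sketch: generic two-phase GP-based design augmentation loop.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def merit(x):                                   # toy stand-in for a ray-traced uniformity merit
    return (x - 0.3)**2 + 0.05 * np.sin(12 * x)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 8).reshape(-1, 1)         # phase 1: space-filling samples
y = merit(X).ravel()

for _ in range(10):                             # phase 2: augment around the optimum
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6).fit(X, y)
    cand = np.linspace(0, 1, 201).reshape(-1, 1)
    mu, sd = gp.predict(cand, return_std=True)
    score = -(mu - mu.min()) + 2.0 * sd          # favour low predicted merit + high uncertainty
    x_new = cand[np.argmax(score)].reshape(1, 1)
    X, y = np.vstack([X, x_new]), np.append(y, merit(x_new).ravel())

print("best sampled point:", X[np.argmin(y), 0], "merit:", y.min())
```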

  13. Optimal de novo design of MRM experiments for rapid assay development in targeted proteomics.

    Science.gov (United States)

    Bertsch, Andreas; Jung, Stephan; Zerck, Alexandra; Pfeifer, Nico; Nahnsen, Sven; Henneges, Carsten; Nordheim, Alfred; Kohlbacher, Oliver

    2010-05-07

    Targeted proteomic approaches such as multiple reaction monitoring (MRM) overcome problems associated with classical shotgun mass spectrometry experiments. Developing MRM quantitation assays can be time consuming, because relevant peptide representatives of the proteins must be found and their retention time and the product ions must be determined. Given the transitions, hundreds to thousands of them can be scheduled into one experiment run. However, it is difficult to select which of the transitions should be included into a measurement. We present a novel algorithm that allows the construction of MRM assays from the sequence of the targeted proteins alone. This enables the rapid development of targeted MRM experiments without large libraries of transitions or peptide spectra. The approach relies on combinatorial optimization in combination with machine learning techniques to predict proteotypicity, retention time, and fragmentation of peptides. The resulting potential transitions are scheduled optimally by solving an integer linear program. We demonstrate that fully automated construction of MRM experiments from protein sequences alone is possible and over 80% coverage of the targeted proteins can be achieved without further optimization of the assay.

  14. Optimal Experience of Web Activities.

    Science.gov (United States)

    Chen, Hsiang; Wigand, R. T.; Nilan, M. S.

    1999-01-01

    Reports on Web users' optimal flow experiences to examine positive aspects of Web experiences that could be linked to theory applied to other media and then incorporated into Web design. Discusses the use of content-analytic procedures to analyze open-ended questionnaires that examined Web users' perceived flow experiences. (Author/LRW)

  15. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    Science.gov (United States)

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.

  16. Competitive Comparison of Optimal Designs of Experiments for Sampling-based Sensitivity Analysis

    CERN Document Server

    Janouchova, Eliska

    2012-01-01

    Nowadays, the numerical models of real-world structures are more precise, more complex and, of course, more time-consuming. Despite the growth in computational effort, the exploration of model behaviour remains a complex task. Sensitivity analysis is a basic tool for investigating the sensitivity of a model to its inputs. One widely used strategy to assess the sensitivity is based on a finite set of simulations for given sets of input parameters, i.e. points in the design space. An estimate of the sensitivity can then be obtained by computing correlations between the input parameters and the chosen response of the model. The accuracy of the sensitivity prediction depends on the choice of design points, called the design of experiments. The aim of the presented paper is to review and compare available criteria determining the quality of designs of experiments suitable for sampling-based sensitivity analysis.
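    The sketch below illustrates the sampling-based strategy itself: a small Latin hypercube design is generated, a toy model is evaluated at the design points, and input-output correlations serve as sensitivity estimates. The model, ranges, and sample size are assumptions chosen only for illustration; the paper's specific design-quality criteria are not implemented.

```python
# Hedged sketch: Latin hypercube sampling plus correlation-based sensitivity.
import numpy as np

def latin_hypercube(n, dim, rng):
    u = (rng.random((n, dim)) + np.arange(n)[:, None]) / n   # one sample per stratum
    for j in range(dim):
        rng.shuffle(u[:, j])                                  # independent column permutations
    return u

rng = np.random.default_rng(42)
X = latin_hypercube(100, 2, rng)
x1 = 1.0 + 4.0 * X[:, 0]           # map to illustrative physical ranges
x2 = 0.1 + 0.8 * X[:, 1]
y = x1**2 + 5.0 * x2               # toy model response

for name, x in [("x1", x1), ("x2", x2)]:
    r = np.corrcoef(x, y)[0, 1]
    print(f"sensitivity of y to {name}: correlation = {r:+.2f}")
```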

  17. Optimal mixture experiments

    CERN Document Server

    Sinha, B K; Pal, Manisha; Das, P

    2014-01-01

    The book dwells mainly on the optimality aspects of mixture designs. As mixture models are a special case of regression models, a general discussion on regression designs has been presented, which includes topics like continuous designs, the de la Garza phenomenon, Loewner order domination, equivalence theorems for different optimality criteria and standard optimality results for single-variable polynomial regression and multivariate linear and quadratic regression models. This is followed by a review of the available literature on estimation of parameters in mixture models. Based on recent research findings, the volume also introduces optimal mixture designs for estimation of optimum mixing proportions in different mixture models, which include Scheffé's quadratic model, the Darroch-Waller model, the log-contrast model, mixture-amount models, random coefficient models and multi-response models. Robust mixture designs and mixture designs in blocks have also been reviewed. Moreover, some applications of mixture desig...

  18. Time-saving design of experiment protocol for optimization of LC-MS data processing in metabolomic approaches.

    Science.gov (United States)

    Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine

    2013-08-06

    We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, which is based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter setting and the threshold method improved the reliability index about 9.5 times for a standards mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
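    As a small aside on the screening step, the sketch below builds an 8-run two-level Plackett-Burman-type screening design from a Sylvester Hadamard matrix; the XCMS-style parameter labels are illustrative assumptions, and real settings would be obtained by mapping the coded -1/+1 levels onto sensible low/high values of each parameter.

```python
# Hedged sketch: 8-run two-level screening design (columns of a Hadamard matrix)
# for up to 7 parameters. Parameter names are illustrative XCMS-style labels.
import numpy as np

def hadamard(n):
    """Sylvester construction; n must be a power of 2."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)
design = H[:, 1:]                      # drop the all-ones column -> 8 runs x 7 factors
params = ["ppm", "peakwidth_min", "peakwidth_max", "snthresh",
          "bw", "mzwid", "minfrac"]    # illustrative labels, not a prescribed XCMS setup
print(design.shape)                    # (8, 7): each column is a coded -1/+1 setting
for run in design[:3]:
    print(dict(zip(params, run)))
```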

  19. A statistical experiment design approach for optimizing biodegradation of weathered crude oil in coastal sediments.

    Science.gov (United States)

    Mohajeri, Leila; Aziz, Hamidi Abdul; Isa, Mohamed Hasnain; Zahed, Mohammad Ali

    2010-02-01

    This work studied the bioremediation of weathered crude oil (WCO) in coastal sediment samples using a central composite face centered design (CCFD) under response surface methodology (RSM). Initial oil concentration, biomass, nitrogen and phosphorus concentrations were used as independent variables (factors) and oil removal as the dependent variable (response) in a 60-day trial. A statistically significant model for WCO removal was obtained. The coefficient of determination (R(2)=0.9732) and probability value (P<0.0001) demonstrated significance for the regression model. Numerical optimization based on a desirability function was carried out for initial oil concentrations of 2, 16 and 30 g per kg sediment; 83.13, 78.06 and 69.92 per cent removal were observed, respectively, compared to 77.13, 74.17 and 69.87 per cent removal for the un-optimized results.
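
    The response-surface step described above amounts to fitting a full second-order polynomial to the design data and reporting its fit quality. The sketch below does this by least squares on hypothetical data (the design, response values and noise level are assumptions, not the study's measurements).

        import itertools
        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical data: 20 runs of a 3-factor design (coded units) and a response.
        X = rng.uniform(-1, 1, size=(20, 3))
        y = 70 + 5 * X[:, 0] - 3 * X[:, 1] ** 2 + 2 * X[:, 0] * X[:, 2] + rng.normal(0, 0.5, 20)

        def quadratic_terms(X):
            """Expand factor settings into a full second-order model matrix."""
            n, k = X.shape
            cols = [np.ones(n)]                       # intercept
            cols += [X[:, j] for j in range(k)]       # linear terms
            cols += [X[:, i] * X[:, j]                # two-factor interactions
                     for i, j in itertools.combinations(range(k), 2)]
            cols += [X[:, j] ** 2 for j in range(k)]  # pure quadratic terms
            return np.column_stack(cols)

        A = quadratic_terms(X)
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares fit
        y_hat = A @ beta
        r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
        print("coefficients:", np.round(beta, 3))
        print("R^2 =", round(r2, 4))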

  20. Optimization of Pb(II) biosorption by Robinia tree leaves using statistical design of experiments.

    Science.gov (United States)

    Zolgharnein, Javad; Shahmoradi, Ali; Sangi, Mohammad Reza

    2008-07-30

    The present study introduces Robinia tree leaves as a novel and efficient biosorbent for removing Pb(II) from aqueous solutions. In order to reduce the large number of experiments and find the highest removal efficiency of Pb(II), a full 2(3) factorial design with two blocks was performed in duplicate (16 experiments). In all experiments, the contact time was fixed at 25 min. The main and interaction effects of the three factors (sorbent mass, pH and initial metal-ion concentration) were considered. By using Student's t-test and analysis of variance (ANOVA), the main factors with the highest effect on the removal process were identified. Twenty-six experiments were then designed according to a Doehlert response surface design to obtain a mathematical model describing the functional relationship between the response and the main independent variables. The most suitable regression model, which fitted the experimental data extremely well, was chosen according to the lack-of-fit test and the adjusted R(2) value. Finally, after checking for possible outliers, the optimum conditions for maximum removal of Pb(II) from aqueous solution were obtained. The best conditions were calculated to be: initial concentration of Pb(II)=40 mg L(-1), pH 4.6 and sorbent concentration equal to 27.3 g L(-1).
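
    A minimal sketch of the screening design used above: a coded 2^3 full factorial, split into two blocks by confounding the block effect with the three-factor interaction. The factor names mirror the abstract; the blocking rule is the standard textbook choice, not necessarily the authors' exact scheme.

        import itertools
        import numpy as np

        # Coded 2^3 full factorial: every combination of the three factors at -1/+1.
        factors = ["sorbent_mass", "pH", "metal_conc"]
        runs = np.array(list(itertools.product([-1, 1], repeat=3)))

        # Split the 8 runs into two blocks by confounding the block effect with the
        # three-factor interaction ABC (the sign of the product of the columns).
        blocks = np.prod(runs, axis=1)

        for run, block in zip(runs, blocks):
            print(dict(zip(factors, run)), "block", 1 if block > 0 else 2)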

  1. Optimized design and analysis of sparse-sampling fMRI experiments

    Directory of Open Access Journals (Sweden)

    Tyler K Perrachione

    2013-04-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional timeseries. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of the repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter) and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically-informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to improve

  2. Optimization of process parameters for synthesis of silica–Ni nanocomposite by design of experiment

    Indian Academy of Sciences (India)

    A K Pramanick; M K Mitra; S Mukherjee; G C Das; B Duari

    2013-12-01

    The optimum combination of the experimental variables (temperature, time of heat treatment under nitrogen atmosphere and amount of Ni salt) was delineated to find the maximum yield of nanophase Ni in the silica gel matrix. The size of Ni in the silica gel was found to be 34 and 45 nm for the two chosen compositions, respectively. A statistically adequate regression equation, within the 95% confidence limit, was developed by carrying out a set of active experiments within the framework of design of experiment. The regression equation is found to indicate the beneficial role of temperature and time of heat treatment.

  3. Application of multi-factorial design of experiments to successfully optimize immunoassays for robust measurements of therapeutic proteins.

    Science.gov (United States)

    Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh

    2009-02-20

    Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge for implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were coating, detection antibody concentration, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard relative to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method

  4. The Lead-Free Solder Selection Method and Process Optimization Based on Design of Experiment

    Directory of Open Access Journals (Sweden)

    Wang Bing

    2013-07-01

    In this study, after examining the characteristics of lead-free solder, we introduce the QFD (Quality Function Deployment) method to transform the demands on production properties and process into technical requirements for the lead-free solder, thus transforming an abstract demand into concrete performance indices. Two indices, a technological competitiveness index and a market competitiveness index, are obtained from a series of experiments and used to evaluate the performance of the lead-free solder. We then utilize the design of experiments method to find the key process parameters and their best settings, which reduced the coplanarity of the tin balls from 178 to 149 and raised the process capability from 85% to 95.2%.

  5. MULTIDISCIPLINARY ROBUST OPTIMIZATION DESIGN

    Institute of Scientific and Technical Information of China (English)

    Chen Jianjiang; Xiao Renbin; Zhong Yifang; Dou Gang

    2005-01-01

    Because uncertainty factors inevitably exist in a multidisciplinary design environment, a hierarchical multidisciplinary robust optimization design based on response surfaces is proposed. The method constructs optimization models at the subsystem level and the system level to coordinate the coupling among subsystems; a response surface based on an artificial neural network is also introduced to provide information to the system-level optimization tool while maintaining the independence of the subsystems, i.e. to realize multidisciplinary parallel design. An application case of electrical packaging demonstrates that a reasonable robust optimum solution can be obtained and that this is a promising and efficient multidisciplinary robust optimization approach.

  6. Optimizing and Improving the Growth Quality of ZnO Nanowire Arrays Guided by Statistical Design of Experiments.

    Science.gov (United States)

    Xu, Sheng; Adiga, Nagesh; Ba, Shan; Dasgupta, Tirthankar; Wu, C F Jeff; Wang, Zhong Lin

    2009-07-28

    Controlling the morphology of as-synthesized nanostructures is usually challenging, and general theoretical guidance for the experimental approach is lacking. In this study, a novel way of optimizing the aspect ratio of hydrothermally grown ZnO nanowire (NW) arrays is presented, utilizing a systematic statistical design and analysis method. We use the pick-the-winner rule and one-pair-at-a-time main effect analysis to sequentially design the experiments and identify optimal reaction settings. By controlling the hydrothermal reaction parameters (reaction temperature, time, precursor concentration, and capping agent), we improved the aspect ratio of the ZnO NWs from around 10 to nearly 23. The effect of noise on the experimental results was identified and successfully reduced, and the statistical design and analysis methods were very effective in reducing the number of experiments performed and in identifying the optimal experimental settings. In addition, the antireflection spectrum of the as-synthesized ZnO NWs clearly shows that a higher aspect ratio of the ZnO NW arrays leads to about 30% stronger suppression of emission in the UV-vis range. This shows great potential for application as antireflective coating layers in photovoltaic devices.

  7. Optimal design of DEM synthesis experiments

    Institute of Scientific and Technical Information of China (English)

    王欣

    2012-01-01

    Orthogonal experimental design is an important mathematical method for studying multi-factor experiments. It is already widely applied in fields such as metallurgy, the chemical industry, rubber, textiles, radio, and medicine and health. According to previous work, the yield of DEM synthesis is mainly influenced by the reaction time, the catalyst proportion and the raw material ratio. A three-factor, three-level orthogonal array was used to design the DEM synthesis experiments, and nine groups of experiments were carried out (each group containing two to three parallel runs). Based on the experimental results and the statistical analysis methods of orthogonal experimental design, the influence of the three factors on the DEM yield was analysed, the dominant factor was identified, and the optimum reaction conditions were determined.
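
    For illustration, the standard L9(3^4) orthogonal array mentioned here can be written out directly and analysed by range (main-effect) analysis; the yield values below are hypothetical placeholders, not the paper's data.

        import numpy as np

        # Standard L9(3^4) orthogonal array (levels coded 0, 1, 2); three of the four
        # columns carry reaction time, catalyst proportion and raw material ratio.
        L9 = np.array([
            [0, 0, 0, 0],
            [0, 1, 1, 1],
            [0, 2, 2, 2],
            [1, 0, 1, 2],
            [1, 1, 2, 0],
            [1, 2, 0, 1],
            [2, 0, 2, 1],
            [2, 1, 0, 2],
            [2, 2, 1, 0],
        ])

        # Hypothetical mean yields of the nine experimental groups (one per row of L9).
        yields = np.array([61.2, 64.5, 63.1, 70.3, 68.8, 66.4, 72.1, 69.9, 71.5])

        # Range analysis: average the response at each level of each factor; the factor
        # with the largest range (max - min of level means) has the strongest influence.
        for col, name in enumerate(["time", "catalyst", "ratio"]):
            level_means = [yields[L9[:, col] == lvl].mean() for lvl in range(3)]
            print(name, "level means:", np.round(level_means, 2),
                  "range:", round(max(level_means) - min(level_means), 2))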

  8. Optimal Bayesian experimental design for priors of compact support with application to shock-tube experiments for combustion kinetics

    KAUST Repository

    Bisetti, Fabrizio

    2016-01-12

    The analysis of reactive systems in combustion science and technology relies on detailed models comprising many chemical reactions that describe the conversion of fuel and oxidizer into products and the formation of pollutants. Shock-tube experiments are a convenient setting for measuring the rate parameters of individual reactions. The temperature, pressure, and concentration of reactants are chosen to maximize the sensitivity of the measured quantities to the rate parameter of the target reaction. In this study, we optimize the experimental setup computationally by optimal experimental design (OED) in a Bayesian framework. We approximate the posterior probability density functions (pdf) using truncated Gaussian distributions in order to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method; more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate, and the covariance is chosen as the negative inverse of the Hessian of the misfit function at the MAP estimate. The model-related entities are obtained from a polynomial surrogate. The optimality, quantified by the information gain measures, can be estimated efficiently by a rejection sampling algorithm against the underlying Gaussian probability distribution, rather than against the true posterior. This approach offers a significant error reduction when the magnitude of the invariants of the posterior covariance is comparable to the size of the bounded domain of the prior. We demonstrate the accuracy and superior computational efficiency of our method for shock-tube experiments aiming to measure the model parameters of a key reaction which is part of the complex kinetic network describing the hydrocarbon oxidation. In the experiments, the initial temperature and fuel concentration are optimized with respect to the expected information gain in the estimation of the parameters of the target
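
    A minimal sketch of the Laplace construction described above, on a deliberately simple toy inverse problem (linear model, one parameter, uniform prior on a bounded interval); the data, noise level and bounds are assumptions and the combustion surrogate is not reproduced. The MAP estimate is found numerically and the posterior covariance is approximated by the inverse Hessian of the negative log-posterior at that point.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)

        # Toy inverse problem: observations y = a*x + noise; uniform prior a in [0, 2].
        x = np.linspace(0.0, 1.0, 20)
        a_true, sigma = 1.3, 0.05
        y = a_true * x + rng.normal(0.0, sigma, x.size)

        def neg_log_post(theta):
            a = theta[0]
            if not (0.0 <= a <= 2.0):          # bounded (uniform) prior support
                return np.inf
            return 0.5 * np.sum((y - a * x) ** 2) / sigma ** 2

        # MAP estimate by numerical optimisation.
        res = minimize(neg_log_post, x0=[1.0], method="Nelder-Mead")
        a_map = res.x[0]

        # Hessian of the negative log-posterior by central finite differences.
        h = 1e-4
        hess = (neg_log_post([a_map + h]) - 2 * neg_log_post([a_map])
                + neg_log_post([a_map - h])) / h ** 2
        cov = 1.0 / hess                       # Laplace covariance (scalar case)

        print("MAP:", round(a_map, 4), "posterior std (Laplace):", round(np.sqrt(cov), 4))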

  9. Optimal design of measurement settings for quantum-state-tomography experiments

    Science.gov (United States)

    Li, Jun; Huang, Shilin; Luo, Zhihuang; Li, Keren; Lu, Dawei; Zeng, Bei

    2017-09-01

    Quantum state tomography is an indispensable but costly part of many quantum experiments. Typically, it requires measurements to be carried out in a number of different settings on a fixed experimental setup. The collected data are often informationally overcomplete, with the amount of information redundancy depending on the particular set of measurement settings chosen. This raises a question about how one should optimally take data so that the number of measurement settings necessary can be reduced. Here, we cast this problem in terms of integer programming. For a given experimental setup, standard integer-programming algorithms allow us to find the minimum set of readout operations that can realize a target tomographic task. We apply the method to certain basic and practical state-tomographic problems in nuclear-magnetic-resonance experimental systems. The results show that considerably fewer readout operations can be found using our technique than by using the previous greedy search strategy. Therefore, our method could be helpful for simplifying measurement schemes to minimize the experimental effort.
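
    The paper formulates the reduction of measurement settings as an integer program; as a toy stand-in, the brute-force search below finds the smallest subset of hypothetical readout operations whose combined coverage spans all required tomographic quantities. The coverage matrix is invented for illustration and the exhaustive search only scales to small examples.

        import itertools

        # Hypothetical setup: 6 candidate readout operations, each yielding information
        # about a subset of 8 required tomographic quantities (indices 0..7).
        coverage = {
            "R0": {0, 1, 2},
            "R1": {2, 3},
            "R2": {3, 4, 5},
            "R3": {0, 5, 6},
            "R4": {6, 7},
            "R5": {1, 4, 7},
        }
        required = set(range(8))

        def minimum_readout_set(coverage, required):
            """Smallest subset of readout operations whose combined coverage is complete."""
            names = list(coverage)
            for size in range(1, len(names) + 1):
                for combo in itertools.combinations(names, size):
                    if set().union(*(coverage[c] for c in combo)) >= required:
                        return combo
            return None

        print(minimum_readout_set(coverage, required))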

  10. Experiment and optimal design of a collection device for a residual plastic film baler

    Directory of Open Access Journals (Sweden)

    Qi NIU,Xuegeng CHEN,Chao JI,Jie WU

    2015-12-01

    It is imperative to carry out research on residual plastic film collection technology to solve the serious problem of farmland pollution. The residual plastic film baler was designed to perform film strip collection, cleaning and baling. The collection device is a core component of the baler. Response surface analysis was used in this study to optimize the structure and working parameters for improving the collection efficiency of residual film and the impurity of the film package. The results show that the factors affecting the collection rate of residual film and the impurity of the film package are the speed ratio (k) between the trash removal roller and the eccentric collection mechanism, and the number (z) and mounting angle (θ) of spring teeth in the same revolution plane. For the collection rate, the importance of the three factors is in the order k>z>θ. Meanwhile, for the impurity, the importance of the three factors is in the order z>k>θ. When the speed ratio, the mounting angle and the number of spring teeth were set at 1.6, 45° and 8, respectively, the collection rate of residual film was 88.9% and the impurity of the residual film package was 14.2% for the baler.

  11. Optimization of parameters affecting signal intensity in an LTQ-orbitrap in negative ion mode: A design of experiments approach.

    Science.gov (United States)

    Lemonakis, Nikolaos; Skaltsounis, Alexios-Leandros; Tsarbopoulos, Anthony; Gikas, Evagelos

    2016-01-15

    A multistage optimization of all the parameters affecting detection/response in an LTQ-orbitrap analyzer was performed, using a design of experiments methodology. The signal intensity, a critical issue for mass analysis, was investigated and the optimization process was completed in three successive steps, taking into account the three main regions of an orbitrap: the ion generation, ion transmission and ion detection regions. Oleuropein and hydroxytyrosol were selected as the model compounds. Overall, applying this methodology the sensitivity was increased by more than 24% and the resolution by more than 6.5%, whereas the elapsed scan time was reduced by nearly half. A high-resolution LTQ Orbitrap Discovery mass spectrometer was used for the determination of the analytes of interest. Thus, oleuropein and hydroxytyrosol were infused via the instrument's syringe pump and analyzed employing electrospray ionization (ESI) in the negative high-resolution full-scan ion mode. The parameters of the three main regions of the LTQ-orbitrap were independently optimized in terms of maximum sensitivity. In this context, factorial design, response surface model and Plackett-Burman experiments were performed and analysis of variance was carried out to evaluate the validity of the statistical model and to determine the most significant parameters for signal intensity. The optimum MS conditions for each analyte were summarized and the overall optimum condition was achieved by maximizing the desirability function. Our observations showed good agreement between the predicted optimum response and the responses collected at the predicted optimum conditions.
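
    The desirability-function step used above (and in several other records in this list) can be sketched in a few lines; the ramp form, limits and response values below are hypothetical, following the common Derringer-style formulation rather than the authors' exact settings.

        import numpy as np

        def desirability_larger_is_better(y, low, target, weight=1.0):
            """Derringer-style desirability for a response to be maximised:
            0 below `low`, 1 at or above `target`, a power ramp in between."""
            d = (y - low) / (target - low)
            return np.clip(d, 0.0, 1.0) ** weight

        # Hypothetical responses for one candidate instrument setting.
        responses = {
            "sensitivity": (8.2e5, 1e5, 1e6),   # (value, low, target)
            "resolution":  (31000, 20000, 35000),
        }
        d_values = [desirability_larger_is_better(v, lo, tg) for v, lo, tg in responses.values()]

        # Overall desirability: geometric mean of the individual desirabilities.
        D = float(np.prod(d_values)) ** (1.0 / len(d_values))
        print("individual:", np.round(d_values, 3), "overall D:", round(D, 3))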

  12. Experiments Planning, Analysis, and Optimization

    CERN Document Server

    Wu, C F Jeff

    2011-01-01

    Praise for the First Edition: "If you . . . want an up-to-date, definitive reference written by authors who have contributed much to this field, then this book is an essential addition to your library."-Journal of the American Statistical Association Fully updated to reflect the major progress in the use of statistically designed experiments for product and process improvement, Experiments, Second Edition introduces some of the newest discoveries-and sheds further light on existing ones-on the design and analysis of experiments and their applications in system optimization, robustness, and tre

  13. Application of the design of experiments in optimization of drug layering of pellets with an insight into drug polymer interactions.

    Science.gov (United States)

    Kovacevic, Jovana; Ibric, Svetlana; Djuris, Jelena; Kleinebudde, Peter

    2016-06-15

    This study consists of two experimental designs. Within the first one, a suitable technique for applying the model drug onto inactive pellets was evaluated, and the formulation and process parameters with the greatest impact on process efficiency and useful yield were determined. The results showed that formulation characteristics had the greatest impact on coating efficiency and that the suspension layering technique was significantly better for drug application onto inactive pellets than solution layering, during which pronounced agglomeration of the pellets occurred. Analysis of drug-polymer interactions by differential scanning calorimetry was performed to explain these results. The reason for agglomeration of the pellets during solution layering was the formation of a low-Tg amorphous form of the model drug. The second set of experiments was performed according to a central composite design experimental plan in order to optimize the level of binder and the concentration of solids in the coating liquid, which were found to have the greatest positive impact on process efficiency and useful yield in the screening study. Statistically significant models were obtained by response surface methodology, and it was possible to use them to define optimal levels of excipients in the formulation.

  14. Optimal design of metabolic flux analysis experiments for anchorage-dependent mammalian cells using a cellular automaton model.

    Science.gov (United States)

    Meadows, Adam L; Roy, Siddhartha; Clark, Douglas S; Blanch, Harvey W

    2007-09-01

    Metabolic flux analysis (MFA) is widely used to quantify metabolic pathway activity. Typical applications involve isotopically labeled substrates, which require both metabolic and isotopic steady states for simplified data analysis. For bacterial systems, these steady states are readily achieved in chemostat cultures. However, mammalian cells are often anchorage dependent and experiments are typically conducted in batch or fed-batch systems, such as tissue culture dishes or microcarrier-containing bioreactors. Surface adherence may cause deviations from exponential growth, resulting in metabolically heterogeneous populations and a varying number of cellular "nearest neighbors" that may affect the observed metabolism. Here, we discuss different growth models suitable for deconvoluting these effects and their application to the design and optimization of MFA experiments employing surface-adherent mammalian cells. We describe a stochastic two-dimensional (2D) cellular automaton model, with empirical descriptions of cell number and non-growing cell fraction, suitable for easy application to most anchorage-dependent mammalian cell cultures. Model utility was verified by studying the impact of contact inhibition on the growth rate, specific extracellular flux rates, and isotopic labeling in lactate for MCF7 cells, a commonly studied breast cancer cell line. The model successfully defined the time over which exponential growth and a metabolically homogeneous growing cell population could be assumed. The cellular automaton model developed is shown to be a useful tool in designing optimal MFA experiments.
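
    As a rough illustration of the cellular-automaton idea (not the authors' calibrated model), the sketch below grows cells on a 2D grid, letting each occupied site divide into a random empty neighbour, and tracks the fraction of contact-inhibited (non-growing) cells; grid size, step count and division rule are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        # Minimal 2D cellular automaton: 1 = occupied site, 0 = empty.
        grid = np.zeros((60, 60), dtype=int)
        grid[30, 30] = 1

        def step(grid):
            """Each occupied site divides into one randomly chosen empty 4-neighbour."""
            new = grid.copy()
            occupied = np.argwhere(grid == 1)
            rng.shuffle(occupied)
            for i, j in occupied:
                nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= i + di < grid.shape[0] and 0 <= j + dj < grid.shape[1]
                        and new[i + di, j + dj] == 0]
                if nbrs:
                    new[nbrs[rng.integers(len(nbrs))]] = 1
            return new

        for t in range(25):
            grid = step(grid)
            occupied = np.argwhere(grid == 1)
            # Contact-inhibited (non-growing) fraction: cells with no empty 4-neighbour.
            blocked = sum(
                all(not (0 <= i + di < grid.shape[0] and 0 <= j + dj < grid.shape[1])
                    or grid[i + di, j + dj] == 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                for i, j in occupied)
            if t % 5 == 4:
                print(f"t={t+1:2d}  cells={len(occupied):4d}  "
                      f"non-growing fraction={blocked/len(occupied):.2f}")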

  15. Optimizing the Machining Parameters for Minimum Surface Roughness in Turning of GFRP Composites Using Design of Experiments

    Institute of Scientific and Technical Information of China (English)

    K. Palanikumar; L.Karunamoorthy; R.Karthikeyan

    2004-01-01

    In recent years, glass fiber reinforced plastics (GFRP) have been used extensively in a variety of engineering applications in many different fields such as the aerospace, oil, gas and process industries. However, users of FRP face difficulties in machining it because of fiber delamination, fiber pull-out, short tool life, matrix debonding, burning and the formation of powder-like chips. The present investigation focuses on the optimization of machining parameters for the surface roughness of glass fiber reinforced plastics (GFRP) using design of experiments (DoE). The machining parameters considered were speed, feed, depth of cut and workpiece (fiber orientation). An attempt was made to analyse the influence of the factors and their interactions during machining. The results of the present study give the optimal combination of machining parameters, which will help meet the machining requirements of GFRP composites.

  16. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    Science.gov (United States)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that includes rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing the reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified 3 significant factors. It also indicated that one response variable was not required for the optimization process. A full factorial DOE was then carried out to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several solutions in order to obtain feasible optimal solutions. The GA
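
    A compact sketch of a genetic algorithm selecting rework-station locations, purely for illustration: the candidate positions, costs, benefits and single-number fitness are hypothetical stand-ins for the study's multi-objective evaluation.

        import numpy as np

        rng = np.random.default_rng(4)

        # Hypothetical line with 10 candidate rework-station positions. A chromosome is
        # a 0/1 vector: 1 means a rework station is installed after that process step.
        n_pos = 10
        station_cost = rng.uniform(1.0, 3.0, n_pos)       # installation + cycle-time cost
        rework_benefit = rng.uniform(0.5, 4.0, n_pos)     # expected scrap avoided

        def fitness(chrom):
            # Maximise benefit minus cost (stand-in for the multi-objective score).
            return float(np.sum(chrom * (rework_benefit - station_cost)))

        def evolve(pop_size=30, generations=60, p_mut=0.05):
            pop = rng.integers(0, 2, size=(pop_size, n_pos))
            for _ in range(generations):
                scores = np.array([fitness(c) for c in pop])
                def pick():                                  # tournament selection
                    i, j = rng.integers(pop_size, size=2)
                    return pop[i] if scores[i] >= scores[j] else pop[j]
                children = []
                for _ in range(pop_size):
                    a, b = pick(), pick()
                    cut = rng.integers(1, n_pos)             # one-point crossover
                    child = np.concatenate([a[:cut], b[cut:]])
                    flip = rng.random(n_pos) < p_mut         # bit-flip mutation
                    children.append(np.where(flip, 1 - child, child))
                pop = np.array(children)
            best = max(pop, key=fitness)
            return best, fitness(best)

        best, score = evolve()
        print("install stations at positions:", np.flatnonzero(best), "score:", round(score, 2))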

  17. Formulation, characterization and optimization of valsartan self-microemulsifying drug delivery system using statistical design of experiment.

    Science.gov (United States)

    Poudel, Bijay Kumar; Marasini, Nirmal; Tran, Tuan Hiep; Choi, Han-Gon; Yong, Chul Soon; Kim, Jong Oh

    2012-01-01

    The aim of the present research was to systematically investigate the main, interaction and quadratic effects of formulation variables on the performance of a self-microemulsifying drug delivery system (SMEDDS) of valsartan using design of experiment. A 17-run Box-Behnken design (BBD) with 3 factors and 3 levels, including 5 replicates at the centre point, was used for fitting a second-order response surface. After the preliminary screening, Labrafil M 2125 CS as oil, Tween 20 as surfactant and Capryol 90 as co-surfactant were taken as independent variables. The dependent factors (responses) were particle size, polydispersity index (PDI), dissolution after 15 min and equilibrium solubility. Coefficients were estimated by regression analysis and the model adequacy was checked by an F-test and the determination coefficient (R(2)). All the responses were optimized simultaneously by using a desirability function. Our results demonstrated marked main and interaction effects of the independent factors on the responses. The optimized formulation consisted of 26.8% (w/w) oil, 60.1% (w/w) surfactant and 13.1% (w/w) co-surfactant, and showed an average micelle size of 90.7 nm, a PDI of 0.246, 91.2% dissolution after 15 min and 226.7 mg/g equilibrium solubility. For the optimized formulation, the predicted and experimental values were in close agreement. After oral administration, the optimized formulation gave more than 2-fold higher area under the curve (AUC) and about 6-fold higher C(max) in rats than valsartan powder (p<0.05). The BBD facilitated a better understanding of the inherent relationship of the formulation variables with the responses and the optimization of the valsartan SMEDDS in a relatively time- and labor-effective manner.
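
    The 17-run, 3-factor Box-Behnken layout mentioned above is easy to construct explicitly; the sketch below builds it in coded units (the mapping of coded levels to actual oil/surfactant/co-surfactant amounts is not reproduced here).

        import itertools
        import numpy as np

        def box_behnken(k=3, n_center=5):
            """Box-Behnken design in coded units: for each pair of factors, a 2^2
            factorial at +/-1 with the remaining factors held at 0, plus centre runs."""
            runs = []
            for i, j in itertools.combinations(range(k), 2):
                for a, b in itertools.product([-1, 1], repeat=2):
                    row = [0] * k
                    row[i], row[j] = a, b
                    runs.append(row)
            runs += [[0] * k] * n_center
            return np.array(runs)

        design = box_behnken(3, n_center=5)
        print(design.shape)   # (17, 3): 12 edge-midpoint runs + 5 centre replicates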

  18. Development and Optimization of Polymeric Self-Emulsifying Nanocapsules for Localized Drug Delivery: Design of Experiment Approach

    Directory of Open Access Journals (Sweden)

    Jyoti Wadhwa

    2014-01-01

    The purpose of the present study was to formulate polymeric self-emulsifying curcumin nanocapsules with high encapsulation efficiency, good emulsification ability, and optimal globule size for localized targeting in the colon. Formulations were prepared using a modified quasiemulsion solvent diffusion method. The concentrations of the formulation variables, namely X1 (oil), X2 (polymeric emulsifier), and X3 (adsorbent), were optimized by design of experiments using a Box-Behnken design, for their impact on the mean globule size (Y1) and encapsulation efficiency (Y2) of the formulation. Polymeric nanocapsules with an average diameter of 100–180 nm and an encapsulation efficiency of 64.85 ± 0.12% were obtained. In vitro studies revealed that the formulations released the drug after a 5 h lag time, corresponding to the time to reach the colonic region. Pronounced localized action was inferred from the plasma concentration profile (Cmax 200 ng/mL), which indicates limited systemic absorption. A roentgenography study confirmed the localized presence of the carrier (0–2 h in the upper GIT; 2–4 h in the small intestine; and 4–24 h in the lower intestine). The optimized formulation showed significantly higher cytotoxicity (IC50 value 20.32 μM) in the HT 29 colonic cancer cell line. The present study demonstrates the systematic development of a polymeric self-emulsifying nanocapsule formulation of curcumin for localized targeting in the colon.

  19. DS-OPTIMAL DESIGNS FOR STUDYING COMBINATIONS OF CHEMICALS USING MULTIPLE FIXED-RATIO RAY EXPERIMENTS

    Science.gov (United States)

    Detecting and characterizing interactions among chemicals is an important environmental issue. Traditional factorial designs become infeasible as the number of compounds under study increases. Ray designs, which reduce the amount of experimental effort, can be...

  20. ATHENA optimized coating design

    DEFF Research Database (Denmark)

    Ferreira, Desiree Della Monica; Christensen, Finn Erland; Jakobsen, Anders Clemen

    2012-01-01

    The optimization of the coating design for the ATHENA mission is described and the possibility of increasing the telescope effective area in the range between 0.1 and 10 keV is investigated. An independent computation of the on-axis effective area based on the mirror design of ATHENA is performed in ...

  1. Optimization of reaction parameters for the electrochemical oxidation of lidocaine with a Design of Experiments approach

    NARCIS (Netherlands)

    Gul, Turan; Bischoff, Rainer; Permentier, Hjalmar

    2015-01-01

    Identification of potentially toxic oxidative drug metabolites is a crucial step in the development of new drugs. Electrochemical methods are useful to study oxidative drug metabolism, but are not widely used to synthesize metabolites for follow-up studies. Careful optimization of reaction parameter

  2. The Determination of the Optimal Material Proportion in Natural Fiber-Cement Composites Using Design of Mixture Experiments

    Directory of Open Access Journals (Sweden)

    Aramphongphun Chuckaphun

    2016-01-01

    This research aims to determine the optimal material proportion in a natural fiber-cement composite as an alternative to an asbestos fiber-cement composite, such that the materials cost is minimized while the properties still comply with the Thai Industrial Standard (TIS) for applications of profile sheet roof tiles. Two experimental sets were studied in this research. First, a three-component mixture of (i) virgin natural fiber, (ii) synthetic fiber and (iii) cement was studied while the proportion of calcium carbonate was kept constant. Second, an additional material, recycled natural fiber from recycled paper, was used in the mixture, and the resulting four-component mixture was studied. Constrained mixture design was applied to design the two experimental sets above. The experimental data were then analyzed to build the mixture model. In addition, the cost of each material was used to build the materials cost model. These two mathematical models were then employed to optimize the material proportion of the natural fiber-cement composites. In the three-component mixture, the optimal material proportion was found to be 3.14% virgin natural fiber, 1.20% synthetic fiber and 75.67% cement, while the materials cost was reduced by 12%. In the four-component mixture, the optimal material proportion was found to be 3.00% virgin natural fiber, 0.50% recycled natural fiber, 1.08% synthetic fiber and 75.42% cement, and the materials cost was reduced by 14%. The confirmation runs of 30 experiments were also analyzed statistically to verify the results.
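
    A minimal sketch of the final optimization step (minimize materials cost over component proportions that sum to one, subject to a property constraint): the unit costs, the linear "strength" model, the bounds and the property floor are all hypothetical; the study used its fitted mixture and cost models instead.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical unit costs and a made-up linear strength model for illustration:
        # components = [virgin fiber, synthetic fiber, cement, filler].
        cost = np.array([1.8, 6.0, 0.9, 0.3])          # cost per unit mass
        strength_coef = np.array([4.0, 9.0, 2.5, 0.5]) # contribution to a strength index
        min_strength = 2.5

        def total_cost(x):
            return float(cost @ x)

        constraints = [
            {"type": "eq", "fun": lambda x: np.sum(x) - 1.0},                     # proportions sum to 1
            {"type": "ineq", "fun": lambda x: strength_coef @ x - min_strength},  # property floor
        ]
        bounds = [(0.0, 0.05), (0.0, 0.03), (0.6, 0.9), (0.0, 0.3)]  # allowed proportion ranges

        x0 = np.array([0.05, 0.03, 0.9, 0.02])   # feasible starting proportions
        res = minimize(total_cost, x0, method="SLSQP", bounds=bounds, constraints=constraints)
        print("optimal proportions:", np.round(res.x, 4), "cost:", round(res.fun, 3))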

  3. Wind farm design optimization

    Energy Technology Data Exchange (ETDEWEB)

    Carreau, Michel; Morgenroth, Michael; Belashov, Oleg; Mdimagh, Asma; Hertz, Alain; Marcotte, Odile

    2010-09-15

    Innovative numerical computer tools have been developed to streamline the estimation and design process and to optimize the wind farm design with respect to the overall return on investment. The optimization engine can find the collector system layout automatically, which provides a powerful tool to quickly study various alternatives while taking into account, with precision, constraints or factors that previously would have been too costly to analyze in detail. Our wind farm tools have evolved through numerous projects and created value for our clients, yielding wind farm projects with higher projected returns.

  4. Wear performance optimization of stir cast Al-TiB2 metal matrix composites using Taguchi design of experiments

    Science.gov (United States)

    Poria, Suswagata; Sahoo, Prasanta; Sutradhar, Goutam

    2016-09-01

    The present study outlines the use of Taguchi parameter design to minimize the wear of Al-TiB2 metal matrix composites by optimizing the tribological process parameters. Different weight percentages of micro-TiB2 powders with average sizes of 5-40 micron are incorporated into a molten LM4 aluminium matrix by the stir casting method. The wear performance of the Al-TiB2 composites is evaluated in a block-on-roller type Multitribo tester at room temperature. Three parameters, viz. weight percentage of TiB2, load and speed, are considered at three levels each. An L27 orthogonal array is used to carry out the experiments, accommodating all the factors and their levels including their interaction effects. The optimal combination of parameters for wear performance is obtained by Taguchi analysis. Analysis of variance (ANOVA) is used to find the percentage contribution of each parameter, and of their interactions, to the wear performance. The weight percentage of TiB2 is found to be the most effective parameter in controlling the wear behaviour of the Al-TiB2 metal matrix composite.
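
    The Taguchi analysis referred to above typically ranks parameter settings by a signal-to-noise ratio; for a wear response, the "smaller is better" form applies. The sketch below computes it for hypothetical wear measurements (the numbers are illustrative, not the study's data).

        import numpy as np

        def sn_smaller_is_better(y):
            """Taguchi signal-to-noise ratio for a 'smaller is better' response
            (e.g. wear): S/N = -10 * log10(mean(y^2)), computed over repeat runs."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(y ** 2))

        # Hypothetical wear measurements (repeat runs) for three parameter settings.
        trials = {
            "wt% TiB2 = 5,  load = 50 N": [0.42, 0.45, 0.40],
            "wt% TiB2 = 10, load = 50 N": [0.31, 0.29, 0.33],
            "wt% TiB2 = 10, load = 100 N": [0.55, 0.58, 0.52],
        }
        for setting, wear in trials.items():
            print(f"{setting}:  S/N = {sn_smaller_is_better(wear):6.2f} dB")

        # The setting with the highest (least negative) S/N ratio is preferred.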

  5. Optimization of a surfactant free polyol method for the synthesis of platinum-cobalt electrocatalysts using Taguchi design of experiments

    Energy Technology Data Exchange (ETDEWEB)

    Grolleau, C. [Laboratoire de Catalyse en Chimie Organique (LACCO), Universite de Poitiers, 40 av Recteur Pineau, F-86000 Poitiers (France); ST Microelectronics Tours, rue Pierre et Marie Curie, F-37100 Tours (France); Coutanceau, C.; Leger, J.-M. [Laboratoire de Catalyse en Chimie Organique (LACCO), Universite de Poitiers, 40 av Recteur Pineau, F-86000 Poitiers (France); Pierre, F. [ST Microelectronics Tours, rue Pierre et Marie Curie, F-37100 Tours (France)

    2010-03-15

    A design of experiments (derived from the Taguchi method) was implemented to optimize the experimental conditions of a surfactant-free polyol method for the synthesis of PtCo electrocatalysts. The responses considered were the active surface area and the catalytic activity toward the oxygen reduction reaction. Metallic salt concentration, pH, temperature ramp, addition order of reactants and the particle cleaning step were chosen as the main parameters according to considerations from the literature and previous experiments. Matrix models describing the behaviour of the synthesis system were elaborated, taking into account the effects of each considered parameter and their interactions. From this model, an optimized PtCo/C catalyst, in terms of active surface area and activity towards the oxygen reduction reaction, was synthesized. Both the measured active surface area and the electrocatalytic activity are in very good agreement with the values calculated from the matrix model. Furthermore, the effects of the parameters and the interactions between parameters can be better understood using this method. (author)

  6. Optimizing detection of RDX vapors using designed experiments for remote sensing

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Robert G.; Heredia-Langner, Alejandro; Warner, Marvin G.

    2014-03-24

    This paper presents the results of experiments performed to study the effect of four factors on the detection of RDX vapors by desorption into an atmospheric flow tube mass spectrometer (AFT-MS). The experiments initially included four independent factors: gas flow rate, desorption current, solvent evaporation time and RDX mass. The values of three detection responses (peak height, peak width, and peak area) were recorded, but only the peak height response was analyzed. Results from the first block of experiments indicated that solvent evaporation time was not statistically significant. A second round of experiments was performed using flow rate, current, and RDX mass as factors, and the results were used to create a model to predict the conditions resulting in maximum peak height. Those conditions were confirmed experimentally and used to obtain data for a calibration model. The calibration model represented RDX amounts ranging from 1 to 25 pg desorbed into an air flow of 7 L/min. Air samples from a shipping container that held 2 closed explosive storage magazines were collected on metal filaments for varying amounts of time, ranging from 5 to 90 minutes. RDX was detected from all of the filaments sampled by desorption into the AFT-MS. From the calibration model, RDX vapor concentrations within the shipping container were calculated to be in the range of 1 to 50 parts-per-quadrillion from data collected on 2 separate days.

  7. Application of maximum entropy optimal projection design synthesis to the NASA Spacecraft Control Laboratory Experiment (SCOLE)

    Science.gov (United States)

    Hyland, Dave; Davis, Larry

    1984-01-01

    The scope of this study covered steady-state, continuous-time vibration control under disturbances applied to the Space Shuttle, with continuous-time models of actuators, sensors, and disturbances. The focus was on a clear illustration of the methodology; therefore sensor/actuator dynamics were initially ignored, and a finite element model of the NASA Spacecraft Control Laboratory Experiment (SCOLE) was constructed, including products of inertia and the offset of the reflector center of mass from the mast tip.

  8. Optimization of ciprofloxacin complex loaded PLGA nanoparticles for pulmonary treatment of cystic fibrosis infections: Design of experiments approach.

    Science.gov (United States)

    Günday Türeli, Nazende; Türeli, Akif Emre; Schneider, Marc

    2016-12-30

    Design of Experiments (DoE) is a powerful tool for the systematic evaluation of process parameters' effects on nanoparticle (NP) quality with a minimum number of experiments. DoE was employed for the optimization of ciprofloxacin-loaded PLGA NPs for pulmonary delivery against Pseudomonas aeruginosa infections in cystic fibrosis (CF) lungs. Since the biofilm produced by the bacteria was shown to be a complicated 3D barrier with heterogeneous meshes ranging from 100 nm to 500 nm, nanoformulations small enough to travel through those channels were assigned as the target quality. Nanoprecipitation was realized utilizing MicroJet Reactor (MJR) technology based on the impinging-jets principle. The effect of the MJR parameters (flow rate, temperature and gas pressure) on particle size and PDI was investigated using a Box-Behnken design. The relationship between process parameters and particle quality was demonstrated by the constructed fit functions (R(2)=0.9934 p65%. Response surface plots provided an experimental data-based understanding of the effect of the MJR parameters, and thus of NP quality. The presented work enables ciprofloxacin-loaded PLGA nanoparticle preparations with pre-defined quality to fulfill the requirements of local drug delivery under CF disease conditions.

  9. Optimization of Electrospray Ionization by Statistical Design of Experiments and Response Surface Methodology: Protein-Ligand Equilibrium Dissociation Constant Determinations

    Science.gov (United States)

    Pedro, Liliana; Van Voorhis, Wesley C.; Quinn, Ronald J.

    2016-09-01

    Electrospray ionization mass spectrometry (ESI-MS) binding studies between proteins and ligands under native conditions require that instrumental ESI source conditions are optimized if relative solution-phase equilibrium concentrations between the protein-ligand complex and free protein are to be retained. Instrumental ESI source conditions that simultaneously maximize the relative ionization efficiency of the protein-ligand complex over free protein and minimize the protein-ligand complex dissociation during the ESI process and the transfer from atmospheric pressure to vacuum are generally specific for each protein-ligand system and should be established when an accurate equilibrium dissociation constant (KD) is to be determined via titration. In this paper, a straightforward and systematic approach for ESI source optimization is presented. The method uses statistical design of experiments (DOE) in conjunction with response surface methodology (RSM) and is demonstrated for the complexes between Plasmodium vivax guanylate kinase (PvGK) and two ligands: 5'-guanosine monophosphate (GMP) and 5'-guanosine diphosphate (GDP). It was verified that even though the ligands are structurally similar, the most appropriate ESI conditions for KD determination by titration are different for each.

  10. Optimization of operating parameters for efficient photocatalytic inactivation of Escherichia coli based on a statistical design of experiments.

    Science.gov (United States)

    Feilizadeh, Mehrzad; Alemzadeh, Iran; Delparish, Amin; Estahbanati, M R Karimi; Soleimani, Mahdi; Jangjou, Yasser; Vosoughi, Amin

    2015-01-01

    In this work, the individual and interaction effects of three key operating parameters of the photocatalytic disinfection process were evaluated and optimized using response surface methodology (RSM) for the first time. The chosen operating parameters were: reaction temperature, initial pH of the reaction mixture and TiO2 P-25 photocatalyst loading. Escherichia coli concentration, after 90 minutes of UV-A irradiation, was selected as the response. Twenty sets of photocatalytic disinfection experiments were conducted by adjusting the operating parameters at five levels using a central composite design. Based on the experimental data, a semi-empirical expression was established and applied to predict the response. Analysis of variance revealed a strong correlation between the predicted and experimental values of the response. The optimum values of the reaction temperature, initial pH of the reaction mixture and photocatalyst loading were found to be 40.3 °C, 5.9, and 1.0 g/L, respectively. Under the optimized conditions, the E. coli concentration was observed to reduce from 10(7) to about 11 CFU/mL during the photocatalytic process. Moreover, all these results showed the great significance of RSM in developing high-performance processes for photocatalytic water disinfection.

  11. Design and sampling plan optimization for RT-qPCR experiments in plants: a case study in blueberry

    Directory of Open Access Journals (Sweden)

    Jose V Die

    2016-03-01

    The qPCR assay has become a routine technology in plant biotechnology and agricultural research. It is unlikely to be technically improved, but there are still challenges which center around minimizing the variability in results and transparency when reporting technical data in support of the conclusions of a study. There are a number of aspects of the pre- and post-assay workflow that contribute to variability of results. Here, through the study of the introduction of error in qPCR measurements at different stages of the workflow, we describe the most important causes of technical variability in a case study using blueberry. In this study, we found that the stage for which increasing the number of replicates would be the most beneficial depends on the tissue used. For example, we would recommend the use of more RT replicates when working with leaf tissue, while the use of more sampling (RNA extraction) replicates would be recommended when working with stems or fruits to obtain the most optimal results. The use of more qPCR replicates provides the least benefit as it is the most reproducible step. By knowing the distribution of error over an entire experiment and the costs at each step, we have developed a script to identify the optimal sampling plan within the limits of a given budget. These findings should help plant scientists improve the design of qPCR experiments and refine their laboratory practices in order to conduct qPCR assays in a more reliable manner to produce more consistent and reproducible data.

  12. Design and Sampling Plan Optimization for RT-qPCR Experiments in Plants: A Case Study in Blueberry.

    Science.gov (United States)

    Die, Jose V; Roman, Belen; Flores, Fernando; Rowland, Lisa J

    2016-01-01

    The qPCR assay has become a routine technology in plant biotechnology and agricultural research. It is unlikely to be technically improved, but there are still challenges which center around minimizing the variability in results and transparency when reporting technical data in support of the conclusions of a study. There are a number of aspects of the pre- and post-assay workflow that contribute to variability of results. Here, through the study of the introduction of error in qPCR measurements at different stages of the workflow, we describe the most important causes of technical variability in a case study using blueberry. In this study, we found that the stage for which increasing the number of replicates would be the most beneficial depends on the tissue used. For example, we would recommend the use of more RT replicates when working with leaf tissue, while the use of more sampling (RNA extraction) replicates would be recommended when working with stems or fruits to obtain the most optimal results. The use of more qPCR replicates provides the least benefit as it is the most reproducible step. By knowing the distribution of error over an entire experiment and the costs at each step, we have developed a script to identify the optimal sampling plan within the limits of a given budget. These findings should help plant scientists improve the design of qPCR experiments and refine their laboratory practices in order to conduct qPCR assays in a more reliable manner to produce more consistent and reproducible data.
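
    A rough sketch of the kind of budget-constrained sampling-plan search described above (not the authors' script): given hypothetical variance components and per-replicate costs for the nested sampling/RT/qPCR steps, it enumerates replicate allocations within a budget and picks the one that minimizes the variance of the experiment mean.

        import itertools

        # Hypothetical variance components (from a pilot study) and per-replicate costs.
        var_sampling, var_rt, var_qpcr = 0.40, 0.15, 0.02   # RNA extraction, RT, qPCR steps
        cost_sampling, cost_rt, cost_qpcr = 10.0, 4.0, 1.0
        budget = 120.0

        def variance_of_mean(ns, nrt, nq):
            """Variance of the experiment mean for a nested design with ns biological
            samples, nrt RT replicates per sample and nq qPCR replicates per RT."""
            return var_sampling / ns + var_rt / (ns * nrt) + var_qpcr / (ns * nrt * nq)

        def total_cost(ns, nrt, nq):
            return ns * (cost_sampling + nrt * (cost_rt + nq * cost_qpcr))

        best = None
        for ns, nrt, nq in itertools.product(range(1, 13), range(1, 7), range(1, 7)):
            if total_cost(ns, nrt, nq) <= budget:
                v = variance_of_mean(ns, nrt, nq)
                if best is None or v < best[0]:
                    best = (v, ns, nrt, nq)

        v, ns, nrt, nq = best
        print(f"best plan within budget: {ns} samples x {nrt} RT x {nq} qPCR "
              f"(variance {v:.4f}, cost {total_cost(ns, nrt, nq):.0f})")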

  13. Mechanical Design Optimization Using Advanced Optimization Techniques

    CERN Document Server

    Rao, R Venkata

    2012-01-01

    Mechanical design includes an optimization process in which designers always consider objectives such as strength, deflection, weight, wear, corrosion, etc., depending on the requirements. However, design optimization for a complete mechanical assembly leads to a complicated objective function with a large number of design variables. It is good practice to apply optimization techniques to individual components or intermediate assemblies rather than to a complete assembly. Analytical or numerical methods for calculating the extreme values of a function may perform well in many practical cases, but may fail in more complex design situations. In real design problems, the number of design parameters can be very large and their influence on the value to be optimized (the goal function) can be very complicated and nonlinear in character. In these complex cases, advanced optimization algorithms offer solutions to the problems, because they find a solution near to the global optimum within reasonable time and computational ...

  14. ATHENA optimized coating design

    DEFF Research Database (Denmark)

    Ferreira, Desiree Della Monica; Christensen, Finn Erland; Jakobsen, Anders Clemen

    2012-01-01

    The optimization of the coating design for the ATHENA mission is described and the possibility of increasing the telescope effective area in the range between 0.1 and 10 keV is investigated. An independent computation of the on-axis effective area based on the mirror design of ATHENA is performed. Results for the baseline, including on- and off-axis effective area curves, are presented. We find that the use of linearly graded multilayers can increase the integrated effective area of ATHENA by 37% in the energy range between 0.1 keV and 15 keV. © (2012) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE)

  15. Optimization methods in structural design

    CERN Document Server

    Rothwell, Alan

    2017-01-01

    This book offers an introduction to numerical optimization methods in structural design. Employing a readily accessible and compact format, the book presents an overview of optimization methods, and equips readers to properly set up optimization problems and interpret the results. A ‘how-to-do-it’ approach is followed throughout, with less emphasis at this stage on mathematical derivations. The book features spreadsheet programs provided in Microsoft Excel, which allow readers to experience optimization ‘hands-on.’ Examples covered include truss structures, columns, beams, reinforced shell structures, stiffened panels and composite laminates. For the last three, a review of relevant analysis methods is included. Exercises, with solutions where appropriate, are also included with each chapter. The book offers a valuable resource for engineering students at the upper undergraduate and postgraduate level, as well as others in the industry and elsewhere who are new to these highly practical techniques. Whi...

  16. Sequential and simultaneous statistical optimization by dynamic design of experiment for peptide overexpression in recombinant Escherichia coli.

    Science.gov (United States)

    Lee, Kwang-Min; Rhee, Chang-Hoon; Kang, Choong-Kyung; Kim, Jung-Hoe

    2006-10-01

    The production of the recombinant anti-HIV peptide T-20 in Escherichia coli was optimized by statistical experimental designs (successive multifactor designs) comprising a 2(4-1) fractional factorial, a 2(3) full factorial, and a 2(2) rotational central composite design, applied in that order. The effects of media composition (glucose, NPK sources, MgSO4, and trace elements), induction level, induction timing (optical density at induction) and induction duration (culture time after induction) on T-20 production were studied using a statistical response surface method. A series of iterative experimental designs was employed to determine the optimal fermentation conditions (media and process factors). Optimal ranges, characterized by %T-20 (proportion of peptide to total cell protein), were observed, narrowed down, and further investigated to determine the optimal combination of culture conditions, which was as follows: 9, 6, 10, and 1 mL of glucose, NPK sources, MgSO4, and trace elements, respectively, in a total of 100 mL of medium, induced at an OD of 0.55-0.75 with 0.7 mM isopropyl-beta-D-thiogalactopyranoside for an induction duration of 4 h. Under these conditions, a T-20 content of up to 14% was obtained. This statistical optimization allowed the production of T-20 to be increased more than twofold (from 6 to 14%) within a shorter induction duration (from 6 to 4 h) at the shake-flask scale.
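
    The 2(4-1) fractional factorial used as the first screening step can be written down directly: a full 2^3 design in three factors with the fourth factor aliased to the three-factor interaction (the standard resolution IV generator D = ABC). The sketch below builds it in coded units; the assignment of factors to columns is illustrative.

        import itertools
        import numpy as np

        # 2^(4-1) fractional factorial in coded units: full 2^3 design in A, B, C,
        # with the fourth factor aliased to the three-factor interaction (D = ABC).
        base = np.array(list(itertools.product([-1, 1], repeat=3)))
        D = np.prod(base, axis=1, keepdims=True)
        design = np.hstack([base, D])

        print(design)          # 8 runs, 4 factors, resolution IV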

  17. Design and optimization of a chromatographic purification process for Streptococcus pneumoniae serotype 23F capsular polysaccharide by a Design of Experiments approach.

    Science.gov (United States)

    Ji, Yu; Tian, Yang; Ahnfelt, Mattias; Sui, Lili

    2014-06-27

    Multivalent pneumococcal vaccines are used worldwide to protect human beings from pneumococcal diseases. In order to eliminate the toxic organic solvents used in the traditional vaccine purification process, an alternative chromatographic process for Streptococcus pneumoniae serotype 23F capsular polysaccharide (CPS) was proposed in this study. The strategy of Design of Experiments (DoE) was introduced into the process development to handle the complicated design procedure. An initial process analysis was performed to review the whole flowchart, identify the critical factors of the chromatography through FMEA and choose the flow-through mode based on the properties of the feed. A resin screening study then followed to select candidate resins. DoE was utilized to generate a resolution IV fractional factorial design to further compare candidates and narrow down the design space. After Capto Adhere was selected, a Box-Behnken DoE was executed to model the process and characterize all effects of the factors on the responses. Finally, Monte Carlo simulation was used to optimize the process, test the chosen optimal conditions and define the control limits. The results of three scale-up runs at the set points verified the DoE and simulation predictions. The final results were well in accordance with the EU pharmacopeia requirements: Protein/CPS (w/w) 1.08%; DNA/CPS (w/w) 0.61%; phosphorus content 3.1%; nitrogen content 0.315%; and methyl-pentose percentage 47.9%. Other tests of the final pure CPS also met the pharmacopeia specifications. This alternative chromatographic purification process for pneumococcal vaccine without toxic organic solvents was successfully developed by the DoE approach and proved to be scalable, robust and suitable for large-scale manufacturing.

  18. Optimal crossover designs for the proportional model

    OpenAIRE

    Zheng, Wei

    2013-01-01

    In crossover design experiments, the proportional model, where the carryover effects are proportional to their direct treatment effects, has drawn attention in recent years. We discover that the universally optimal design under the traditional model is the E-optimal design under the proportional model. Moreover, we establish equivalence theorems of the Kiefer-Wolfowitz type for four popular optimality criteria, namely A, D, E and T (trace).

  19. OPTIMAL NETWORK TOPOLOGY DESIGN

    Science.gov (United States)

    Yuen, J. H.

    1994-01-01

    This program was developed as part of a research study on the topology design and performance analysis for the Space Station Information System (SSIS) network. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. It is intended that this new design technique consider all important performance measures explicitly and take into account the constraints due to various technical feasibilities. In the current program, technical constraints are taken care of by the user properly forming the starting set of candidate components (e.g. nonfeasible links are not included). As subsets are generated, they are tested to see if they form an acceptable network by checking that all requirements are satisfied. Thus the first acceptable subset encountered gives the cost-optimal topology satisfying all given constraints. The user must sort the set of "feasible" link elements in increasing order of their costs. The program prompts the user for the following information for each link: 1) cost, 2) connectivity (number of stations connected by the link), and 3) the stations connected by that link. Unless instructed to stop, the program generates all possible acceptable networks in increasing order of their total costs. The program is written only to generate topologies that are simply connected. Tests on reliability, delay, and other performance measures are discussed in the documentation, but have not been incorporated into the program. This program is written in PASCAL for interactive execution and has been implemented on an IBM PC series computer operating under PC DOS. The disk contains source code only. This program was developed in 1985.
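
    A simplified Python sketch of the enumeration idea described above (not the original PASCAL program): candidate link subsets are considered in increasing order of total cost, and the first subset that connects all stations is the cost-optimal topology. The link costs and stations are hypothetical, and the exhaustive enumeration is only practical for small component sets.

        import itertools

        # Hypothetical candidate links: (cost, (station_a, station_b)).
        links = [
            (1.0, ("A", "B")),
            (1.5, ("B", "C")),
            (2.0, ("A", "C")),
            (2.5, ("C", "D")),
            (3.0, ("B", "D")),
        ]
        stations = {"A", "B", "C", "D"}

        def connected(subset):
            """Check that the chosen links connect all stations (simple flood fill)."""
            if not subset:
                return False
            reached = {subset[0][1][0]}
            changed = True
            while changed:
                changed = False
                for _, (a, b) in subset:
                    if (a in reached) != (b in reached):
                        reached |= {a, b}
                        changed = True
            return reached == stations

        # Enumerate all subsets in increasing order of total cost; the first acceptable
        # subset encountered is the cost-optimal topology.
        subsets = []
        for r in range(1, len(links) + 1):
            subsets += list(itertools.combinations(links, r))
        subsets.sort(key=lambda s: sum(c for c, _ in s))

        best = next(s for s in subsets if connected(s))
        print("optimal links:", [pair for _, pair in best], "cost:", sum(c for c, _ in best))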

  20. A fault-tolerant triple-redundant voice coil motor for direct drive valves: Design, optimization, and experiment

    Institute of Scientific and Technical Information of China (English)

    Wu Shuai; Jiao Zongxia; Yan Liang; Yu Juntao; Chen Chin-Yin

    2013-01-01

    A direct drive actuator (DDA) with direct drive valves (DDVs) as the control device is an ideal solution for a flight actuation system. This paper presents a novel triple-redundant voice coil motor (TRVCM) used for redundant DDVs. The TRVCM features electrical/mechanical hybrid triple redundancy by securing three stators along with three moving coils in the same frame. A permanent magnet (PM) Halbach array is employed in each redundant VCM to simplify the system structure. A back-to-back design between neighbouring redundancies is adopted to decouple the magnetic flux linkage. The particle swarm optimization (PSO) method is implemented to optimize the design parameters based on an analytical magnetic circuit model. The optimization objective function is defined as the acceleration capacity of the motor to achieve high dynamic performance. The optimal geometric parameters are verified with 3D magnetic field finite element analysis (FEA). A research prototype has been developed for experimental purposes. The experimental results of magnetic field density and force output show that the proposed TRVCM has great potential for application in DDA systems.
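
    A minimal particle swarm optimization sketch, assuming a placeholder "acceleration" objective over two hypothetical geometric parameters; the paper's analytical magnetic circuit model and actual design variables are not reproduced here.

```python
# Hedged PSO sketch: maximize a placeholder "acceleration" objective over two
# hypothetical geometric parameters. The real objective in the paper is the
# motor's acceleration capacity from an analytical magnetic circuit model.
import random

def acceleration(x):  # placeholder objective, NOT the paper's model
    g, w = x
    return -(g - 1.2) ** 2 - (w - 8.0) ** 2  # toy peak at g = 1.2, w = 8.0

bounds = [(0.5, 3.0), (4.0, 12.0)]           # hypothetical parameter bounds
n_particles, n_iter, w_in, c1, c2 = 20, 100, 0.7, 1.5, 1.5

pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = max(pbest, key=acceleration)

for _ in range(n_iter):
    for i in range(n_particles):
        for d, (lo, hi) in enumerate(bounds):
            vel[i][d] = (w_in * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
        if acceleration(pos[i]) > acceleration(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = max(pbest, key=acceleration)

print("best parameters:", gbest, "objective:", acceleration(gbest))
```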

  1. Theory of second optimization for scan experiment

    CERN Document Server

    Mo, X H

    2015-01-01

    The optimal design of a scan experiment is of great significance both for scientific research and from an economic viewpoint. Two approaches, one having recourse to a sampling technique and the other resorting to analytical proof, are adopted to work out the optimized scan scheme for the relevant parameters. The final results indicate that for an $n$-parameter scan experiment, $n$ energy points are necessary and sufficient for the optimal determination of these $n$ parameters; each optimal position can be acquired by a single-parameter scan (sampling method) or by analysis of an auxiliary function (analytic method); and the luminosity allocation among the points can be determined analytically according to the relative importance of the parameters. By virtue of the second optimization theory established in this paper, it is feasible to construct the optimal scheme for any scan experiment.

  2. Integrated controls design optimization

    Science.gov (United States)

    Lou, Xinsheng; Neuschaefer, Carl H.

    2015-09-01

    A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225) and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant, while others are related to the cost of the plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.

  3. Optimized design for PIGMI

    Energy Technology Data Exchange (ETDEWEB)

    Hansborough, L.; Hamm, R.; Stovall, J.; Swenson, D.

    1980-01-01

    PIGMI (Pion Generator for Medical Irradiations) is a compact linear proton accelerator design, optimized for pion production and cancer treatment use in a hospital environment. Technology developed during a four-year PIGMI prototype experimental program allows the design of smaller, less expensive, and more reliable proton linacs. A new type of low-energy accelerating structure, the radio-frequency quadrupole (RFQ), has been tested; it produces an exceptionally good-quality beam and allows the use of a simple 30-kV injector. Average axial electric-field gradients of over 9 MV/m have been demonstrated in a drift-tube linac (DTL) structure. Experimental work is underway to test the disk-and-washer (DAW) structure, another new type of accelerating structure for use in the high-energy coupled-cavity linac (CCL). Sufficient experimental and developmental progress has been made to closely define an actual PIGMI. It will consist of a 30-kV injector, an RFQ linac to a proton energy of 2.5 MeV, a DTL linac to 125 MeV, and a CCL linac to the final energy of 650 MeV. The total length of the accelerator is 133 meters. The RFQ and DTL will be driven by a single 440-MHz klystron; the CCL will be driven by six 1320-MHz klystrons. The peak beam current is 28 mA. The beam pulse length is 60 μs at a 60-Hz repetition rate, resulting in a 100-μA average beam current. The total cost of the accelerator is estimated to be approximately $10 million.
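
    The quoted 100-μA average current follows directly from the peak current and the pulse duty cycle; a quick check of that arithmetic, using only the figures given in the abstract:

```python
# Average beam current = peak current x pulse length x repetition rate
peak_current = 28e-3      # A  (28 mA)
pulse_length = 60e-6      # s  (60 microseconds)
rep_rate     = 60.0       # Hz
avg_current = peak_current * pulse_length * rep_rate
print(avg_current * 1e6, "microamps")   # ~100.8 uA, consistent with the ~100-uA figure
```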

  4. Optimization Design for Digital Binoculars

    Institute of Scientific and Technical Information of China (English)

    CEN Jun-bo; CHEN Wei-min; LI Hui; HUANG Shang-lian

    2005-01-01

    In order to develop competitive digital binoculars with a high performance/cost ratio, the design scheme should be optimized in terms of technical capacity, economic benefit, product performance, risk management, etc. The common optimization method is limited to qualitative analysis, while the parameter optimization method is limited to obtaining optimal parameters from the technical side only; each method has its limitations. Based on the analysis of digital binoculars parameters, candidate design schemes are laid down. The analytic hierarchy process combines qualitative analysis with quantitative analysis. The design schemes are optimized and the result is worked out.

  5. An approach to optimize sample preparation for MALDI imaging MS of FFPE sections using fractional factorial design of experiments.

    Science.gov (United States)

    Oetjen, Janina; Lachmund, Delf; Palmer, Andrew; Alexandrov, Theodore; Becker, Michael; Boskamp, Tobias; Maass, Peter

    2016-09-01

    A standardized workflow for matrix-assisted laser desorption/ionization imaging mass spectrometry (MALDI imaging MS) is a prerequisite for the routine use of this promising technology in clinical applications. We present an approach to develop standard operating procedures for MALDI imaging MS sample preparation of formalin-fixed and paraffin-embedded (FFPE) tissue sections based on a novel quantitative measure of dataset quality. To cover many parts of the complex workflow and simultaneously test several parameters, experiments were planned according to a fractional factorial design of experiments (DoE). The effect of ten different experimental parameters was investigated in two distinct DoE sets, each consisting of eight experiments. FFPE rat brain sections were used as standard material because of their low biological variance. The mean peak intensity and a recently proposed spatial complexity measure were calculated for a list of 26 predefined peptides obtained by in silico digestion of five different proteins and served as quality criteria. A five-way analysis of variance (ANOVA) was applied to the final scores to retrieve a ranking of experiment parameters with increasing impact on data variance. Graphical abstract: MALDI imaging experiments were planned according to a fractional factorial design of experiments for the parameters under study. Selected peptide images were evaluated by the chosen quality metric (structure and intensity for a given peak list), and the calculated values were used as an input for the ANOVA. The parameters with the highest impact on the quality were deduced and SOPs recommended.
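
    The ranking step can be sketched as an ANOVA on a per-run quality score. The factor names, levels and scores below are invented placeholders (the study used ten parameters in two fractional factorial sets), so this is only an illustration of the analysis, not the study's data.

```python
# Hedged sketch: rank DoE factors by ANOVA on a per-run quality score.
# Factor names, levels and scores below are invented placeholders.
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

runs = pd.DataFrame({
    "matrix":    ["A", "A", "B", "B", "A", "A", "B", "B"],
    "washing":   ["x", "y", "x", "y", "x", "y", "x", "y"],
    "digestion": ["s", "s", "l", "l", "l", "l", "s", "s"],
    "score":     [0.62, 0.71, 0.55, 0.68, 0.59, 0.74, 0.52, 0.66],
})

model = ols("score ~ C(matrix) + C(washing) + C(digestion)", data=runs).fit()
table = anova_lm(model)                         # one row per factor
print(table.sort_values("F", ascending=False))  # larger F => bigger impact on variance
```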

  6. A design-of-experiments approach for the optimization and understanding of the cross-metathesis reaction of methyl ricinoleate with methyl acrylate.

    Science.gov (United States)

    Ho, Thao T T; Jacobs, Tina; Meier, Michael A R

    2009-01-01

    A design-of-experiments approach for the investigation of the cross-metathesis of methyl ricinoleate with methyl acrylate is described. Two second-generation metathesis initiators were studied using different reaction conditions, revealing optimal reaction conditions for each catalyst. Interestingly, the two catalysts showed completely different temperature response profiles. As a result of these investigations, suitable reaction conditions for the sustainable production of two value-added chemical intermediates were derived. Moreover, the design-of-experiments approach provided valuable information for a thorough understanding of catalytic reactions that would be more difficult to obtain by classic approaches.

  7. RELIABILITY AND DESIGN OF EXPERIMENTS

    Directory of Open Access Journals (Sweden)

    Adrian Stere PARIS

    2013-05-01

    Full Text Available Mechanical reliability engineering uses many statistical tools to find the factors of influence and their levels in the optimization of parameters on the basis of experimental data. Design of Experiments (DOE) techniques enable designers to determine simultaneously the individual and interactive effects of many factors that could affect the output results in any design. The state of the art in the domain implies extended use of software and a basic mathematical knowledge, mainly applying ANOVA and the regression analysis of experimental data.

  8. Optimal design for nonlinear response models

    CERN Document Server

    Fedorov, Valerii V

    2013-01-01

    Optimal Design for Nonlinear Response Models discusses the theory and applications of model-based experimental design with a strong emphasis on biopharmaceutical studies. The book draws on the authors' many years of experience in academia and the pharmaceutical industry. While the focus is on nonlinear models, the book begins with an explanation of the key ideas, using linear models as examples. Applying the linearization in the parameter space, it then covers nonlinear models and locally optimal designs as well as minimax, optimal on average, and Bayesian designs. The authors also discuss ada

  9. Satisfactory Optimization Design of IIR Digital Filters

    Institute of Scientific and Technical Information of China (English)

    Jin Weidong; Zhang Gexiang; Zhao Duo

    2005-01-01

    A new method called the satisfactory optimization method is proposed to design IIR (Infinite Impulse Response) digital filters, and the satisfactory optimization model is presented. The detailed algorithm for designing IIR digital filters using the satisfactory optimization method is described. By using a quantum genetic algorithm, characterized by rapid convergence and good global search capability, satisfying solutions are achieved in experiments on designing lowpass and bandpass IIR digital filters. Experimental results show that the performances of IIR filters designed by the introduced method are better than those designed by traditional methods.

  10. The optimization design of an experiment platform for gear train design

    Institute of Scientific and Technical Information of China (English)

    孙孟琴; 李水清; 武向鹏; 许云成

    2014-01-01

    This paper presents a new type of detachable gear train experimental platform. It can assemble many types of gear trains (fixed-axis gear trains, differential gear trains, planetary gear trains, mixed gear trains, gear transmissions and so on), clearly showing their structure. The design and calculation of the gear parameters have been optimized, so the experimental platform meets the requirements of good stability and reliability. It improves students' practical, observational and analytical abilities and satisfies the need to strengthen experimental teaching practice in universities.

  11. New approach to optimize near-infrared spectra with design of experiments and determination of milk compounds as influence factors for changing milk over time.

    Science.gov (United States)

    De Benedictis, Lorenzo; Huck, Christian

    2016-12-01

    The optimization of near-infrared spectroscopic parameters was realized via design of experiments. With this new approach, objectivity can be integrated into conventional, rather subjective approaches. The investigated factors were layer thickness, number of scans and temperature during measurement. Response variables in the full factorial design consisted of absorption intensity, signal-to-noise ratio and reproducibility of the spectra. The optimized factorial combination was found to be 0.5 mm layer thickness, 64 scans and 25 °C ambient temperature for liquid milk measurements. Qualitative analysis of milk indicated a strong influence of environmental factors, as well as the feeding of the cattle, on the change in milk composition over time. This was illustrated with the aid of near-infrared spectroscopy and the previously optimized parameters by detecting altered fatty acids in milk, in particular changes in fatty acid content (number of carboxylic functions) and fatty acid chain length.
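
    A minimal sketch of the full factorial layout described here, with main effects estimated as high-minus-low mean responses; the response function is a placeholder, not measured spectral data.

```python
# Hedged sketch: 2^3 full factorial for layer thickness, number of scans and
# temperature, with main effects estimated as high-minus-low mean responses.
# Response values are invented placeholders, not measured spectra metrics.
from itertools import product

factors = {"thickness_mm": (0.5, 2.0), "scans": (16, 64), "temp_C": (25, 40)}
design = list(product(*factors.values()))            # 8 runs

def response(run):                                    # placeholder "signal-to-noise"
    thickness, scans, temp = run
    return 100 - 20 * thickness + 0.2 * scans - 0.5 * temp

effects = {}
for i, name in enumerate(factors):
    hi = [response(r) for r in design if r[i] == factors[name][1]]
    lo = [response(r) for r in design if r[i] == factors[name][0]]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: main effect = {eff:+.1f}")
```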

  12. Chitosan polyplex nanoparticle vector for miR-145 expression in MCF-7: Optimization by design of experiment.

    Science.gov (United States)

    Tekie, Farnaz Sadat Mirzazadeh; Atyabi, Fatemeh; Soleimani, Masoud; Arefian, Ehsan; Atashi, Amir; Kiani, Melika; Khoshayand, Mohammad Reza; Amini, Mohsen; Dinarvand, Rassoul

    2015-11-01

    miR-145, a tumor suppressor microRNA (miRNA), is downregulated in cancer and can be introduced as a therapeutic agent in various cancers, including breast cancer. In this study, a miR-145 plasmid was transfected into MCF-7 cells using chitosan polyplex nanoparticles. The vector was prepared according to an optimized fabrication method determined by response surface analysis and a D-optimal design. The effects of chitosan molecular weight (Mw) and the polymer amine to DNA phosphate ratio (N/P) as the variables were investigated on the size, zeta potential, stability, and transfection efficiency of the polyplex nanoparticles. The results indicated that there is an interaction between the effects of Mw and N/P ratio on the size of the nanoparticles. A gel retardation assay demonstrated that the stability of the complexes in serum and in the preparation medium during storage depends on the formulation variables. Statistical analysis affirmed that, in contrast to particle size, the N/P ratio, incubation time, and zeta potential affect the gene transfection. In conclusion, by selecting the best formulation prepared through an optimized method, it is possible to achieve a high transfection efficacy for miR-145 as an anticancer biological macromolecule.

  13. Multidisciplinary design using collaborative optimization

    Science.gov (United States)

    Sobieski, Ian Patrick

    Management of the modern aircraft design process is a substantial challenge. Formal iterative optimization is commonly used with disciplinary design tools to aid designers in the definition of optimal subsystems. However, the expense of executing high fidelity analyses, the decomposition of the design expertise into disciplines, and the size of the design space often preclude the use of direct optimization in the overall design process. Collaborative optimization is a recently developed methodology that shows promise in enabling formal optimization of the overall design. The architecture preserves disciplinary design autonomy while providing a coordinating mechanism that leads to interdisciplinary agreement and improved designs. The basic formulation has been applied to a variety of sample design problems which demonstrate that the method successfully discovers correct optimal solutions. This work places collaborative optimization in the context of other multidisciplinary design optimization methods and characterizes problems for which the basic formulation is applicable. Artifacts of the problem formulation are discussed and methods for handling high bandwidth coupling, such as that found in aeroelasticity, are presented. The use of response surfaces for representing expensive analyses has become increasingly popular in design optimization. Response surfaces are smooth analytic functions that are inexpensive to evaluate and may be generated from data points obtained from the parallel execution of analyses. These properties motivate the introduction of response surfaces into collaborative optimization. Response surfaces have previously been used to model subproblem analyses and were generated just once. Here, approximate models are used to represent the subproblem optimization results, not the analysis, and are regenerated as the design is modified. The use of response surfaces in collaborative optimization requires an inexpensive method for generating the

  14. Optimization of LC-Orbitrap-HRMS acquisition and MZmine 2 data processing for nontarget screening of environmental samples using design of experiments.

    Science.gov (United States)

    Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias

    2016-11-01

    Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim to fill this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach and compared to a manual evaluation. The results indicate that short CT significantly improves the quality of automatic peak detection, which means that full scan acquisition without additional MS2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100 % of the peaks compared to manual peak detection at an intensity level of 10^5 in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable for environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data processing parameters.

  15. Optimal design criteria - prediction vs. parameter estimation

    Science.gov (United States)

    Waldl, Helmut

    2014-05-01

    G-optimality is a popular design criterion for optimal prediction; it tries to minimize the kriging variance over the whole design region. A G-optimal design minimizes the maximum variance of all predicted values. If we use kriging methods for prediction, it is self-evident to use the kriging variance as a measure of uncertainty for the estimates. However, computing the kriging variance, and even more so the empirical kriging variance, is very costly, and finding the maximum kriging variance in high-dimensional regions can be so time-demanding that in practice we cannot really find the G-optimal design with the computer equipment available today. We cannot always avoid this problem by using space-filling designs, because small designs that minimize the empirical kriging variance are often non-space-filling. D-optimality is the design criterion related to parameter estimation. A D-optimal design maximizes the determinant of the information matrix of the estimates. D-optimality in terms of trend parameter estimation and D-optimality in terms of covariance parameter estimation yield fundamentally different designs. The Pareto frontier of these two competing determinant criteria corresponds to designs that perform well under both criteria. Under certain conditions, searching for the G-optimal design on this Pareto frontier yields almost as good results as searching for the G-optimal design in the whole design region, while the maximum of the empirical kriging variance has to be computed only a few times. The method is demonstrated by means of a computer simulation experiment based on data provided by the Belgian institute Management Unit of the North Sea Mathematical Models (MUMM) that describe the evolution of inorganic and organic carbon and nutrients, phytoplankton, bacteria and zooplankton in the Southern Bight of the North Sea.
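
    For reference, the two competing criteria can be stated compactly in standard linear-model notation, with f(x) the regression vector and M(ξ) the information matrix of design ξ; this is the generic textbook form, not the paper's kriging-specific formulation.

```latex
% Generic definitions of the two criteria (standard linear-model notation)
\[
  d(x,\xi) = f(x)^{\top} M(\xi)^{-1} f(x), \qquad
  \xi^{*}_{G} = \arg\min_{\xi}\; \max_{x \in \mathcal{X}} d(x,\xi)
  \quad\text{(G-optimality: minimize the worst prediction variance)}
\]
\[
  \xi^{*}_{D} = \arg\max_{\xi}\; \det M(\xi)
  \quad\text{(D-optimality: maximize the information-matrix determinant)}
\]
```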

  16. Multicopter Design Optimization and Validation

    Directory of Open Access Journals (Sweden)

    Øyvind Magnussen

    2015-04-01

    Full Text Available This paper presents a method for optimizing the design of a multicopter unmanned aerial vehicle (UAV), also called a multirotor or drone. In practice, a set of datasheets is available to the designer for the various components such as the battery pack, motors and propellers. The designer cannot normally design the parameters of the actuator system freely, but is constrained to pick components based on the available datasheets. The mixed-integer programming approach is well suited to design optimization in such cases, when only a discrete set of components is available. The paper also includes an experimental section where the simulated dynamic responses of optimized designs are compared against the experimental results. The paper demonstrates that mixed-integer programming is well suited to design optimization of multicopter UAVs and that the modeling assumptions match well with the experimental validation.
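
    A hedged sketch of the component-selection idea: the paper formulates it as a mixed-integer program, whereas the toy version below simply enumerates small hypothetical catalogs and keeps the feasible combination with the best thrust-to-weight ratio.

```python
# Hedged sketch of discrete component selection: the paper solves this as a
# mixed-integer program; here tiny, hypothetical catalogs are simply enumerated
# and the feasible combination with the best thrust-to-weight ratio is kept.
from itertools import product

batteries = [("4S 3000 mAh", 0.32, 14.8), ("6S 4000 mAh", 0.55, 22.2)]   # name, kg, V
motors    = [("M1", 0.055, 800), ("M2", 0.080, 1200)]                    # name, kg, max thrust (g)
props     = [("9x4.5", 0.010), ("10x4.7", 0.012)]                        # name, kg

frame_kg, n_arms, required_ratio = 0.45, 4, 2.0

best = None
for (bn, bw, _), (mn, mw, thrust_g), (pn, pw) in product(batteries, motors, props):
    mass = frame_kg + bw + n_arms * (mw + pw)
    ratio = n_arms * thrust_g / 1000.0 / mass          # total thrust (kg) / mass (kg)
    if ratio >= required_ratio and (best is None or ratio > best[0]):
        best = (ratio, bn, mn, pn)

print("best feasible combination:", best)
```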

  17. Optimal Hospital Layout Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine

    This PhD project presents a design model that generates and evaluates hospital designs with respect to long-term performances and functionalities. By visualizing and quantifying costs and performances in the early design phases, it is possible to make design choices based on a qualified, profound foundation. The basis of the present study lies in solving the architectural design problem in order to respond to functionalities and performances. The emphasis is the practical applicability for architects, engineers and hospital planners for assuring usability and a holistic approach of functionalities and performances. By formal descriptions, a design model can weigh and compare the impact of different perspectives and, even in the early design phase, it can visualize and quantify consequences for design choices. By qualitative study of hospital design and hospital functionality, formal descriptions develop...

  18. Designing learning experiences

    National Research Council Canada - National Science Library

    Collins, Jannette

    2007-01-01

    Creation of significant learning experiences follows basic steps of instructional design related to situational factors, goals and objectives, feedback and evaluation methods, teaching and learning...

  19. Detailed design package for design of a video system providing optimal visual information for controlling payload and experiment operations with television

    Science.gov (United States)

    1975-01-01

    A detailed description of a video system for controlling space shuttle payloads and experiments is presented in the preliminary design review and critical design review, first and second engineering design reports respectively, and in the final report submitted jointly with the design package. The material contained in the four subsequent sections of the package contains system descriptions, design data, and specifications for the recommended 2-view system. Section 2 contains diagrams relating to the simulation test configuration of the 2-view system. Section 3 contains descriptions and drawings of the deliverable breadboard equipment. A description of the recommended system is contained in Section 4 with equipment specifications in Section 5.

  20. Parametric Optimization of Hospital Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning; Christoffersen, L.D.

    2013-01-01

    This paper presents a parametric performance-based design model for optimizing hospital design. The design model operates with geometric input parameters defining the functional requirements of the hospital and with input parameters in terms of performance objectives defining the design requirements and preferences of the hospital with respect to performances. The design model takes its point of departure in the hospital functionalities as a set of defined parameters and rules describing the design requirements and preferences.

  2. Methodized depiction of design of experiment for parameters optimization in synthesis of poly(N-vinylcaprolactam) thermoresponsive polymers

    Science.gov (United States)

    Mohammed, Marwah N.; Yusoh, Kamal Bin; Haji Shariffuddin, Jun Haslinda Binti

    2016-12-01

    Recently, stimuli-responsive polymers, including thermoresponsive polymers, have been widely investigated. In this study, the synthesis process parameters of poly(N-vinylcaprolactam) (PNVCL), a thermoresponsive polymer, were targeted for optimization. Response surface methodology (RSM) was employed to identify the factors with the greatest effect on the PNVCL conversion (%) yield. Four independent process variables, namely monomer concentration, initiator concentration, polymerization temperature and time, were studied. Combinations of polymerization factors, comprising a set of experimental runs, were examined using the Box-Behnken approach in Minitab 16. The study efficiently established the procedure and benefits of RSM for the estimation of the process response. The optimum values of the most significant variables (temperature and time) for maximum PNVCL conversion (%) yield were found to be ~80 °C and 92.5 min, respectively. Monomer and initiator concentrations had little effect on the (%) yield.

  3. Design and optimization of a gas-puff nozzle for staged Z-pinch experiments using computational fluid dynamics simulations

    Science.gov (United States)

    Valenzuela, J. C.; Krasheninnikov, I.; Beg, F. N.; Wessel, F.; Rahman, H.; Ney, P.; Presura, R.; McKee, E.; Darling, T.; Covington, A.

    2015-11-01

    Previous experimental work on staged Z-pinches demonstrated that gas liners can efficiently couple energy to, and uniformly implode, a plasma target. A 1.5 MA, 1 μs current driver was used to implode a magnetized Kr liner onto a D+ target, producing ~10^10 neutrons per shot and providing clear evidence of enhanced pinch stability. Time-of-flight data suggest that primary and secondary neutrons were produced. MHD simulations show that in Zebra, a 1.5 MA, 100 ns rise-time current driver, high fusion gain can be attained when the optimum liner and plasma target conditions are used. In this work we present the design and optimization of a liner-on-target nozzle to be fielded on Zebra to demonstrate high fusion gain at the 1 MA current level. The nozzle is composed of an annular high-atomic-number gas puff and an on-axis plasma gun that will deliver the ionized deuterium target. The nozzle optimization was carried out using the computational fluid dynamics (CFD) code Fluent and the MHD code Mach2. The CFD simulation produces density and temperature profiles, as a function of the nozzle shape and gas conditions, which are then used in Mach2 to find the optimum plasma liner implosion-pinch conditions. Funded by the US Department of Energy, ARPA-E, Control Number 1184-1527.

  4. Optimization of production of the anti-keratin 8 single-chain Fv TS1-218 in Pichia pastoris using design of experiments

    Directory of Open Access Journals (Sweden)

    Sundström Birgitta E

    2011-05-01

    Full Text Available Abstract Background Optimization of conditions during recombinant protein production for improved yield is a major goal for protein scientists. Typically this is achieved by changing single crucial factor settings one at a time, while other factors are kept fixed, through trial-and-error experimentation. This approach may introduce larger bias and fail to identify interactions between the factors, resulting in failure to find the true optimal conditions. Results In this study we have utilized design of experiments in order to identify optimal culture conditions with the aim of improving the final yield of the anti-keratin 8 scFv TS1-218 during expression in P. pastoris in shake flasks. The effects of pH, temperature and methanol concentration on the yield of TS1-218 using buffered minimal medium were investigated and a predictive model established. The results demonstrated that a higher starting pH and lower temperatures during induction significantly increased the yield of TS1-218. Furthermore, the results demonstrated increased biomass accumulation and cell viability at lower temperatures, which suggested that the higher yield of TS1-218 could be attributed to lower protease activity in the culture medium. The optimal conditions (pH 7.1, temperature 11°C and methanol concentration 1.2%) suggested by the predictive model yielded 21.4 mg TS1-218, which is a 21-fold improvement compared to the yield prior to optimization. Conclusion The results demonstrated that design of experiments can be utilized for a rapid optimization of initial culture conditions and that P. pastoris is highly capable of producing and secreting functional single-chain antibody fragments at temperatures as low as 11°C.

  5. Designing Urban Experiences

    DEFF Research Database (Denmark)

    Jantzen, Christian; Vetner, Mikael

    2008-01-01

    traditional urban planning aspects such as infrastructure, environmental factors and aesthetics, but has also dealt with the design of urban experiences. Through an introduction of the framework of the structure of experiences, this article examines how urban experiences can be understood and analysed...

  6. Experimenting with a design experiment

    Directory of Open Access Journals (Sweden)

    Bakker, Judith

    2012-12-01

    Full Text Available The design experiment is an experimental research method that aims to help design and further develop new (policy) instruments. For the development of a set of guidelines for the facilitation of citizens' initiatives by local governments, we are experimenting with this method. It offers good opportunities for modeling interventions by testing their instrumental validity – the usefulness for the intended practical purposes. At the same time, design experiments are also useful for evaluating the empirical validity of theoretical arguments and the further development of these arguments in the light of empirical evidence (by using, e.g., the technique of pattern matching). We describe how we have applied this methodology in two cases and discuss our research approach. We encountered some unexpected difficulties, especially in the cooperation with professionals and citizens. These difficulties complicate the valid attribution of causal effects to the use of the new instrument. However, our preliminary conclusion is that design experiments are useful in our field of study.

  7. Characterization and optimization of electrospun TiO2/PVP nanofibers using Taguchi design of experiment method

    Directory of Open Access Journals (Sweden)

    H. Albetran

    2015-09-01

    Full Text Available TiO2 nanofibers were prepared within polyvinylpyrrolidone (PVP) polymer using a combination of sol–gel and electrospinning techniques. Based on a Taguchi design of experiment (DoE) method, the effects of the sol–gel and electrospinning parameters on the TiO2/PVP nanofiber diameter, including titanium isopropoxide (TiP) concentration, flow rate, needle tip-to-collector distance, and applied voltage, were evaluated. The analysis of the DoE experiments for nanofiber diameters demonstrated that TiP concentration was the most significant factor. An optimum combination to obtain the smallest diameters with minimum variation was also determined for the electrospun TiO2/PVP nanofibers. The optimum combination was determined to be a 60% TiP concentration, a flow rate of 1 ml/h, a needle tip-to-collector distance of 11 cm (position a), and an applied voltage of 18 kV. This combination was further validated by conducting a confirmation experiment that used two different needles to study the effect of needle size. The average nanofiber diameter was approximately the same for both needle sizes, in good accordance with the optimum condition estimated by the Taguchi DoE method.

  8. On Adaptive Optimal Input Design

    NARCIS (Netherlands)

    Stigter, J.D.; Vries, D.; Keesman, K.J.

    2003-01-01

    The problem of optimal input design (OID) for a fed-batch bioreactor case study is solved recursively. Here an adaptive receding horizon optimal control problem, involving the so-called E-criterion, is solved on-line, using the current estimate of the parameter vector at each sample instant {tk, k =

  9. Optimal Design of Porous Materials

    DEFF Research Database (Denmark)

    Andreassen, Erik

    The focus of this thesis is topology optimization of material microstructures. That is, creating new materials, with attractive properties, by combining classic materials in periodic patterns. First, large-scale topology optimization is used to design complicated three-dimensional materials with ...

  10. Hydrodynamic Design Optimization Tool

    Science.gov (United States)

    2011-08-01

    techniques (Yang et al., 2008; Kim et al., 2010): (i) A modified NURBS technique is combined with a parametric global hull modification technique by varying the sectional area curve with a shifting method, in which the design variables in the NURBS technique can be reduced via a grouping method after...

  11. Optimized simultaneous saccharification and co-fermentation of rice straw for ethanol production by Saccharomyces cerevisiae and Scheffersomyces stipitis co-culture using design of experiments.

    Science.gov (United States)

    Suriyachai, Nopparat; Weerasaia, Khatiya; Laosiripojana, Navadol; Champreda, Verawat; Unrean, Pornkamol

    2013-08-01

    Herein, an ethanol production process from rice straw was optimized. Simultaneous saccharification and co-fermentation (SSCF) using a Saccharomyces cerevisiae and Scheffersomyces stipitis co-culture was carried out to enhance ethanol production. The optimal saccharification solid loading was 5%. Key fermentation parameters for the co-culture, including cell ratio, agitation rate and temperature, were rationally optimized using design of experiments (DoE). The optimized co-culture conditions for maximum ethanol production efficiency were a S. cerevisiae:S. stipitis cell ratio of 0.31, an agitation rate of 116 rpm and a temperature of 33.1°C. The optimized SSCF process reached an ethanol titer of 15.2 g/L and an ethanol yield of 99% of the theoretical yield, consistent with the DoE model prediction. Moreover, the SSCF process under high biomass concentration resulted in a high ethanol concentration of 28.6 g/L. This work suggests the efficiency and scalability of the developed SSCF process, which could provide an important basis for the economic feasibility of ethanol production from lignocelluloses. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Optimal design of experiments applied to headspace solid phase microextraction for the quantification of vicinal diketones in beer through gas chromatography-mass spectrometric detection.

    Science.gov (United States)

    Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C

    2015-08-05

    Vicinal diketones (VDK), namely diacetyl (DC) and pentanedione (PN), are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer they are responsible for off-flavors (buttery flavor), and therefore their presence and quantification are of paramount importance to beer producers. Aiming at developing an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through experiments planned in an optimal way, and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied for HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: use of a CAR/PDMS fiber, 5 mL of sample in a 20 mL vial, 5 min of pre-incubation time followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using a matrix-matched calibration, in order to minimize matrix effects. The following key features were obtained: linearity (R^2 > 0.999, both for diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg/L and 2.80 μg/L, and LOQ of 3.30 μg/L and 10.01 μg/L, for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100% and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 μg/L.
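
    The reported LOD and LOQ values are typically derived from the matrix-matched calibration line; a hedged sketch of that calculation using the common 3.3σ/slope and 10σ/slope convention (the calibration data below are invented, not the paper's):

```python
# Hedged sketch: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope from a
# matrix-matched calibration line (common ICH-style convention). Data are invented.
import numpy as np

conc = np.array([5, 10, 25, 50, 100, 200], dtype=float)   # spiked levels, ug/L
area = np.array([0.9, 2.1, 5.2, 10.3, 20.6, 41.0])        # peak areas (placeholder)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                              # sd of regression residuals

print("LOD =", 3.3 * sigma / slope, "ug/L")
print("LOQ =", 10.0 * sigma / slope, "ug/L")
```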

  13. Embedded Systems Design: Optimization Challenges

    DEFF Research Database (Denmark)

    Pop, Paul

    2005-01-01

    The number of embedded systems in use has become larger than the number of humans on the planet. The complexity of embedded systems is growing at a very high pace and the constraints in terms of functionality, performance, low energy consumption, reliability, cost and time-to-market are getting tighter. Therefore, the task of designing such systems is becoming increasingly important and difficult at the same time. New automated design optimization techniques are needed, which are able to: successfully manage the complexity of embedded systems, meet the constraints imposed by the application domain, shorten the time-to-market, and reduce development and manufacturing costs. In this paper, the author introduces several embedded systems design problems, and shows how they can be formulated as optimization problems. Solving such challenging design optimization problems is the key to the success of the embedded systems design...

  14. A Study of the Effect of Gold Thickness Distribution in the Jet Plating Process to Optimize Gold Usage and Plating Voltage Using Design of Experiments

    Directory of Open Access Journals (Sweden)

    Aramphongphun Chuckaphun

    2016-01-01

    Full Text Available A gold plating process in the electronics industry can be classified as (i) all-surface plating or (ii) selective plating. Selective plating is more widely used than all-surface plating because it saves more of the gold used in the plating process and takes less plating time. In this research, the selective plating process called jet plating was studied. Factors that possibly affect the gold usage and plating voltage were also studied to reduce the production cost. These factors included (a) plating temperature, (b) crystal (inhibitor) amount, (c) distance between workpiece and anode, (d) plating current and (e) plating speed. A two-level full factorial design with center points was first performed to screen the factors. A central composite design (CCD) was then employed to optimize the factors in jet plating. The amount of gold usage should be reduced to 0.366 g / 10,000 pieces, the plating speed should be increased to 4 m/min and the plating voltage should not exceed 8.0 V. According to the analysis, the optimal settings should be as follows: a plating temperature of 55.5 °C, a crystal amount of 90%, a distance of 0.5 mm, a plating current of 2.8 A, and a plating speed of 4.5 m/min. This optimal setting led to a gold usage of 0.350 g / 10,000 pieces and a plating voltage of 7.16 V. Confirmation runs of 30 experiments at the optimal conditions were then performed. It was found that the gold usage and the plating voltage of the confirmation runs were not different from the optimized gold usage and plating voltage. The optimal condition was then applied in production, which could reduce the gold usage by 4.5% and increase the plating speed by 12.5% while the plating voltage did not exceed the limit.
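
    As a hedged illustration of the second-stage design, the coded runs of a face-centred central composite design for two factors can be generated as follows (the pairing with plating current and speed is hypothetical, and the mapping to actual settings is not taken from the paper):

```python
# Hedged sketch: coded runs of a face-centred central composite design (CCD)
# for two factors, e.g. plating current and plating speed (hypothetical pairing).
from itertools import product

def ccd_two_factors(alpha=1.0, n_center=3):
    factorial = [list(p) for p in product([-1.0, 1.0], repeat=2)]       # corner runs
    axial = [[-alpha, 0.0], [alpha, 0.0], [0.0, -alpha], [0.0, alpha]]  # star runs
    center = [[0.0, 0.0]] * n_center                                    # replicated centers
    return factorial + axial + center

for run in ccd_two_factors():
    print(run)   # 4 factorial + 4 axial + 3 center = 11 coded runs
```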

  15. Democratic design experiments

    DEFF Research Database (Denmark)

    Ehn, Pelle; Brandt, Eva; Halse, Joachim

    2016-01-01

    we here see design engagements which are both controversial in their commitment to agendas of social change and experimental in the sense that they openly probe for what can possibly be enacted. In this conversation we want to explore how such engagements may be seen as democratic design experiments...

  17. Optimization-Based Layout Design

    Directory of Open Access Journals (Sweden)

    K. Abdel-Malek

    2005-01-01

    Full Text Available The layout problem is of importance to ergonomists, vehicle/cockpit packaging engineers, designers of manufacturing assembly lines, designers concerned with the placement of levers, knobs, controls, etc. in the reachable workspace of a human, and also to users of digital human modeling code, where digital prototyping has become a valuable tool. This paper proposes a hybrid optimization method (gradient-based optimization and simulated annealing) to obtain the layout design. We implemented the proposed algorithm for a project at Oral-B Laboratories, where a manufacturing cell involves an operator who handles three objects, some with the left hand, others with the right hand.
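
    A minimal sketch of the hybrid idea, assuming a toy reach/overlap cost in place of the paper's ergonomic model: simulated annealing explores placements globally and a gradient-based step refines the best one.

```python
# Hedged sketch of a hybrid layout optimizer: simulated annealing explores
# placements globally, then a gradient-based step (scipy) refines the best one.
# The cost function is a toy placeholder for reachability/overlap penalties.
import math, random
import numpy as np
from scipy.optimize import minimize

targets = np.array([[0.30, 0.40], [0.45, 0.25], [0.55, 0.45]])  # preferred spots (hypothetical)

def cost(x):
    pts = x.reshape(-1, 2)
    reach = ((pts - targets) ** 2).sum()                 # distance to preferred spots
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    overlap = np.sum(np.clip(0.10 - d[np.triu_indices(len(pts), 1)], 0, None) ** 2)
    return reach + 100.0 * overlap                       # penalize objects closer than 0.10 m

x = np.random.rand(6)                                    # 3 objects x (x, y)
best, T = x.copy(), 1.0
for _ in range(2000):                                    # simulated annealing phase
    cand = x + np.random.normal(0, 0.05, size=x.shape)
    if cost(cand) < cost(x) or random.random() < math.exp(-(cost(cand) - cost(x)) / T):
        x = cand
    if cost(x) < cost(best):
        best = x.copy()
    T *= 0.998

refined = minimize(cost, best, method="BFGS")            # gradient-based polish
print("layout:", refined.x.reshape(-1, 2), "cost:", refined.fun)
```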

  18. Global optimization of the infrared matrix-assisted laser desorption electrospray ionization (IR MALDESI) source for mass spectrometry using statistical design of experiments.

    Science.gov (United States)

    Barry, Jeremy A; Muddiman, David C

    2011-12-15

    Design of experiments (DOE) is a systematic and cost-effective approach to system optimization by which the effects of multiple parameters and parameter interactions on a given response can be measured in few experiments. Herein, we describe the use of statistical DOE to improve a few of the analytical figures of merit of the infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) source for mass spectrometry. In a typical experiment, bovine cytochrome c was ionized via electrospray, and equine cytochrome c was desorbed and ionized by IR-MALDESI such that the ratio of equine:bovine was used as a measure of the ionization efficiency of IR-MALDESI. This response was used to rank the importance of seven source parameters including flow rate, laser fluence, laser repetition rate, ESI emitter to mass spectrometer inlet distance, sample stage height, sample plate voltage, and the sample to mass spectrometer inlet distance. A screening fractional factorial DOE was conducted to designate which of the seven parameters induced the greatest amount of change in the response. These important parameters (flow rate, stage height, sample to mass spectrometer inlet distance, and laser fluence) were then studied at higher resolution using a full factorial DOE to obtain the globally optimized combination of parameter settings. The optimum combination of settings was then compared with our previously determined settings to quantify the degree of improvement in detection limit. The limit of detection for the optimized conditions was approximately 10 attomoles compared with 100 femtomoles for the previous settings, which corresponds to a four orders of magnitude improvement in the detection limit of equine cytochrome c.

  19. A Designed Experiments Approach to Optimizing MALDI-TOF MS Spectrum Processing Parameters Enhances Detection of Antibiotic Resistance in Campylobacter jejuni

    Directory of Open Access Journals (Sweden)

    Christian ePenny

    2016-05-01

    Full Text Available MALDI-TOF MS has been utilized as a reliable and rapid tool for microbial fingerprinting at the genus and species levels. Recently, there has been keen interest in using MALDI-TOF MS beyond the genus and species levels to rapidly identify antibiotic resistant strains of bacteria. The purpose of this study was to enhance strain level resolution for Campylobacter jejuni through the optimization of spectrum processing parameters using a series of designed experiments. A collection of 172 strains of C. jejuni was collected from Luxembourg, New Zealand, North America, and South Africa, consisting of four groups of antibiotic resistant isolates. The groups included: (1) 65 strains resistant to cefoperazone, (2) 26 resistant to cefoperazone and beta-lactams, (3) 5 strains resistant to cefoperazone, beta-lactams, and tetracycline, and (4) 76 strains resistant to cefoperazone, teicoplanin, amphotericin B and cephalothin. Initially, a model set of 16 strains (three biological replicates and three technical replicates per isolate, yielding a total of 144 spectra) of C. jejuni was subjected to each designed experiment to enhance detection of antibiotic resistance. The most optimal parameters were applied to the larger collection of 172 isolates (two biological replicates and three technical replicates per isolate, yielding a total of 1,031 spectra). We observed an increase in antibiotic resistance detection whenever either a curve-based similarity coefficient (Pearson or ranked Pearson) was applied rather than a peak-based one (Dice) and/or the optimized preprocessing parameters were applied. Increases in antimicrobial resistance detection were scored using the jackknife maximum similarity technique following cluster analysis. From the first four groups of antibiotic resistant isolates, the optimized preprocessing parameters increased detection respective to the aforementioned groups by: (1) five percent, (2) nine percent, (3) ten percent, and (4) two percent. An additional second

  20. A Designed Experiments Approach to Optimizing MALDI-TOF MS Spectrum Processing Parameters Enhances Detection of Antibiotic Resistance in Campylobacter jejuni.

    Science.gov (United States)

    Penny, Christian; Grothendick, Beau; Zhang, Lin; Borror, Connie M; Barbano, Duane; Cornelius, Angela J; Gilpin, Brent J; Fagerquist, Clifton K; Zaragoza, William J; Jay-Russell, Michele T; Lastovica, Albert J; Ragimbeau, Catherine; Cauchie, Henry-Michel; Sandrin, Todd R

    2016-01-01

    MALDI-TOF MS has been utilized as a reliable and rapid tool for microbial fingerprinting at the genus and species levels. Recently, there has been keen interest in using MALDI-TOF MS beyond the genus and species levels to rapidly identify antibiotic resistant strains of bacteria. The purpose of this study was to enhance strain level resolution for Campylobacter jejuni through the optimization of spectrum processing parameters using a series of designed experiments. A collection of 172 strains of C. jejuni was collected from Luxembourg, New Zealand, North America, and South Africa, consisting of four groups of antibiotic resistant isolates. The groups included: (1) 65 strains resistant to cefoperazone (2) 26 resistant to cefoperazone and beta-lactams (3) 5 strains resistant to cefoperazone, beta-lactams, and tetracycline, and (4) 76 strains resistant to cefoperazone, teicoplanin, amphotericin B and cephalothin. Initially, a model set of 16 strains (three biological replicates and three technical replicates per isolate, yielding a total of 144 spectra) of C. jejuni was subjected to each designed experiment to enhance detection of antibiotic resistance. The most optimal parameters were applied to the larger collection of 172 isolates (two biological replicates and three technical replicates per isolate, yielding a total of 1,031 spectra). We observed an increase in antibiotic resistance detection whenever either a curve-based similarity coefficient (Pearson or ranked Pearson) was applied rather than a peak-based one (Dice) and/or the optimized preprocessing parameters were applied. Increases in antimicrobial resistance detection were scored using the jackknife maximum similarity technique following cluster analysis. From the first four groups of antibiotic resistant isolates, the optimized preprocessing parameters increased detection respective to the aforementioned groups by: (1) 5% (2) 9% (3) 10%, and (4) 2%. An additional second categorization was created from the

  1. Mullite Plasma Spraying for In Situ Repair of Cracks in Mullite Refractories: Simultaneous Optimization of Porosity and Thickness by Statistical Design of Experiments

    Science.gov (United States)

    Schrijnemakers, A.; Francq, B. G.; Cloots, R.; Vertruyen, B.; Boschini, F.

    2013-10-01

    We report a laboratory-scale study of the suitability of the plasma spraying process for "in situ" repair of cracks in mullite refractories of industrial furnaces. The "design of experiments" approach is used to investigate how the coating porosity and thickness are influenced by six experimental parameters. Arc current, secondary gas (H2) flow rate, and stand-off distance are the most significant parameters for both responses. Several interaction terms also significantly affect the thickness response. The validity of the model equations is discussed both from a statistical point of view and regarding the physical credibility of the main model terms. Additional experiments confirm that the measured properties lie within the prediction intervals provided by the model. Using a set of parameters optimized for minimal porosity and high thickness (relevant for the crack repair application), coatings with 6% porosity and 1070 μm thickness can be prepared reproducibly.

  2. Design optimization for cost and quality: The robust design approach

    Science.gov (United States)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach to design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The Robust Design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly smaller number of experiments. Robust Design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose here is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
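
    A hedged sketch of the signal-to-noise calculation used to score each orthogonal-array run (larger-the-better and smaller-the-better forms shown; the replicate data are placeholders):

```python
# Hedged sketch: Taguchi-style signal-to-noise ratios for scoring each run of
# an orthogonal array. Replicate measurements per run are placeholders.
import math

def sn_larger_the_better(ys):
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

def sn_smaller_the_better(ys):
    return -10.0 * math.log10(sum(y ** 2 for y in ys) / len(ys))

runs = {  # run id -> replicate responses under varied noise conditions
    1: [42.0, 45.1, 39.8],
    2: [55.2, 53.9, 56.4],
    3: [48.7, 47.5, 50.2],
}
for run_id, ys in runs.items():
    print(run_id, round(sn_larger_the_better(ys), 2), "dB")
# The control-factor setting whose runs give the highest S/N is the most robust.
```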

  3. Telemanipulator design and optimization software

    Science.gov (United States)

    Cote, Jean; Pelletier, Michel

    1995-12-01

    For many years, industrial robots have been used to execute specific repetitive tasks. In those cases, the optimal configuration and location of the manipulator have to be found only once. The optimal configuration or position was often found empirically according to the tasks to be performed. In telemanipulation, the nature of the tasks to be executed is much wider and can be very demanding in terms of dexterity and workspace. The position/orientation of the robot's base could be required to move during the execution of a task. At present, the choice of the initial position of the teleoperator is usually found empirically, which can be sufficient in the case of an easy or repetitive task. In the converse situation, the amount of time wasted moving the teleoperator support platform has to be taken into account during the execution of the task. Automatic optimization of the position/orientation of the platform or a better designed robot configuration could minimize these movements and save time. This paper will present two algorithms. The first algorithm is used to optimize the position and orientation of a given manipulator (or manipulators) with respect to the environment in which a task has to be executed. The second algorithm is used to optimize the position or the kinematic configuration of a robot. For this purpose, the tasks to be executed are digitized using a position/orientation measurement system and a compact representation based on special octrees. Given a digitized task, the optimal position or Denavit-Hartenberg configuration of the manipulator can be obtained numerically. Constraints on the robot design can also be taken into account. A graphical interface has been designed to facilitate the use of the two optimization algorithms.

  4. Heat Sink Design and Optimization

    Science.gov (United States)

    2015-12-01

    Subject terms: natural convection, radiation, design, modeling, optimization. Geometry values recoverable from the report: heat sink Hs = 3.94 in., width Ws = 5.42 in.; fin height Hf = 0.98 in. Different fin thicknesses (tf) were considered; the next parameter considered was fin height, Hf, where a smaller height has a negative influence on the overall heat sink performance.

  5. System deployment optimization in architecture design

    Institute of Scientific and Technical Information of China (English)

    Xiaoxue Zhang; Shu Tang; Aimin Luo; Xueshan Luo

    2014-01-01

    Optimization of architecture design has recently drawn research interest. System deployment optimization (SDO) refers to the process of optimizing systems that are being deployed to activities. This paper first formulates a mathematical model to theorize and operationalize the SDO problem and then identifies optimal solutions to solve the SDO problem. In the solutions, the success rate of the combat task is maximized, whereas the execution time of the task and the cost of changes in the system structure are minimized. The presented optimized algorithm generates an optimal solution without the need to check the entire search space. A novel method is finally proposed based on the combination of a heuristic method and a genetic algorithm (HGA), as well as the combination of a heuristic method and particle swarm optimization (HPSO). Experiment results show that the HPSO method generates solutions faster than particle swarm optimization (PSO) and the genetic algorithm (GA) in terms of execution time and performs more efficiently than the heuristic method in terms of determining the best solution.

  6. Research on Design Optimization Strategy in Virtual Product Development

    Institute of Scientific and Technical Information of China (English)

    潘军; 韩帮军; 范秀敏; 马登哲

    2004-01-01

Simulation and optimization are the key points of virtual product development (VPD). Traditional engineering simulation software and optimization methods are inadequate for such optimization problems because of their computational inefficiency. A systematic design optimization strategy using statistical methods and mathematical optimization technologies is proposed. This method extends design of experiments (DOE) and simulation metamodel technologies. Metamodels are built in place of detailed simulation codes based on an effective DOE, and are then linked to optimization routines for fast analysis, or serve as a bridge for integrating simulation software across different domains. A design optimization of a composite material structure is used to demonstrate the newly introduced methodology.
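
    The DOE-plus-metamodel workflow described above can be illustrated with a minimal sketch: sample a small factorial design, fit a quadratic response surface by least squares, and hand the cheap surrogate to an optimizer. The toy simulation function, the two coded factors, and the quadratic model form are illustrative assumptions, not the authors' composite-structure model.

```python
import numpy as np
from itertools import product
from scipy.optimize import minimize

def simulation(x):
    """Stand-in for an expensive simulation code (assumed toy function)."""
    x1, x2 = x
    return (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.1) ** 2 + 0.5 * x1 * x2

# 1) Design of experiments: a 3-level full factorial on [-1, 1]^2.
levels = [-1.0, 0.0, 1.0]
X = np.array(list(product(levels, levels)))
y = np.array([simulation(x) for x in X])

# 2) Fit a quadratic response-surface metamodel by least squares.
def features(x):
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

A = np.array([features(x) for x in X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

def metamodel(x):
    return features(x) @ beta

# 3) Optimize the cheap metamodel instead of the simulation code.
res = minimize(metamodel, x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)])
print("metamodel optimum:", res.x, "predicted:", res.fun)
print("simulation at that point:", simulation(res.x))
```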

  7. Partial least squares model and design of experiments toward the analysis of the metabolome of Jatropha gossypifolia leaves: Extraction and chromatographic fingerprint optimization.

    Science.gov (United States)

    Pilon, Alan Cesar; Carnevale Neto, Fausto; Freire, Rafael Teixeira; Cardoso, Patrícia; Carneiro, Renato Lajarim; Da Silva Bolzani, Vanderlan; Castro-Gamboa, Ian

    2016-03-01

A major challenge in metabolomic studies is how to extract and analyze an entire metabolome. So far, no single method has been able to accomplish this task in an efficient and reproducible way. In this work we propose a sequential strategy for the extraction and chromatographic separation of metabolites from leaves of Jatropha gossypifolia using a design of experiments and a partial least squares model. The effect of 14 different solvents on the extraction process was evaluated, and an optimized separation condition on liquid chromatography was estimated considering mobile phase composition and analysis time. The initial conditions of extraction using methanol and separation in 30 min between 5 and 100% water/methanol (1:1 v/v) with 0.1% of acetic acid, 20 μL sample volume, 3.0 mL min(-1) flow rate and 25°C column temperature led to 107 chromatographic peaks. After the optimization strategy using i-propanol/chloroform (1:1 v/v) for extraction, a linear gradient elution of 60 min between 5 and 100% water/(acetonitrile/methanol 68:32 v/v with 0.1% of acetic acid), 30 μL sample volume, 2.0 mL min(-1) flow rate, and 30°C column temperature, we detected 140 chromatographic peaks, 30.84% more than with the initial method. This is a reliable strategy using a limited number of experiments for metabolomics protocols.

  8. Design of a video system providing optimal visual information for controlling payload and experiment operations with television

    Science.gov (United States)

    1975-01-01

A program was conducted which included the design of a set of simplified simulation tasks, the design of apparatus and breadboard TV equipment for task performance, and the implementation of a number of simulation tests. Performance measurements were made under controlled conditions and the results analyzed to permit evaluation of the relative merits (effectiveness) of various TV systems. Burden factors were subsequently generated for each TV system to permit tradeoff evaluation of system characteristics against performance. For the general remote operation mission, the 2-view system is recommended. This system was characterized and the corresponding equipment specifications were generated.

  9. Using Design of Experiments and Response Surface Methodology as an Approach to Understand and Optimize Operational Air Power

    Science.gov (United States)

    2010-06-01

[Only fragmentary reference text is available for this record: Bendel, A. (1988), "Introduction to Taguchi Methodology," Taguchi Methods: Proceedings of the 1988 European Conference, London, Elsevier Applied Science; ... Progress, pp. 74-75, April 1990; Kackar, R.N. (1985), "Off-Line Quality Control, Parameter Design, and the Taguchi Method," Journal of Quality ...; Roy, Ranjit R. (1990), A Primer on the Taguchi Method, New York, Van Nostrand Reinhold; Unal, R., Stanley, D.O. and Joyner, R. (1993), Propulsion ...]

  10. Design optimization of deployable wings

    Science.gov (United States)

    Gaddam, Pradeep

Morphing technology is an important aspect of UAV design, particularly in regard to deployable systems. The design of such a system has an important impact on the vehicle's performance. The primary focus of the present research work was to determine the optimal deployable wing design from three competing designs and to develop one of the deployable wing designs for testing in the research facility. A Matlab code was developed to optimize three deployable wing concepts (inflatable, inflatable telescopic, and rigid-folding wings) based on a sequential optimization strategy. The constraints in the code include the packaging constraints during the stowed state, the fixed length of the deployed section, and the minimum L/D constraint. The code determined the optimum weight of all three designs; the lightest was the inflatable wing design, a result of the flexible skin material and the absence of rigid parts in the deployed wing section. Another goal of the research involved developing an inflatable telescopic wing. The prototype was tested in a wind tunnel, while the actual wing was tested in the altitude chamber to determine the deployment speed and input pressure, and to analyze and predict the deployment sequence and behavior of the wing at such high wind speeds and at altitudes ranging from 60,000 ft to 90,000 ft. Results from these tests allowed us to conclude that the deployment sequence of the telescopic wing proceeded from the root to the tip section. The results were used to analyze the deployment time of the wing. As expected, the deployment time decreased with an increase in input pressure. The results also show that as the altitude increases, the deployment speed of the wing also increases. This was demonstrated when the wing was tested at a maximum altitude pressure of 90,000 ft, well above the design altitude of 60,000 ft.

  11. Optimization of supercoiled HPV-16 E6/E7 plasmid DNA purification with arginine monolith using design of experiments.

    Science.gov (United States)

    Almeida, A M; Queiroz, J A; Sousa, F; Sousa, A

    2015-01-26

The progress of DNA vaccines is dependent on the development of suitable chromatographic procedures to successfully purify genetic vectors, such as plasmid DNA. Human Papillomavirus is associated with the development of tumours due to the oncogenic power of the E6 and E7 proteins produced by this virus. The supercoiled HPV-16 E6/E7 plasmid-based vaccine was recently purified with the arginine monolith with 100% purity, but only 39% recovery was achieved. Therefore, the present study describes the application of experimental design tools, a newly explored methodology in preparative chromatography, in order to improve the supercoiled plasmid DNA recovery with the arginine monolith while maintaining the high degree of purity. In addition, the importance and influence of pH on pDNA retention by the arginine ligand was also demonstrated. The face-centred central composite design was validated and the recovery of the target molecule was successfully improved from 39% to 83.5%, more than doubling recovery while maintaining 100% purity.
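
    A face-centred central composite design of the kind named above is easy to construct in coded units; a minimal sketch follows. The number of factors, the centre-point replication, and the example factor interpretation are illustrative assumptions, since the abstract does not list the design variables.

```python
import numpy as np
from itertools import product

def face_centered_ccd(k, n_center=3):
    """Face-centred central composite design (alpha = 1) in coded units for k factors."""
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))                       # 2^k factorial points
    axial = np.vstack([sign * row for row in np.eye(k) for sign in (1.0, -1.0)])   # face-centre (axial) points
    center = np.zeros((n_center, k))                                               # replicated centre points
    return np.vstack([corners, axial, center])

# Example: a two-factor design (factor meanings are placeholders, e.g. elution pH and gradient slope).
print(face_centered_ccd(k=2))
```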

  12. Optimal Design of Stiffeners for Bucket Foundations

    OpenAIRE

    Courtney, William Tucker; Stolpe, Mathias; Buhl, Thomas; Bitsche, Robert; Hallum, Nicolai; Nielsen, Søren A.

    2015-01-01

    The potential for structural optimization of the bucket foundation’s outer stiffeners is investigated using commercial optimization software. In order to obtain the optimal design both shape and topology optimization problems are formulated and solved using the structural optimization software Tosca Structure coupled with the finite element software Abaqus. The solutions to these optimization problems are then manually interpreted as a new design concept. Results show that shape optimization ...

  13. An Investigation of the Optimal Cutting Conditions in Coconut Wood and Palmyra Palm Wood Turning Process Using Design of Experiment

    Directory of Open Access Journals (Sweden)

    Surasit RAWANGWONG

    2013-06-01

The purpose of this research is to investigate the effect of factors on the surface roughness in the coconut wood and palmyra palm wood turning process for manufacturing furniture parts using carbide cutting tools. The main factors, namely cutting speed, feed rate and depth of cut, were investigated for the optimum surface roughness in the furniture manufacturing process. Normally, an acceptable surface roughness is between 3.0 and 9.0 µm before the sanding process. The results of a preliminary trial show that the depth of cut had no effect on surface roughness. Moreover, it was found from the experiment that the factors affecting surface roughness were cutting speed and feed rate, with a tendency for the roughness value to decrease at lower feed rate and greater cutting speed. Therefore, in the turning process of coconut wood, the cutting condition can be determined by means of the equation Ra = 3.90 - 0.00375 (cutting speed) + 7.93 (feed rate). This equation is best used within a cutting speed of 170 - 353 m/min, a feed rate of 0.05 - 0.16 mm/rev and a depth of cut of 1 mm. In the turning process of palmyra palm wood, the corresponding equation is Ra = 3.38 - 0.00164 (cutting speed) + 10.8 (feed rate), best used within a cutting speed of 188 - 392 m/min, a feed rate of 0.05 - 0.12 mm/rev and a maximum depth of cut of 1 mm. To confirm the experimental result, the equation values were compared with actual values by estimating the prediction error of the surface roughness, with a margin of error of not over 10%. The mean absolute percentage error of the surface roughness equation is 4.28% for coconut wood and 3.47% for palmyra palm wood, which is less than the allowed error value and is acceptable.
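
    The two regression equations quoted above are simple enough to evaluate directly; the sketch below scans the stated valid ranges for coconut wood and reports the smoothest predicted finish. The grid resolution is arbitrary, and the helper function names are not from the paper.

```python
import numpy as np

def ra_coconut(speed_m_min, feed_mm_rev):
    # Ra = 3.90 - 0.00375*(cutting speed) + 7.93*(feed rate); valid for 170-353 m/min, 0.05-0.16 mm/rev
    return 3.90 - 0.00375 * speed_m_min + 7.93 * feed_mm_rev

def ra_palmyra(speed_m_min, feed_mm_rev):
    # Ra = 3.38 - 0.00164*(cutting speed) + 10.8*(feed rate); valid for 188-392 m/min, 0.05-0.12 mm/rev
    return 3.38 - 0.00164 * speed_m_min + 10.8 * feed_mm_rev

# Scan the stated valid range for coconut wood and report the smoothest predicted surface finish.
speeds = np.linspace(170, 353, 8)
feeds = np.linspace(0.05, 0.16, 8)
grid = [(ra_coconut(s, f), s, f) for s in speeds for f in feeds]
best = min(grid)
print("coconut wood: min predicted Ra = %.2f um at %.0f m/min, %.3f mm/rev" % best)
```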

  14. Analysis and optimization of saturation transfer difference NMR experiments designed to map early self-association events in amyloidogenic peptides.

    Science.gov (United States)

    Huang, Hao; Milojevic, Julijana; Melacini, Giuseppe

    2008-05-08

    Saturation transfer difference (STD) methods recently have been proposed to be a promising tool for self-recognition mapping at residue and atomic resolution in amyloidogenic peptides. Despite the significant potential of the STD approach for systems undergoing oligomer/monomer (O/M) equilibria, a systematic analysis of the possible artifacts arising in this novel application of STD experiments is still lacking. Here, we have analyzed the STD method as applied to O/M peptides, and we have identified three major sources of possible biases: offset effects, intramonomer cross-relaxation, and partial spin-diffusion within the oligomers. For the purpose of quantitatively assessing these artifacts, we employed a comparative approach that relies on 1-D and 2-D STD data acquired at different saturation frequencies on samples with different peptide concentrations and filtration states. This artifact evaluation protocol was applied to the Abeta(12-28) model system, and all three types of artifacts appear to affect the measured STD spectra. In addition, we propose a method to minimize the biases introduced by these artifacts in the Halpha STD distributions used to obtain peptide self-recognition maps at residue resolution. This method relies on the averaging of STD data sets acquired at different saturation frequencies and provides results comparable to those independently obtained through other NMR pulse sequences that probe oligomerization, such as nonselective off-resonance relaxation experiments. The artifact evaluation protocol and the multiple frequencies averaging strategy proposed here are of general utility for the growing family of amyloidogenic peptides, as they provide a reliable analysis of STD spectra in terms of polypeptide self-recognition epitopes.

  15. Particle Swarm Optimization for Outdoor Lighting Design

    Directory of Open Access Journals (Sweden)

    Ana Castillo-Martinez

    2017-01-01

Outdoor lighting is an essential service for modern life. However, the strong influence of this type of facility on energy consumption makes it necessary to take extra care in the design phase. This manuscript therefore describes an algorithm to help lighting designers obtain, in an easy way, the best configuration parameters and to improve energy efficiency while ensuring a minimum level of overall uniformity. To make this possible, we used a particle swarm optimization (PSO) algorithm. These algorithms are well established, and are simple and effective for solving optimization problems. To take into account the parameters with the greatest influence on lighting and energy efficiency, 500 simulations were performed using DIALux software (4.10.0.2, DIAL, Ludenscheid, Germany). Next, the relation between these parameters was studied using data mining software. Subsequently, two experiments were conducted to set the parameters that gave the best algorithm configuration and thereby improve the efficiency of the proposed optimization process.
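
    For readers unfamiliar with PSO, a minimal hand-rolled version is sketched below. The two design variables, their bounds, and the energy-versus-uniformity objective are placeholder assumptions standing in for the DIALux-derived relationships; only the PSO update rule itself is the point of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Assumed stand-in: an energy term plus a penalty when overall uniformity drops below 0.4."""
    pole_spacing, mounting_height = x
    power = 100.0 / (pole_spacing * mounting_height)        # toy energy indicator
    uniformity = min(1.0, mounting_height / pole_spacing)   # toy uniformity indicator
    return power + 10.0 * max(0.0, 0.4 - uniformity)

bounds = np.array([[20.0, 50.0],   # pole spacing (m), placeholder range
                   [6.0, 12.0]])   # mounting height (m), placeholder range

n_particles, n_iter = 30, 100
w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients

x = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, 2))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
    x = np.clip(x + v, bounds[:, 0], bounds[:, 1])              # position update within bounds
    f = np.array([objective(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best configuration:", gbest, "objective:", pbest_f.min())
```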

  16. Study of optimization condition for spin coating of the photoresist film on rectangular substrate by Taguchi design of an experiment

    Directory of Open Access Journals (Sweden)

    Nithi Atthi

    2009-08-01

There are four parameters concerning the spin coating of a positive photoresist film. This paper focuses on spin coating of the positive photoresist Clariant AZ-P4620 on a 2x7 cm rectangular substrate. By means of the Taguchi L16(4^4) method, the number of experiments can be reduced from 256 to 16. By analyzing the main effects plot of the signal-to-noise ratio, it is found that the most suitable values of the four parameters giving the desired thickness and uniformity are a photoresist dispense time of 13 seconds, spinning at a speed of 700 rpm for 5 seconds, and then accelerating at 2,000 rpm per second to 4,000 rpm. The speed is maintained at 4,000 rpm for 60 seconds with an exhaust pressure of 300 Pa. The substrate is later baked at 100 °C for 90 seconds. The calculated thickness of the final film is 48,107.70 ± 1,096 Angstroms. The analysis of the deviations shows that no parameter has a significant effect on the thickness and uniformity of the final photoresist film at a confidence level of 95%. This DOE can be used in many applications in the micro and nano fabrication industry.
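
    The Taguchi analysis referred to above ranks runs (and, by averaging, factor levels) by their signal-to-noise ratio; for a thickness target this is the nominal-the-best form, 10·log10(mean²/variance). The sketch below computes it for a few hypothetical replicate measurements, which are invented purely for illustration.

```python
import numpy as np

def sn_nominal_the_best(y):
    """Taguchi nominal-the-best signal-to-noise ratio, 10*log10(mean^2 / variance)."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical replicate thickness measurements (Angstroms) for three of the 16 runs.
runs = {
    "run 1": [48110, 47950, 48320],
    "run 2": [46890, 47400, 45980],
    "run 3": [48050, 48180, 47990],
}
for name, thickness in runs.items():
    print(name, "S/N = %.1f dB" % sn_nominal_the_best(thickness))
# The run (and, averaged over runs, the factor level) with the highest S/N gives
# the most uniform film around the target thickness.
```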

  17. Fire Risk Analysis and Optimization of Fire Prevention Management for Green Building Design and High Rise Buildings: Hong Kong Experience

    Directory of Open Access Journals (Sweden)

    Yau Albert

    2014-12-01

There are many iconic high-rise buildings in Hong Kong, for example the International Commercial Centre, the International Financial Centre, etc. Fire safety issues in high-rise buildings have been raised by local fire professionals in terms of occupant evacuation, means of fire-fighting by fire fighters, sprinkler systems to automatically put out fires in buildings, etc. Fire risk is an important issue in building fire safety because it relates to the life safety of building occupants where they live and work in high-rise buildings in Hong Kong. The aim of this research is to identify the fire risk for different types of high-rise buildings in Hong Kong, to optimise fire prevention management for those high-rise buildings with a higher level of fire risk, to validate the model, and to study the conflict between the current fire safety building code and the current trend towards green building design. A survey using a 7-point scale questionnaire was conducted with 50 participants, and their responses were analysed with the SPSS statistical software. A number of statistical methods for testing for significant differences between samples were adopted in the analysis of the data received. When the statistical analysis was completed, the results were validated by two fire safety experts in this area of specialisation and also by a quantitative fire risk analysis.

  18. Optimally designing games for behavioural research.

    Science.gov (United States)

    Rafferty, Anna N; Zaharia, Matei; Griffiths, Thomas L

    2014-07-08

Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and in behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that, in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision.

  19. Designing learning experiences.

    Science.gov (United States)

    Collins, Jannette

    2007-01-01

    Creation of significant learning experiences follows basic steps of instructional design related to situational factors, goals and objectives, feedback and evaluation methods, teaching and learning activities, alignment of the preceding elements, and course evaluation. Goals should reflect what students will learn at the end of the course and what will still be with them several years later. Objectives should focus on learner performance, not teacher performance, and on behavior, not subject matter; there should be only one learning outcome per objective. Students learn more and retain their knowledge longer if they acquire it in an active rather than a passive manner. The situational factors, goals and objectives, feedback and evaluation, and teaching and learning activities should all reflect and support each other. The act of course evaluation closes the educational loop of design, implement, evaluate, and modify. (c) RSNA, 2007.

  20. Design optimization of solar cooker

    Energy Technology Data Exchange (ETDEWEB)

    Mirdha, U.S.; Dhariwal, S.R. [Department of Physics, Jai Narain Vyas University, Jodhpur 342 005 (India)

    2008-03-15

Various designs of solar cookers have been theoretically investigated with a view to optimizing their performance. Starting from a conventional box type cooker, various combinations of booster mirrors have been studied to arrive at a final design, aimed at providing a cooker which can be fixed on a south facing window (for countries of the northern hemisphere, mainly those situated near the tropic of Cancer). This cooker, with a rear window opening, may provide a higher cooking temperature for a fairly large part of the day. Two or three changes in the positions of the side booster mirrors, without moving the cooker as a whole, have been proposed. The new design has been experimentally implemented and compared with a conventional box type solar cooker. Besides the convenience of a rear window opening, the cooker provides temperatures sufficiently high to enable cooking two meals a day. (author)

  1. Identification of IL-1β and LPS as optimal activators of monolayer and alginate-encapsulated mesenchymal stromal cell immunomodulation using design of experiments and statistical methods.

    Science.gov (United States)

    Gray, Andrea; Maguire, Timothy; Schloss, Rene; Yarmush, Martin L

    2015-01-01

    Induction of therapeutic mesenchymal stromal cell (MSC) function is dependent upon activating factors present in diseased or injured tissue microenvironments. These functions include modulation of macrophage phenotype via secreted molecules including prostaglandin E2 (PGE2). Many approaches aim to optimize MSC-based therapies, including preconditioning using soluble factors and cell immobilization in biomaterials. However, optimization of MSC function is usually inefficient as only a few factors are manipulated in parallel. We utilized fractional factorial design of experiments to screen a panel of 6 molecules (lipopolysaccharide [LPS], polyinosinic-polycytidylic acid [poly(I:C)], interleukin [IL]-6, IL-1β, interferon [IFN]-β, and IFN-γ), individually and in combinations, for the upregulation of MSC PGE2 secretion and attenuation of macrophage secretion of tumor necrosis factor (TNF)-α, a pro-inflammatory molecule, by activated-MSC conditioned medium (CM). We used multivariable linear regression (MLR) and analysis of covariance to determine differences in functions of optimal factors on monolayer MSCs and alginate-encapsulated MSCs (eMSCs). The screen revealed that LPS and IL-1β potently activated monolayer MSCs to enhance PGE2 production and attenuate macrophage TNF-α. Activation by LPS and IL-1β together synergistically increased MSC PGE2, but did not synergistically reduce macrophage TNF-α. MLR and covariate analysis revealed that macrophage TNF-α was strongly dependent on the MSC activation factor, PGE2 level, and macrophage donor but not MSC culture format (monolayer versus encapsulated). The results demonstrate the feasibility and utility of using statistical approaches for higher throughput cell analysis. This approach can be extended to develop activation schemes to maximize MSC and MSC-biomaterial functions prior to transplantation to improve MSC therapies. © 2015 American Institute of Chemical Engineers.

  2. Circadian clocks are designed optimally

    CERN Document Server

    Hasegawa, Yoshihiko

    2014-01-01

    Circadian rhythms are acquired through evolution to increase the chances for survival by synchronizing to the daylight cycle. Reliable synchronization is realized through two trade-off properties: regularity to keep time precisely, and entrainability to synchronize the internal time with daylight. Since both properties have been tuned through natural selection, their adaptation can be formalized in the framework of mathematical optimization. By using a succinct model, we found that simultaneous optimization of regularity and entrainability entails inherent features of the circadian mechanism irrespective of model details. At the behavioral level we discovered the existence of a dead zone, a time during which light pulses neither advance nor delay the clock. At the molecular level we demonstrate the role-sharing of two light inputs, phase advance and delay, as is well observed in mammals. We also reproduce the results of phase-controlling experiments and predict molecular elements responsible for the clockwork...

  3. Real Life Experiences with Experience Design

    DEFF Research Database (Denmark)

    Dalsgård, Peter; Halskov, Kim

    2006-01-01

    Experience Design is an emergent field of study, and various approaches to the field abound. In this paper, we take a pragmatic approach to identifying key aspects of an experience design process, by reporting on a project involving the design of experience-oriented applications of interactive...... technologies for knowledge dissemination and marketing, in cooperation with public institutions and businesses. We argue that collaborative formulation of core design intentions and values is a valuable instrument in guiding experience design processes, and present three cases from this project, two of which...... the installations, the core values established to guide the design process and the intended use contexts. We argue that the installations present a broad spectrum of experience design installations that can assist designers in understanding the relations between core values, intentions, use context and interface...

  4. Optimization of confocal scanning laser ophthalmoscope design.

    Science.gov (United States)

    LaRocca, Francesco; Dhalla, Al-Hafeez; Kelly, Michael P; Farsiu, Sina; Izatt, Joseph A

    2013-07-01

    Confocal scanning laser ophthalmoscopy (cSLO) enables high-resolution and high-contrast imaging of the retina by employing spatial filtering for scattered light rejection. However, to obtain optimized image quality, one must design the cSLO around scanner technology limitations and minimize the effects of ocular aberrations and imaging artifacts. We describe a cSLO design methodology resulting in a simple, relatively inexpensive, and compact lens-based cSLO design optimized to balance resolution and throughput for a 20-deg field of view (FOV) with minimal imaging artifacts. We tested the imaging capabilities of our cSLO design with an experimental setup from which we obtained fast and high signal-to-noise ratio (SNR) retinal images. At lower FOVs, we were able to visualize parafoveal cone photoreceptors and nerve fiber bundles even without the use of adaptive optics. Through an experiment comparing our optimized cSLO design to a commercial cSLO system, we show that our design demonstrates a significant improvement in both image quality and resolution.

  5. Optimal Design of Stiffeners for Bucket Foundations

    DEFF Research Database (Denmark)

    Courtney, William Tucker; Stolpe, Mathias; Buhl, Thomas;

    2015-01-01

    The potential for structural optimization of the bucket foundation’s outer stiffeners is investigated using commercial optimization software. In order to obtain the optimal design both shape and topology optimization problems are formulated and solved using the structural optimization software...... Tosca Structure coupled with the finite element software Abaqus. The solutions to these optimization problems are then manually interpreted as a new design concept. Results show that shape optimization of the initial design can reduce stress concentrations by 38%. Additionally, topology optimization has...

  6. Optimization of a serum-free culture medium for mouse embryonic stem cells using design of experiments (DoE) methodology.

    Science.gov (United States)

    Knöspel, Fanny; Schindler, Rudolf K; Lübberstedt, Marc; Petzolt, Stephanie; Gerlach, Jörg C; Zeilinger, Katrin

    2010-12-01

The in vitro culture behaviour of embryonic stem cells (ESC) is strongly influenced by the culture conditions. Current culture media for expansion of ESC contain some undefined substances. Considering potential clinical translation work with such cells, the use of defined media is desirable. We have used Design of Experiments (DoE) methods to investigate the composition of a serum-free, chemically defined culture medium for expansion of mouse embryonic stem cells (mESC). Factor screening analysis according to Plackett-Burman revealed that insulin and leukaemia inhibitory factor (LIF) had a significant positive influence on the proliferation activity of the cells, while zinc and L-cysteine reduced cell growth. Further analysis using a minimum run resolution IV (MinRes IV) design indicated that, following factor adjustment, LIF becomes the main factor for the survival and proliferation of mESC. In conclusion, DoE screening assays are applicable for developing and refining culture media for stem cells and could also be employed to optimize culture media for human embryonic stem cells (hESC).
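
    A Plackett-Burman-type screening design of the kind used here is straightforward to generate; the sketch below builds an 8-run, 7-factor two-level design from a Sylvester Hadamard matrix and estimates main effects. The assignment of the screened medium components to columns, the three unused placeholder columns, and the response values are all illustrative assumptions.

```python
import numpy as np

# Build an 8-run, 7-factor two-level screening design from a Sylvester Hadamard matrix
# (equivalent to a Plackett-Burman design for up to 7 factors).
H2 = np.array([[1, 1], [1, -1]])
H8 = np.kron(np.kron(H2, H2), H2)   # 8 x 8 Hadamard matrix
design = H8[:, 1:]                  # drop the constant column -> 7 orthogonal +/-1 columns

factors = ["insulin", "LIF", "zinc", "L-cysteine", "extra 1", "extra 2", "extra 3"]
# Hypothetical proliferation responses for the 8 runs (for illustration only).
y = np.array([1.8, 0.9, 1.2, 1.6, 0.7, 1.9, 1.1, 1.4])

# Main effect of each factor = mean response at the +1 level minus mean response at the -1 level.
effects = {f: y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
           for j, f in enumerate(factors)}
for f, e in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{f:12s} effect = {e:+.2f}")
```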

  7. Sequential Design of Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Anderson-Cook, Christine Michaela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-30

A sequential design of experiments strategy is being developed and implemented that allows for adaptive learning based on incoming results as the experiment is being run. The plan is to incorporate these strategies into the NCCC and TCM experimental campaigns to be run in the coming months. This strategy for experimentation has the advantage of allowing new data collected during the experiment to inform future experimental runs based on their projected utility for a particular goal. For example, the current effort for the MEA capture system at NCCC plans to focus on maximally improving the quality of prediction of CO2 capture efficiency, as measured by the width of the confidence interval for the underlying response surface, which is modeled as a function of 1) flue gas flowrate [1000-3000] kg/hr; 2) CO2 weight fraction [0.125-0.175]; 3) lean solvent loading [0.1-0.3]; and 4) lean solvent flowrate [3000-12000] kg/hr.
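
    One common way to implement such a sequential strategy is D-optimal augmentation: given the runs already completed, choose the next run that most increases the determinant of the information matrix of the assumed response-surface model (and hence most shrinks the joint confidence region). The sketch below uses the four factors and ranges quoted above, but the starting design, the candidate grid, and the linear-plus-interaction model form are illustrative assumptions rather than the criterion actually used in the campaign.

```python
import numpy as np
from itertools import product

# Factor ranges quoted in the abstract, handled in coded units [-1, 1].
names = ["flue gas flowrate (kg/hr)", "CO2 weight fraction",
         "lean solvent loading", "lean solvent flowrate (kg/hr)"]
lows = np.array([1000.0, 0.125, 0.1, 3000.0])
highs = np.array([3000.0, 0.175, 0.3, 12000.0])

def model_matrix(X):
    """Assumed response-surface form: intercept, main effects, two-factor interactions."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

# Runs already completed: a 2^4 factorial in coded units (illustrative starting design).
X_run = np.array(list(product([-1.0, 1.0], repeat=4)))

# Candidate settings for the next run: a 3-level grid over the region.
candidates = np.array(list(product([-1.0, 0.0, 1.0], repeat=4)))

def log_det_information(X):
    F = model_matrix(X)
    sign, logdet = np.linalg.slogdet(F.T @ F)
    return logdet if sign > 0 else -np.inf

# D-optimal augmentation: add the candidate that maximizes det(F'F).
scores = [log_det_information(np.vstack([X_run, c[None, :]])) for c in candidates]
best = candidates[int(np.argmax(scores))]
print("next run (coded):", best)
print("next run (engineering units):",
      dict(zip(names, np.round(lows + (best + 1) / 2 * (highs - lows), 3))))
```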

  8. Simultaneous optimal experimental design for in vitro binding parameter estimation.

    Science.gov (United States)

    Ernest, C Steven; Karlsson, Mats O; Hooker, Andrew C

    2013-10-01

The aim was the simultaneous optimization of in vitro ligand binding studies using an optimal design software package that can incorporate multiple design variables through non-linear mixed effect models and provide a generally optimized design regardless of the binding site capacity and relative binding rates for a two-binding-site system. Experimental design optimization was employed with D- and ED-optimality using PopED 2.8, including factors commonly encountered during experimentation (residual error, between-experiment variability and non-specific binding), for in vitro ligand binding experiments: association, dissociation, equilibrium and non-specific binding experiments. Moreover, a method for optimizing several design parameters (ligand concentrations, measurement times and total number of samples) was examined. With changes in relative binding site density and relative binding rates, different measurement times and ligand concentrations were needed to provide precise estimation of the binding parameters. However, using optimized design variables, significant reductions in the number of samples provided as good or better precision of the parameter estimates compared to the original extensive sampling design. Employing ED-optimality led to a general experimental design regardless of the relative binding site density and relative binding rates. Precision of the parameter estimates was as good as with the extensive sampling design for most parameters and better for the poorly estimated parameters. Optimized designs for in vitro ligand binding studies provided robust parameter estimation while allowing more efficient and cost-effective experimentation by reducing the measurement times and separate ligand concentrations required and, in some cases, the total number of samples.
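
    To show what a D-optimality criterion looks like in this context, the sketch below picks the pair of ligand concentrations that maximizes the determinant of the Fisher information for a simple one-site equilibrium binding model evaluated at nominal parameter values. This is a deliberately reduced, locally optimal version of the problem; the paper's two-binding-site, mixed-effects formulation in PopED is considerably richer.

```python
import numpy as np
from itertools import combinations

# One-site equilibrium binding model (an assumed simplification): B = Bmax*L/(Kd + L).
Bmax, Kd = 100.0, 5.0    # nominal parameter values for a locally optimal design

def jacobian_row(L):
    """Derivatives of B with respect to (Bmax, Kd) at ligand concentration L."""
    dB_dBmax = L / (Kd + L)
    dB_dKd = -Bmax * L / (Kd + L) ** 2
    return np.array([dB_dBmax, dB_dKd])

candidate_conc = np.linspace(0.5, 50.0, 100)   # candidate ligand concentrations

def d_criterion(concs):
    J = np.array([jacobian_row(L) for L in concs])
    return np.linalg.det(J.T @ J)              # Fisher information determinant (unit error variance)

best = max(combinations(candidate_conc, 2), key=d_criterion)
print("locally D-optimal concentration pair: %.1f and %.1f" % best)
```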

  9. Design of microfluidic bioreactors using topology optimization

    DEFF Research Database (Denmark)

    Okkels, Fridolin; Bruus, Henrik

    2007-01-01

We address the design of optimal reactors for supporting biological cultures using the method of topology optimization. For some years this method has been used to design various optimal microfluidic devices.1-4 We apply this method to optimally distribute biological cultures within a flow of nutr...

  10. Multidisciplinary Design Optimization on Conceptual Design of Aero-engine

    Science.gov (United States)

    Zhang, Xiao-bo; Wang, Zhan-xue; Zhou, Li; Liu, Zeng-wen

    2016-06-01

In order to obtain better integrated performance of an aero-engine during the conceptual design stage, multiple disciplines such as aerodynamics, structure, weight, and aircraft mission are required. Unfortunately, the couplings between these disciplines make the problem difficult to model or solve by conventional methods. MDO (Multidisciplinary Design Optimization) methodology, which deals well with couplings between disciplines, is adopted to solve this coupled problem. Approximation, optimization, coordination, and modeling methods for the MDO framework are analyzed in depth. To obtain a more efficient MDO framework, an improved CSSO (Concurrent Subspace Optimization) strategy based on DOE (Design Of Experiments) and RSM (Response Surface Model) methods is proposed in this paper, and an improved DE (Differential Evolution) algorithm is recommended to solve the system-level and discipline-level optimization problems in the MDO framework. The improved CSSO strategy and DE algorithm are evaluated on a numerical test problem. The results show that the efficiency of the improved methods proposed in this paper is significantly increased. The coupled problem of VCE (Variable Cycle Engine) conceptual design is solved using the improved CSSO strategy, and the design parameters given by the improved CSSO strategy are better than the original ones. The integrated performance of the VCE is significantly improved.
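
    Differential evolution of the kind recommended here is available off the shelf; the sketch below runs SciPy's implementation on a placeholder objective standing in for the system-level problem. The three design variables, their bounds, and the objective function are illustrative assumptions and have nothing to do with the actual VCE cycle model.

```python
import numpy as np
from scipy.optimize import differential_evolution

def system_objective(x):
    """Placeholder for the coupled system-level objective (e.g. negative integrated performance)."""
    bypass_ratio, pressure_ratio, throttle = x
    return ((bypass_ratio - 0.9) ** 2
            + 0.5 * (pressure_ratio - 25.0) ** 2 / 25.0
            + (throttle - 0.8) ** 2
            + 0.1 * np.sin(5 * bypass_ratio))   # mild non-convexity to exercise the global search

bounds = [(0.2, 1.5),    # bypass ratio (placeholder range)
          (15.0, 35.0),  # overall pressure ratio (placeholder range)
          (0.5, 1.0)]    # throttle setting (placeholder range)

result = differential_evolution(system_objective, bounds, seed=0, maxiter=200, tol=1e-8)
print("best design variables:", result.x)
print("objective:", result.fun)
```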

  11. Optimal designs for the Michaelis Menten model with correlated observations

    OpenAIRE

    Dette, Holger; Kunert, Joachim

    2012-01-01

In this paper we investigate the problem of designing experiments for weighted least squares analysis in the Michaelis-Menten model. We study the structure of exact D-optimal designs in a model with an autoregressive error structure. Explicit results for locally D-optimal designs are derived for the case where two observations can be taken per subject. Additionally, standardized maximin D-optimal designs are obtained in this case. The results illustrate the enormous difficulties to find e...
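
    For orientation, the sketch below computes a locally D-optimal two-point design for the Michaelis-Menten model numerically, under the simpler textbook assumption of uncorrelated, homoscedastic errors rather than the autoregressive structure studied in the paper. The nominal parameter values and the design region are assumptions.

```python
import numpy as np

# Michaelis-Menten model v = Vmax*x/(Km + x); nominal values for a locally D-optimal design.
Vmax, Km, x_max = 1.0, 2.0, 10.0

def info_matrix(x_points):
    """Fisher information for (Vmax, Km) under uncorrelated, unit-variance errors."""
    x = np.asarray(x_points, dtype=float)
    J = np.column_stack([x / (Km + x), -Vmax * x / (Km + x) ** 2])
    return J.T @ J

# Grid search over all two-point designs on (0, x_max].
grid = np.linspace(0.1, x_max, 200)
best, best_det = None, -np.inf
for i, x1 in enumerate(grid):
    for x2 in grid[i + 1:]:
        d = np.linalg.det(info_matrix([x1, x2]))
        if d > best_det:
            best, best_det = (x1, x2), d
print("locally D-optimal support points: %.2f, %.2f" % best)
# For the uncorrelated-error case one support point lands at x_max and the other
# at the interior point Km*x_max/(2*Km + x_max).
```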

  12. LOCATORS OPTIMIZATION FOR MEASURING FIXTURE DESIGN

    Institute of Scientific and Technical Information of China (English)

    Wang Jian; Zhou Jiangqi; Lin Zhongqin

    2004-01-01

    "N-2-1" principle is widely recognized in the fixture design for deformable sheet metal workpieces, where N, the locators on primary datum, is the key to sheet metal fixture design. However, little research is done on how to determine the positions and the number of N locators. In practice, the N locators are frequently designed from experience, which is often unsatisfactory for achieving the precision requirement in fixture design. A new method to lay out the N locators for measuring fixture of deformable sheet metal workpiece is presented, given the fixed number of N. Finite-element method is used to model and analysis the deformation of different locator layouts. A knowledge based genetic algorithm (KBGA) is applied to identify the optimum locator layout for measuring fixture design. An example of a door outer is used to verify the optimization approach.

  13. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software...... that the sources of variation of the simulation model can be divided in two components corresponding to changes in the environmental factors (the uncontrollable factor settings) and to random variation. Moreover, the structure of the environmental effects can be estimated, which can be used to put the system...... in a more robust operating mode. The interpolation technique called Kriging is the topic of Paper F, which is a widely applied technique for building so called models-for-the-model (metamodels). We propose a method that handles both qualitative and quantitative factors, which is not covered by the standard...

  14. Design -|+ Negative emotions for positive experiences

    NARCIS (Netherlands)

    Fokkinga, S.F.

    2015-01-01

    Experience-driven design considers all aspects of a product – its appearance, cultural meaning, functionality, interaction, usability, technology, and indirect consequences of use – with the aim to optimize and orchestrate all these aspects and create the best possible user experience. Since the lat

  15. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    Science.gov (United States)

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase; small organic acids and nitrogen-containing compounds are of particular interest. The efficient derivatization reagent methyl chloroformate was used to make possible the analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes comprising small organic acids, pyrazines, phenol, and cyclic ketones were quantified. An additional three analytes were pseudoquantified with the use of standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R2 > 0.991. Method validation was evaluated through repeatability, and spike recoveries of all 29 analytes were obtained. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the steady-state condition of the flow reactor was reached and showed the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process.

  16. Anaerobic co-digestion of waste activated sludge and greasy sludge from flotation process: Batch versus CSTR experiments to investigate optimal design

    OpenAIRE

    Girault, R.; Bridoux, G.; Nauleau, F.; Poullain, C.; Buffet, J.; Peu, P.; Sadowski, A.G.; Béline, F.

    2012-01-01

In this study, the maximum ratio of greasy sludge to incorporate with waste activated sludge was investigated in batch and CSTR experiments. In batch experiments, inhibition occurred with a greasy sludge ratio of more than 20-30% of the feed COD. In CSTR experiments, the optimal greasy sludge ratio was 60% of the feed COD and inhibition occurred above a ratio of 80%. Hence, batch experiments can predict the CSTR yield when the degradation phenomena are additive but cannot be used to determi...

  17. User Experience Design in Webpage Design

    Institute of Scientific and Technical Information of China (English)

    许钰伟

    2014-01-01

User experience design is becoming more and more important. It originated with networked and screen-based products that interact with users, and its scope is very wide. Although research orientations differ, the objective is the same: to let users enjoy the simplest mode of use and to reduce the thinking required during use. Although there is uncertainty, for a specific user group the similarity of user experience can be captured by excellent design. Although user experience design is still at an initial stage, with the rising demand for comfort when using the internet, user experience design will be at the core of future website design.

  18. Optimal design of capacitor-driven coilgun

    Science.gov (United States)

    Kim, Seog-Whan; Jung, Hyun-Kyo; Hahn, Song-Yop

    1994-03-01

This paper presents an analysis and optimal design of a capacitor-driven inductive coilgun. An equivalent circuit is used for launch simulation of the coilgun. The circuit equations are solved together with the equation of motion of the projectile by using the Runge-Kutta method. The numerical results are compared with experimental values to verify the usefulness of the developed simulation program, and the numerical and experimental results are shown to be in good agreement. In the design of the system, the optimization is achieved by employing a genetic algorithm. The resulting specifications of the coilgun optimally designed by the proposed algorithm are tested by experiment. Finally, the obtained results are compared with those of designs based on approximate equations and on linear search methods. It is found that the proposed algorithm gives a better result for the energy efficiency of the system, namely it enables one to obtain a higher muzzle velocity of the projectile with the same amount of energy.
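
    The launch simulation described above couples a discharge-circuit model to the projectile's equation of motion and integrates both numerically. A heavily simplified single-stage sketch of that idea is given below: an RLC discharge, an assumed Gaussian mutual-inductance profile, and the force F = ½·i²·dM/dx. All parameter values and the inductance profile are illustrative assumptions, not the paper's equivalent-circuit model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative single-stage parameters (assumed, not taken from the paper).
C, V0 = 2.0e-3, 400.0       # capacitor bank capacitance (F) and initial voltage (V)
R, L_coil = 0.05, 1.0e-4    # circuit resistance (ohm) and coil inductance (H), taken as constant
m = 0.05                    # projectile mass (kg)
M0, w = 5.0e-5, 0.03        # peak armature-coil mutual inductance (H) and coupling length (m)

def dMdx(x):
    """Gradient of an assumed Gaussian mutual-inductance profile M(x) = M0*exp(-(x/w)^2)."""
    return M0 * np.exp(-(x / w) ** 2) * (-2.0 * x / w ** 2)

def rhs(t, s):
    Vc, i, x, v = s
    di = (Vc - R * i - i * v * dMdx(x)) / L_coil   # circuit equation with a simplified motional EMF term
    dv = 0.5 * i ** 2 * dMdx(x) / m                # electromagnetic force on the projectile
    return [-i / C, di, v, dv]

# Projectile starts 4 cm behind the coil centre, at rest, with a charged capacitor.
sol = solve_ivp(rhs, (0.0, 5e-3), [V0, 0.0, -0.04, 0.0], max_step=1e-6, rtol=1e-8)
print("muzzle velocity estimate: %.1f m/s" % sol.y[3, -1])
```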

  19. Design Experiments in Educational Research.

    Science.gov (United States)

    Cobb, Paul; Confrey, Jere; diSessa, Andrea; Lehrer, Richard; Schauble, Leona

    2003-01-01

    Indicates the range of purposes and variety of settings in which design experiments have been conducted, delineating five crosscutting features that collectively differentiate design experiments from other methodologies. Clarifies what is involved in preparing for and carrying out a design experiment and in conducting a retrospective analysis of…

  20. Optimization of biogas production from Sargassum sp. using a design of experiments to assess the co-digestion with glycerol and waste frying oil.

    Science.gov (United States)

    Oliveira, J V; Alves, M M; Costa, J C

    2015-01-01

A design of experiments was adopted to assess the optimal conditions for methane production from the macroalga Sargassum sp. co-digested with glycerol (Gly) and waste frying oil (WFO). Three variables were tested: the total solids content of the algae (%TS Sargassum sp.), the co-substrate concentration (g Gly/WFO L-1), and the co-substrate type (Gly or WFO). The biochemical methane potential (BMP) of Sargassum sp. was 181 ± 1 L CH4 kg-1 COD. Co-digestion with Gly and WFO increased the BMP by 56% and 46%, respectively. The methane production rate (k) showed similar behaviour to the BMP, increasing by 38% and 19% with Gly and WFO, respectively. The highest BMP (283 ± 18 L CH4 kg-1 COD) and k (65.9 ± 2.1 L CH4 kg-1 COD d-1) were obtained in the assay with 0.5% TS and 3.0 g Gly L-1. Co-digestion with glycerol or WFO is a promising process to enhance the BMP of the macroalga Sargassum sp.

  1. Aluminium-12wt% silicon coating prepared by thermal spraying technique: Part 1 optimization of spray condition based on a design of experiment

    Directory of Open Access Journals (Sweden)

    Jiansirisomboon, S.

    2006-03-01

At present, thermal spray technology is used to maintain parts of various machines in many industries. This technology can be used to improve surface wear resistance and can therefore significantly reduce manufacturing costs. The Al-12wt%Si alloy is an interesting and popular material used in the automotive industry. This research studies the suitable conditions for spraying Al-12wt%Si powder. The powder was sprayed by a flame spray technique onto low carbon steel substrates. The suitable spraying conditions were determined following a design of experiments (DOE) principle, which provided statistical data defined at 90% confidence. The control factors were oxygen flow rate, acetylene flow rate and spray distance. Each factor was set at three levels (low, medium and high) in order to determine the responses, which were hardness, thickness, wear rate and percentage volume fraction of porosity. It was found that the optimized condition for spraying Al-12wt%Si powder consisted of an oxygen flow rate of 38 ft3/hr (1.026 m3/hr), an acetylene flow rate of 27 ft3/hr (0.729 m3/hr) and a spray distance of 58 mm.

  2. Optimizing Leisure Experience After 40

    Directory of Open Access Journals (Sweden)

    Kleiber, Douglas A.

    2012-04-01

Aging is a natural process that occurs across the lifespan, but to suggest that some people age more successfully than others is to invoke some criteria of living well in later life and then to consider the factors that may contribute. This article puts leisure front and center in the aging process, especially after midlife, and identifies experiential aspects of leisure that may be most influential. But it departs from some of the standard models of successful aging in recognizing opportunities and possibilities even for those with disabling conditions, in reaching back earlier in adulthood for critical incidents that may prove influential, and in considering disengagement as well as engagement as processes and experiences for optimizing leisure and thus aging. Relying more on models of adaptation and selectivity from developmental psychologists Paul and Margaret Baltes and Leah Carstenson, social and civic engagement are considered as well.

Aging is a natural process that occurs throughout life; nevertheless, to suggest that some people age more successfully than others is to appeal to certain criteria for living well in later life and, in turn, to consider the factors that contribute to it. This article places leisure front and centre in the aging process, especially after midlife, identifying those aspects of the leisure experience that may be most influential. In this sense, it starts from the standards provided by models of successful aging, recognizing their opportunities and possibilities (even for those who enter this process at a disadvantage), examining the critical experiences that may have been most influential during the immediately preceding adult period, and considering engagement, or the lack of it, as processes and experiences that optimize leisure and, by extension, aging. Likewise, attention is given to participation...

  3. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    Science.gov (United States)

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The Quality by Design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using the quality risk management (QRM) tool failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, Span 80 and Tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as a CQA was described in the design space using a one-factor response surface design of experiments. The results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP on MDDC particle size and particle size distribution and their subsequent optimization.

  4. Optimizing Chromatographic Separation: An Experiment Using an HPLC Simulator

    Science.gov (United States)

    Shalliker, R. A.; Kayillo, S.; Dennis, G. R.

    2008-01-01

    Optimization of a chromatographic separation within the time constraints of a laboratory session is practically impossible. However, by employing a HPLC simulator, experiments can be designed that allow students to develop an appreciation of the complexities involved in optimization procedures. In the present exercise, a HPLC simulator from "JCE…

  6. Design optimization method for Francis turbine

    Science.gov (United States)

    Kawajiri, H.; Enomoto, Y.; Kurosawa, S.

    2014-03-01

This paper presents a design optimization system coupled with CFD. The optimization algorithm of the system employs particle swarm optimization (PSO). The blade shape design is described by a NURBS curve defined by a series of control points. The system was applied to designing the stationary vanes and the runner of a higher-specific-speed Francis turbine. As a first step, single-objective optimization was performed on the stay vane profile; the second step was multi-objective optimization of the runner over a wide operating range. As a result, it was confirmed that the design system is useful for the development of hydro turbines.

  7. Touchstone: Exploratory Design of Experiments

    OpenAIRE

    2007-01-01

Touchstone is an open-source experiment design platform designed to help establish a solid research foundation for HCI in the area of novel interaction techniques. Touchstone includes a design platform for exploring alternative designs of controlled laboratory experiments, a run platform for running subjects and a limited analysis platform for advice and access to on-line statistics packages. Designed for HCI researchers and their students, Touchstone facilitates the p...

  8. The construction of optimal stated choice experiments theory and methods

    CERN Document Server

    Street, Deborah J

    2007-01-01

    The most comprehensive and applied discussion of stated choice experiment constructions available The Construction of Optimal Stated Choice Experiments provides an accessible introduction to the construction methods needed to create the best possible designs for use in modeling decision-making. Many aspects of the design of a generic stated choice experiment are independent of its area of application, and until now there has been no single book describing these constructions. This book begins with a brief description of the various areas where stated choice experiments are applicable, including marketing and health economics, transportation, environmental resource economics, and public welfare analysis. The authors focus on recent research results on the construction of optimal and near-optimal choice experiments and conclude with guidelines and insight on how to properly implement these results. Features of the book include: Construction of generic stated choice experiments for the estimation of main effects...

  9. Design Optimization of Internal Flow Devices

    DEFF Research Database (Denmark)

    Madsen, Jens Ingemann

The power of computational fluid dynamics is boosted through the use of automated design optimization methodologies. The thesis considers both derivative-based search optimization and the use of response surface methodologies.

  10. Reliability based design optimization: Formulations and methodologies

    Science.gov (United States)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed

  11. Integrated multidisciplinary design optimization of rotorcraft

    Science.gov (United States)

    Adelman, Howard M.; Mantay, Wayne R.

    1989-01-01

    The NASA/Army research plan for developing the logic elements for helicopter rotor design optimization by integrating appropriate disciplines and accounting for important interactions among the disciplines is discussed. The paper describes the optimization formulation in terms of the objective function, design variables, and constraints. The analysis aspects are discussed, and an initial effort at defining the interdisciplinary coupling is summarized. Results are presented on the achievements made in the rotor aerodynamic performance optimization for minimum hover horsepower, rotor dynamic optimization for vibration reduction, rotor structural optimization for minimum weight, and integrated aerodynamic load/dynamics optimization for minimum vibration and weight.

  12. Spatial experiences and interaction design

    DEFF Research Database (Denmark)

    Dalsgård, Peter

    2006-01-01

IT is rapidly spreading to non-desktop environments, and is increasingly being used for post-functional purposes. Recent contributions within the field of interaction design have indicated a tight coupling between physico-spatial and experiential issues, both on a technological and on a theoretical...... level. However, interaction design and HCI as yet have little to offer designers working with physico-spatial and experiential issues in practical design cases. In this paper, I argue that experiments that explore spatial and experiential aspects are crucial in developing the practice of interaction design....... These aspects may be brought to the forefront by engaging in, reflecting upon, and reporting from physico-spatial design experiments, and by making spatial and experience-oriented design representations part of the design process. These experiments may be supported by design representations inspired...

  13. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  14. Gradient-based stochastic optimization methods in Bayesian experimental design

    OpenAIRE

    2012-01-01

    Optimal experimental design (OED) seeks experiments expected to yield the most useful data for some purpose. In practical circumstances where experiments are time-consuming or resource-intensive, OED can yield enormous savings. We pursue OED for nonlinear systems from a Bayesian perspective, with the goal of choosing experiments that are optimal for parameter inference. Our objective in this context is the expected information gain in model parameters, which in general can only be estimated u...

  15. Anaerobic co-digestion of waste activated sludge and greasy sludge from flotation process: batch versus CSTR experiments to investigate optimal design.

    Science.gov (United States)

    Girault, R; Bridoux, G; Nauleau, F; Poullain, C; Buffet, J; Peu, P; Sadowski, A G; Béline, F

    2012-02-01

    In this study, the maximum ratio of greasy sludge to incorporate with waste activated sludge was investigated in batch and CSTR experiments. In batch experiments, inhibition occurred with a greasy sludge ratio of more than 20-30% of the feed COD. In CSTR experiments, the optimal greasy sludge ratio was 60% of the feed COD and inhibition occurred above a ratio of 80%. Hence, batch experiments can predict the CSTR yield when the degradation phenomena are additive but cannot be used to determine the maximum ratio to be used in a CSTR configuration. Additionally, when the ratio of greasy sludge increased from 0% to 60% of the feed COD, CSTR methane production increased by more than 60%. When the greasy sludge ratio increased from 60% to 90% of the feed COD, the reactor yield decreased by 75%.

  16. Network inference via adaptive optimal design

    Directory of Open Access Journals (Sweden)

    Stigter Johannes D

    2012-09-01

    Full Text Available Abstract Background Current research in network reverse engineering for genetic or metabolic networks very often does not include a proper experimental and/or input design. In this paper we address this issue in more detail and suggest a method that includes an iterative design of experiments based on the most recent data that become available. The presented approach allows a reliable reconstruction of the network and addresses an important issue, i.e., the analysis and the propagation of uncertainties as they exist in both the data and in our own knowledge. These two types of uncertainties have their immediate ramifications for the uncertainties in the parameter estimates and, hence, are taken into account from the very beginning of our experimental design. Findings The method is demonstrated for two small networks that include a genetic network for mRNA synthesis and degradation and an oscillatory network describing a molecular network underlying adenosine 3’-5’ cyclic monophosphate (cAMP), as observed in populations of Dictyostelium cells. In both cases a substantial reduction in parameter uncertainty was observed. Extension to larger scale networks is possible but needs a more rigorous parameter estimation algorithm that includes sparsity as a constraint in the optimization procedure. Conclusion We conclude that a careful experiment design very often (but not always) pays off in terms of reliability in the inferred network topology. For large scale networks a better parameter estimation algorithm is required that includes sparsity as an additional constraint. These algorithms are available in the literature and can also be used in an adaptive optimal design setting as demonstrated in this paper.
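
    The record above describes adaptive (iterative) experiment design driven by parameter uncertainty. As a minimal, generic sketch of the underlying idea (not the authors' algorithm), the Python snippet below greedily picks, from a pool of candidate measurements with assumed sensitivity vectors, the one that most increases the log-determinant of the accumulated Fisher information, a D-optimality criterion; the sensitivity matrix, unit-noise model, and problem sizes are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_params = 4            # unknown kinetic parameters (illustrative)
n_candidates = 50       # candidate experiments/measurements
# Each row: sensitivity of one candidate measurement w.r.t. the parameters.
S = rng.normal(size=(n_candidates, n_params))

def logdet_fim(rows, prior=1e-6):
    """log det of the Fisher information built from the chosen rows
    (unit noise assumed); a small prior keeps the matrix invertible."""
    A = prior * np.eye(n_params)
    for i in rows:
        s = S[i][:, None]
        A += s @ s.T
    return np.linalg.slogdet(A)[1]

chosen = []
for _ in range(6):                       # pick 6 experiments sequentially
    remaining = [i for i in range(n_candidates) if i not in chosen]
    best = max(remaining, key=lambda i: logdet_fim(chosen + [i]))
    chosen.append(best)
    print(f"picked candidate {best:2d}, log det FIM = {logdet_fim(chosen):.3f}")
```

    In an adaptive setting, the sensitivity rows would be recomputed from the updated parameter estimates after each new measurement.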

  17. Optimal Foraging by Birds: Experiments for Secondary & Postsecondary Students

    Science.gov (United States)

    Pecor, Keith W.; Lake, Ellen C.; Wund, Matthew A.

    2015-01-01

    Optimal foraging theory attempts to explain the foraging patterns observed in animals, including their choice of particular food items and foraging locations. We describe three experiments designed to test hypotheses about food choice and foraging habitat preference using bird feeders. These experiments can be used alone or in combination and can…

  18. Topology Optimization for Architected Materials Design

    Science.gov (United States)

    Osanov, Mikhail; Guest, James K.

    2016-07-01

    Advanced manufacturing processes provide a tremendous opportunity to fabricate materials with precisely defined architectures. To fully leverage these capabilities, however, materials architectures must be optimally designed according to the target application, base material used, and specifics of the fabrication process. Computational topology optimization offers a systematic, mathematically driven framework for navigating this new design challenge. The design problem is posed and solved formally as an optimization problem with unit cell and upscaling mechanics embedded within this formulation. This article briefly reviews the key requirements to apply topology optimization to materials architecture design and discusses several fundamental findings related to optimization of elastic, thermal, and fluidic properties in periodic materials. Emerging areas related to topology optimization for manufacturability and manufacturing variations, nonlinear mechanics, and multiscale design are also discussed.

  19. Design of an Optimal Biorefinery

    DEFF Research Database (Denmark)

    Nawaz, Muhammad; Zondervan, Edwin; Woodley, John

    2011-01-01

    In this paper we propose a biorefinery optimization model that can be used to find the optimal processing route for the production of ethanol, butanol, succinic acid and blends of these chemicals with fossil fuel based gasoline. The approach unites transshipment models with a superstructure...

  1. Acoustic design by topology optimization

    DEFF Research Database (Denmark)

    Dühring, Maria Bayard; Jensen, Jakob Søndergaard; Sigmund, Ole

    2008-01-01

    To bring down noise levels in human surroundings is an important issue and a method to reduce noise by means of topology optimization is presented here. The acoustic field is modeled by Helmholtz equation and the topology optimization method is based on continuous material interpolation functions...

  2. Design optimization of a torpedo shell structure

    Institute of Scientific and Technical Information of China (English)

    YU De-hai; SONG Bao-wei; LI Jia-wang; YANG Shi-xing

    2008-01-01

    An optimized methodology to design a more robust torpedo shell is proposed. The method has taken into account reliability requirements and controllable and uncontrollable factors such as geometry, load, material properties, manufacturing processes, installation, etc. as well as human and environmental factors. The result is a more realistic shell design. Our reliability optimization design model was developed based on sensitivity analysis. Details of the design model are given in this paper. An example of a torpedo shell design based on this model is given and demonstrates that the method produces designs that are more effective and reliable than traditional torpedo shell designs. This method can be used for other torpedo system designs.

  3. Interactive Reliability-Based Optimal Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle; Siemaszko, A.

    1994-01-01

    Interactive design/optimization of large, complex structural systems is considered. The objective function is assumed to model the expected costs. The constraints are reliability-based and/or related to deterministic code requirements. Solution of this optimization problem is divided in four main...... be used in interactive optimization....

  4. Optimal design of funded pension schemes

    NARCIS (Netherlands)

    Bovenberg, A.L.; Mehlkopf, R.J.

    2014-01-01

    This article reviews the literature on the optimal design and regulation of funded pension schemes. We first characterize optimal saving and investment over an individual’s life cycle. Within a stylized modeling framework, we explore optimal individual saving and investing behavior. Subsequently, va

  5. Optimal experimental design strategies for detecting hormesis.

    Science.gov (United States)

    Dette, Holger; Pepelyshev, Andrey; Wong, Weng Kee

    2011-12-01

    Hormesis is a widely observed phenomenon in many branches of life sciences, ranging from toxicology studies to agronomy, with obvious public health and risk assessment implications. We address optimal experimental design strategies for determining the presence of hormesis in a controlled environment using the recently proposed Hunt-Bowman model. We propose alternative models that have an implicit hormetic threshold, discuss their advantages over current models, and construct and study properties of optimal designs for (i) estimating model parameters, (ii) estimating the threshold dose, and (iii) testing for the presence of hormesis. We also determine maximin optimal designs that maximize the minimum of the design efficiencies when we have multiple design criteria or there is model uncertainty where we have a few plausible models of interest. We apply these optimal design strategies to a teratology study and show that the proposed designs outperform the implemented design by a wide margin for many situations.

  6. Integrated multidisciplinary design optimization of rotorcraft

    Science.gov (United States)

    Adelman, Howard M.; Mantay, Wayne R.

    1989-01-01

    The NASA/Army research plan for developing the logic elements for helicopter rotor design optimization by integrating appropriate disciplines and accounting for important interactions among the disciplines is discussed. The optimization formulation is described in terms of the objective function, design variables, and constraints. The analysis aspects are discussed, and an initial effort at defining the interdisciplinary coupling is summarized. Results are presented on the achievements made in the rotor dynamic optimization for vibration reduction, rotor structural optimization for minimum weight, and integrated aerodynamic load/dynamics optimization for minimum vibration and weight.

  7. Universally optimal crossover designs under subject dropout

    OpenAIRE

    Zheng, Wei

    2013-01-01

    Subject dropout is very common in practical applications of crossover designs. However, there is very limited design literature taking this into account. Optimality results have not yet been well established due to the complexity of the problem. This paper establishes feasible, as well as necessary and sufficient conditions for a crossover design to be universally optimal in approximate design theory in the presence of subject dropout. These conditions are essentially linear equations with re...

  8. Purification optimization for a recombinant single-chain variable fragment against type 1 insulin-like growth factor receptor (IGF-1R) by using design of experiment (DoE).

    Science.gov (United States)

    Song, Yong-Hong; Sun, Xue-Wen; Jiang, Bo; Liu, Ji-En; Su, Xian-Hui

    2015-12-01

    Design of experiment (DoE) is a statistics-based technique for experimental design that could overcome the shortcomings of the traditional one-factor-at-a-time (OFAT) approach for protein purification optimization. In this study, a DoE approach was applied for optimizing purification of a recombinant single-chain variable fragment (scFv) against type 1 insulin-like growth factor receptor (IGF-1R) expressed in Escherichia coli. In the first capture step using Capto L, a 2-level fractional factorial analysis and successively a central composite circumscribed (CCC) design were used to identify the optimal elution conditions. Two main effects, pH and trehalose, were identified, and high recovery (above 95%) and low aggregates ratio (below 10%) were achieved at the pH range from 2.9 to 3.0 with 32-35% (w/v) trehalose added. In the second step using cation exchange chromatography, an initial screening of media and elution pH and a following CCC design were performed, whereby the optimal selectivity of the scFv was obtained on Capto S at pH near 6.0, and the optimal conditions for fulfilling high DBC and purity were identified as a pH range of 5.9-6.1 and a loading conductivity range of 5-12.5 mS/cm. Upon a further gel filtration, the final purified scFv with a purity of 98% was obtained. Finally, the optimized conditions were verified by a 20-fold scale-up experiment. The purities and yields of intermediate and final products all fell within the regions predicted by the DoE approach, suggesting the robustness of the optimized conditions. We proposed that the DoE approach described here is also applicable in production of other recombinant antibody constructs.
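
    The purification study above relies on a central composite circumscribed (CCC) design followed by response-surface modeling. The sketch below is a hedged illustration rather than the published workflow: it builds a two-factor CCC design in coded units and fits a full quadratic surface by least squares, and the response values are synthetic placeholders, not the reported recovery data.

```python
import numpy as np

# Two-factor central composite circumscribed design in coded units:
# 4 factorial points, 4 axial points at +/- alpha, plus center replicates.
alpha = np.sqrt(2.0)                      # rotatable CCC for 2 factors
factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
center = [(0, 0)] * 3
X = np.array(factorial + axial + center, dtype=float)

# Synthetic response (placeholder for a measured recovery, %).
rng = np.random.default_rng(1)
y = 90 - 3*X[:, 0]**2 - 2*X[:, 1]**2 + 1.5*X[:, 0]*X[:, 1] \
    + rng.normal(0, 0.5, len(X))

# Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]*X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print("fitted coefficients (b0, b1, b2, b12, b11, b22):")
print(np.round(coef, 3))
```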

  9. Optimality of a Fully Stressed Design

    Science.gov (United States)

    Patnaik, Surya N.; Hopkins, Dale A.

    1998-01-01

    For a truss, a fully stressed state is reached when all its members are utilized to their full strength capacity. Historically, engineers considered such a design optimum. But recently this optimality has been questioned, especially since the weight of the structure is not explicitly used in fully stressed design calculations. This paper examines the optimality of the fully stressed design (FSD) with analytical and graphical illustrations. Solutions for a set of examples obtained by using the FSD method and optimization methods numerically confirm the optimality of the FSD. The FSD, which can be obtained with a small amount of calculation, can be extended to displacement constraints and to nontruss-type structures.
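
    As a minimal illustration of the fully stressed design rule discussed above, the sketch below applies the stress-ratio resizing update A_new = A_old * |sigma| / sigma_allow to a statically determinate two-bar example, where member forces do not depend on the areas; the forces, allowable stress, and starting areas are illustrative assumptions, and an indeterminate structure would require re-solving for member forces at every iteration.

```python
import numpy as np

# Statically determinate two-bar truss: member forces are fixed by equilibrium
# and do not change when the areas change (illustrative numbers).
forces = np.array([50e3, -30e3])       # N (tension positive)
sigma_allow = 150e6                    # Pa, allowable stress
areas = np.array([1e-4, 1e-4])         # m^2, initial guess

for it in range(5):
    stresses = forces / areas
    areas = areas * np.abs(stresses) / sigma_allow   # stress-ratio resizing
    ratio = np.max(np.abs(forces / areas) / sigma_allow)
    print(f"iter {it}: areas = {areas}, max |stress ratio| = {ratio:.3f}")
# For a determinate truss this converges in one step: every member ends up fully stressed.
```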

  10. Optimal design of multi-conditions for axial flow pump

    Science.gov (United States)

    Shi, L. J.; Tang, F. P.; Liu, C.; Xie, R. S.; Zhang, W. P.

    2016-11-01

    The passage components of an axial-flow pump operate in an unfavorable flow state when the pump runs away from its design condition. Combined with model tests of the axial-flow pump, this paper uses numerical simulation and numerical optimization techniques, varying the geometric design parameters of the impeller, to carry out a multi-condition optimal design of the axial-flow pump, with the aims of improving efficiency at off-design conditions, broadening the high-efficiency region, and reducing operating cost. The results show that the efficiency curve after optimization is significantly wider than the initial one. Efficiency at the low-flow operating point increased by about 2.6%, at the design point by about 0.5%, and at the high-flow operating point by the most, about 7.4%. The change in head is small, so all operating points still meet the operational requirements, which will greatly reduce operating costs and shorten the optimal design cycle. The study uses CFD simulation as the principal analysis tool, combined with experimental study, instead of experience-based manual optimization, demonstrating the reliability and efficiency of multi-condition optimization design for axial-flow pump devices.

  11. Optimization of Multi-Criteria Experiments with Fuzzy Results

    Institute of Scientific and Technical Information of China (English)

    YANG Fan; WU Su; Tilo Pfeifer; Klaus Hense

    2006-01-01

    The classical design of experiments (DoE) method can optimize systems with one technical response and multiple inputs. The objective of this study is to optimize multiple technical responses at the same time by integrating fuzzy logic transformation into a DoE system. The transformation from technical responses to the individual fuzzy responses and the overall fuzzy response are first defined, and the fuzzy response system is established. The method used to optimize the overall fuzzy response is introduced and discussed. The results show that the established fuzzy response system can optimize systems with multiple technical responses and multiple inputs.

  12. A Fast and Scalable Method for A-Optimal Design of Experiments for Infinite-dimensional Bayesian Nonlinear Inverse Problems with Application to Porous Medium Flow

    Science.gov (United States)

    Petra, N.; Alexanderian, A.; Stadler, G.; Ghattas, O.

    2015-12-01

    We address the problem of optimal experimental design (OED) for Bayesian nonlinear inverse problems governed by partial differential equations (PDEs). The inverse problem seeks to infer a parameter field (e.g., the log permeability field in a porous medium flow model problem) from synthetic observations at a set of sensor locations and from the governing PDEs. The goal of the OED problem is to find an optimal placement of sensors so as to minimize the uncertainty in the inferred parameter field. We formulate the OED objective function by generalizing the classical A-optimal experimental design criterion using the expected value of the trace of the posterior covariance. This expected value is computed through sample averaging over the set of likely experimental data. Due to the infinite-dimensional character of the parameter field, we seek an optimization method that solves the OED problem at a cost (measured in the number of forward PDE solves) that is independent of both the parameter and the sensor dimension. To facilitate this goal, we construct a Gaussian approximation to the posterior at the maximum a posteriori probability (MAP) point, and use the resulting covariance operator to define the OED objective function. We use randomized trace estimation to compute the trace of this covariance operator. The resulting OED problem includes as constraints the system of PDEs characterizing the MAP point, and the PDEs describing the action of the covariance (of the Gaussian approximation to the posterior) on vectors. We control the sparsity of the sensor configurations using sparsifying penalty functions, and solve the resulting penalized bilevel optimization problem via an interior-point quasi-Newton method, where gradient information is computed via adjoints. We elaborate our OED method for the problem of determining the optimal sensor configuration to best infer the log permeability field in a porous medium flow problem. Numerical results show that the number of PDE
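
    The objective above involves the trace of a covariance operator that is available only through its action on vectors, which is why randomized trace estimation is used. The snippet below shows the basic Hutchinson estimator using only matrix-vector products; the explicit matrix here is a small stand-in for the adjoint-based covariance action, not the paper's PDE-constrained operator.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for the covariance operator: we only access it through matvecs.
n = 200
B = rng.normal(size=(n, n))
A = B @ B.T / n                      # symmetric positive semi-definite

def matvec(v):
    return A @ v                     # in the PDE setting this would be adjoint-based

def hutchinson_trace(matvec, n, n_samples=100):
    """Estimate tr(A) as the average of z^T A z over Rademacher vectors z."""
    est = 0.0
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=n)
        est += z @ matvec(z)
    return est / n_samples

print("exact trace     :", np.trace(A))
print("randomized trace:", hutchinson_trace(matvec, n))
```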

  13. An improved group search optimizer for mechanical design optimization problems

    Institute of Scientific and Technical Information of China (English)

    Hai Shen; Yunlong Zhu; Ben Niu; Q.H. Wu

    2009-01-01

    This paper presents an improved group search optimizer (iGSO) for solving mechanical design optimization problems. In the proposed algorithm, subpopulations and a co-operative evolutionary strategy are adopted to improve the global search capability and convergence performance. The iGSO is evaluated on two classical mechanical design optimization problems: the spring and the pressure vessel. The experimental results are analyzed in comparison with those reported in the literature. The results show that iGSO has much better convergence performance and is easier to implement than other existing evolutionary algorithms.
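
    The spring benchmark mentioned above is a standard constrained test problem. The sketch below is not the authors' iGSO; it is a plain penalty-based (1+1) random search over the commonly cited formulation of the tension/compression spring problem, with the constraint constants assumed from the general literature and worth re-checking against the original reference before reuse.

```python
import numpy as np

# Tension/compression spring benchmark (formulation as commonly stated in the
# literature; verify the constants before reusing for real work).
# x = (wire diameter d, mean coil diameter D, number of active coils N)
lo = np.array([0.05, 0.25, 2.0])
hi = np.array([2.00, 1.30, 15.0])

def weight(x):
    d, D, N = x
    return (N + 2.0) * D * d**2

def constraints(x):                       # all should be <= 0
    d, D, N = x
    g1 = 1.0 - (D**3 * N) / (71785.0 * d**4)
    g2 = (4*D**2 - d*D) / (12566.0 * (D*d**3 - d**4)) + 1.0/(5108.0*d**2) - 1.0
    g3 = 1.0 - 140.45*d / (D**2 * N)
    g4 = (d + D)/1.5 - 1.0
    return np.array([g1, g2, g3, g4])

def penalized(x, rho=1e6):
    return weight(x) + rho * np.sum(np.maximum(constraints(x), 0.0)**2)

rng = np.random.default_rng(3)
best = lo + rng.random(3) * (hi - lo)
for _ in range(20000):                    # simple (1+1) random-search baseline
    cand = np.clip(best + rng.normal(0, 0.02, 3) * (hi - lo), lo, hi)
    if penalized(cand) < penalized(best):
        best = cand
print("best design:", np.round(best, 4),
      " weight:", round(weight(best), 6),
      " max g:", round(constraints(best).max(), 4))
```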

  14. Optimal Multiobjective Design of Digital Filters Using Taguchi Optimization Technique

    Science.gov (United States)

    Ouadi, Abderrahmane; Bentarzi, Hamid; Recioui, Abdelmadjid

    2014-01-01

    The multiobjective design of digital filters using the powerful Taguchi optimization technique is considered in this paper. This relatively new optimization tool has been recently introduced to the field of engineering and is based on orthogonal arrays. It is characterized by its robustness, immunity to local optima trapping, relative fast convergence and ease of implementation. The objectives of filter design include matching some desired frequency response while having minimum linear phase; hence, reducing the time response. The results demonstrate that the proposed problem solving approach blended with the use of the Taguchi optimization technique produced filters that fulfill the desired characteristics and are of practical use.
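
    Taguchi optimization evaluates candidate settings over an orthogonal array and ranks factor levels by a signal-to-noise (S/N) ratio. The sketch below uses a 2^3 full factorial (itself an orthogonal array) with a synthetic response and the larger-is-better S/N ratio; the factors and response are placeholders, not the filter-design objectives of the paper.

```python
import numpy as np
from itertools import product

# 2^3 full factorial used as an orthogonal array: 3 factors, 2 levels (-1/+1).
runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)

rng = np.random.default_rng(4)
def response(x):                       # synthetic "quality" measurement
    a, b, c = x
    return 10 + 2*a - 1.5*b + 0.5*a*c + rng.normal(0, 0.2)

def sn_larger_is_better(ys):
    """Taguchi larger-is-better S/N ratio: -10 log10(mean(1/y^2))."""
    ys = np.asarray(ys, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / ys**2))

# Replicate each run a few times and compute its S/N ratio.
sn = np.array([sn_larger_is_better([response(x) for _ in range(4)]) for x in runs])

# Average S/N per factor level: the level with the higher mean S/N is preferred.
for j in range(3):
    low = sn[runs[:, j] < 0].mean()
    high = sn[runs[:, j] > 0].mean()
    print(f"factor {j}: mean S/N at -1 = {low:.2f}, at +1 = {high:.2f}")
```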

  15. Structural Assembly Demonstration Experiment (SADE) experiment design

    Science.gov (United States)

    Akin, D. L.; Bowden, M. L.

    1982-03-01

    The Structural Assembly Demonstration Experiment concept is to erect a hybrid deployed/assembled structure as an early space experiment in large space structures technology. The basic objectives can be broken down into three generic areas: (1) by performing assembly tasks both in space and in neutral buoyancy simulation, a mathematical basis will be found for the validity conditions of neutral buoyancy, thus enhancing the utility of water as a medium for simulation of weightlessness; (2) a data base will be established describing the capabilities and limitations of EVA crewmembers, including effects of such things as hardware size and crew restraints; and (3) experience of the M.I.T. Space Systems Lab in neutral buoyancy simulation of large space structures assembly indicates that the assembly procedure may create the largest loads that a structure will experience during its lifetime. Data obtained from the experiment will help establish an accurate loading model to aid designers of future space structures.

  16. Optimization, an Important Stage of Engineering Design

    Science.gov (United States)

    Kelley, Todd R.

    2010-01-01

    A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…

  17. Design and fabrication of topologically optimized structures

    DEFF Research Database (Denmark)

    Feringa, Jelle; Søndergaard, Asbjørn

    2012-01-01

    Integral structural optimization and fabrication seeks the synthesis of two original approaches; that of topological optimization (TO) and robotic hotwire cutting (HWC) (Mcgee 2011). TO allows for the reduction of up to 70% of the volume of concrete to support a given structure (Sondergaard...... & Dombernowsky 2011). A strength of the method is that it allows one to come up with structural designs that lie beyond the grasp of traditional means of design. A design space is a discretized volume, delimiting where the optimization will take place. The number of cells used to discretize the design space thus...

  18. Optimization design and experiment of centrifugal pump based on CFD

    Institute of Scientific and Technical Information of China (English)

    赵伟国; 盛建萍; 杨军虎; 宋启策

    2015-01-01

    To improve the efficiency of the centrifugal pump, an optimization design was carried out with maximum impeller efficiency as the objective. The impeller was parameterized, using Bezier and B-spline curves to reconstruct its geometry, so that the impeller shape could be controlled automatically and optimization variables could be supplied to the optimization calculation. Two parameters controlling the circumferential positioning of the blade stacking line were selected as optimization variables, with a constraint range of -3° to 3°. Using the learning capability of an artificial neural network, a mapping between the objective function and the optimization variables was established, and a genetic algorithm was used to search for the optimum of the objective function, yielding the optimal impeller model within the constraint range. Numerical results show that at the design flow rate of 1200 m3/h the efficiency of the optimized impeller is 4.02 percentage points higher than before optimization, the pump efficiency is 4.41 percentage points higher, and the head increases by 2.63 m. To address the limited improvement at off-design points, the original volute was redesigned and combined with the optimized impeller for further numerical calculation. With the redesigned volute, efficiency improved by 1.59% at the design point, by 9.93% at 1.2 times the design flow, and by 8.83% at 1.4 times the design flow; compared with the combination of the original impeller and original volute, pump efficiency improved by 6% at the design point, by 9.2% at 1.2 times the design flow, and by 8.59% at 1.4 times the design flow. The optimization widened the high-efficiency operating range, improved operating stability and overall pump performance, and produced a better match between the impeller and the volute. The study provides a reference for the optimization design of centrifugal pumps.

  19. Statistical design of experiments as a tool for optimizing the batch conditions to Cr(VI) biosorption on Araucaria angustifolia wastes.

    Science.gov (United States)

    Brasil, Jorge L; Ev, Ricardo R; Milcharek, Caroline D; Martins, Lucas C; Pavan, Flavio A; dos Santos, Araci A; Dias, Silvio L P; Dupont, Jairton; Zapata Noreña, Caciano P; Lima, Eder C

    2006-05-20

    In order to reduce the total number of experiments for achieving the best conditions for Cr(VI) uptake using Araucaria angustifolia (named pinhão) wastes as a biosorbent, three statistical designs of experiments were carried out. A full 2⁴ factorial design with two blocks and two central points (20 experiments) was performed (pH, initial metallic ion concentration-C(o), biosorbent concentration-X and time of contact-t), showing that all the factors were significant; besides, several interactions among the factors were also significant. These results led to the performance of a Box-Behnken surface analysis design with three factors (X, C(o) and t) and three central points and just one block (15 experiments). The performance of these two statistical designs of experiments led to the best conditions for Cr(VI) biosorption on the pinhão wastes using a batch system, where: pH 2.0; C(o) = 1200 mg l(-1) Cr(VI); X = 1.5 g l(-1) of biosorbent; t = 8 h. The maximum Cr(VI) uptake in these conditions was 125 mg g(-1). After evaluating the best Cr(VI) biosorption conditions on pinhão wastes, a new Box-Behnken surface analysis design was employed in order to verify the effects of three concomitant ions (Cl(-), NO(3)(-) and PO(4)(3-)) on the biosorption of Cr(VI) as a dichromate on the biosorbent (15 experiments). These results showed that the tested anions did not have any significant effect on the Cr(VI) uptake by pinhão wastes. In order to evaluate the pinhão wastes as a biosorbent in a dynamic system, a glass column was filled with pinhão wastes (4.00 g) as biosorbent, and it was fed with 25.0 mg l(-1) Cr(VI) at pH 2.0 and 2.5 ml min(-1). The breakpoint was attained when the concentration of the column effluent reached 0.05 mg l(-1) Cr(VI), after 5550 ml of the metallic ion solution had been passed. In these conditions, the biosorbent was able to completely remove Cr(VI) from aqueous solution with a ratio of Cr(VI) effluent volume/biosorbent volume of 252.3.
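
    The screening step above is a full 2⁴ factorial. The snippet below shows how such a design matrix is generated in coded units and how main effects are estimated as the difference of mean responses at the high and low levels; the uptake values are synthetic placeholders, not the published measurements.

```python
import numpy as np
from itertools import product

# Full 2^4 factorial in coded units for (pH, C0, X, t), as in the screening step.
factors = ["pH", "C0", "X", "t"]
design = np.array(list(product([-1, 1], repeat=4)), dtype=float)

# Synthetic responses standing in for the measured Cr(VI) uptake (mg/g).
rng = np.random.default_rng(5)
y = 60 - 20*design[:, 0] + 15*design[:, 1] - 5*design[:, 2] + 8*design[:, 3] \
    + 4*design[:, 0]*design[:, 1] + rng.normal(0, 2, len(design))

# Main effect of a factor = mean response at +1 minus mean response at -1.
for j, name in enumerate(factors):
    effect = y[design[:, j] > 0].mean() - y[design[:, j] < 0].mean()
    print(f"main effect of {name:2s}: {effect:+.2f}")
```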

  20. Optimization process of crankshaft induction hardening by design of experiment (DOE)

    Institute of Scientific and Technical Information of China (English)

    石仕梁; 任业训

    2009-01-01

    The key factors in crankshaft induction hardening were screened out by design of experiments (DOE), a Six Sigma tool. The effective hardening depth of the crankshaft was increased by optimizing the process parameters with a response optimization model and a linear regression model.

  1. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    Science.gov (United States)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  2. A Multidisciplinary Design Optimization Model for AUV Synthetic Conceptual Design

    Institute of Scientific and Technical Information of China (English)

    BU Guang-zhi; ZHANG Yu-wen

    2006-01-01

    An autonomous undersea vehicle (AUV) is a typical complex engineering system. This paper studies the disciplines and coupled variables in AUV design with multidisciplinary design optimization (MDO) methods. The framework of AUV synthetic conceptual design is described first, and then a model with collaborative optimization is studied. Finally, an example is given to verify the validity and efficiency of MDO in AUV synthetic conceptual design.

  3. Designing experiments through compressed sensing.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Joseph G.; Ridzal, Denis

    2013-06-01

    In the following paper, we discuss how to design an ensemble of experiments through the use of compressed sensing. Specifically, we show how to conduct a small number of physical experiments and then use compressed sensing to reconstruct a larger set of data. In order to accomplish this, we organize our results into four sections. We begin by extending the theory of compressed sensing to a finite product of Hilbert spaces. Then, we show how these results apply to experiment design. Next, we develop an efficient reconstruction algorithm that allows us to reconstruct experimental data projected onto a finite element basis. Finally, we verify our approach with two computational experiments.

  4. Axial-flow Pump Optimization Design Based on the Orthogonal Experiment

    Institute of Scientific and Technical Information of China (English)

    邢树兵; 朱荣生; 朱冬欣; 龙云; 贺博; 曹梁

    2015-01-01

    Axial-flow pumps were optimized using the orthogonal experiment design method. An orthogonal scheme with three factors and two levels was designed, and the influence of the impeller, guide vane, and horn tube on pump performance was studied. Each scheme was tested, and the optimal scheme for each performance measure was found by comparing the performance curves. The order of influence of the impeller, guide vane, and horn tube on performance was obtained by range analysis. The optimal parameter combination, obtained from a comprehensive balance analysis and comparison of the data, is an impeller with blade angle ψ = 0°, a guide vane with a diversion cone, and a horn tube with an inlet-to-impeller diameter ratio DL/D0 = 1.56 and a height-to-impeller-diameter ratio HL/D0 = 0.82. With this optimal combination, the head at the rated flow point is 6.16% higher than the design value, the efficiency at the rated flow point is 5.07% higher than the prescribed value, and the axial-flow pump retains a wide high-efficiency region. This demonstrates that the purpose of the experiment was achieved and that the design method is reasonable.
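
    Range analysis, used above to rank the influence of the impeller, guide vane, and horn tube, compares the spread of mean responses across the levels of each factor. The sketch below applies it to a small two-level plan with synthetic efficiencies; the factors mirror the record only in name and the numbers are illustrative assumptions.

```python
import numpy as np

# Three factors at two levels each (coded 0/1): impeller blade angle,
# guide vane type, horn tube geometry. A 2^3 plan is used here for clarity;
# the efficiencies below are placeholders, not the measured test data.
runs = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)])
rng = np.random.default_rng(6)
eff = 78 + 2.0*runs[:, 0] + 0.8*runs[:, 1] + 1.5*runs[:, 2] + rng.normal(0, 0.3, 8)

names = ["impeller", "guide vane", "horn tube"]
ranges = {}
for j, name in enumerate(names):
    level_means = [eff[runs[:, j] == lvl].mean() for lvl in (0, 1)]
    ranges[name] = max(level_means) - min(level_means)   # the "range" R_j

for name, r in sorted(ranges.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s}  R = {r:.2f}")
# A larger range indicates a stronger influence on efficiency.
```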

  5. Digital Design of Virtual Prototype based on Multidisciplinary Design Optimization

    Institute of Scientific and Technical Information of China (English)

    WU Baogui; HUANG Hongzhong; TAO Ye

    2006-01-01

    In order to obtain a digital design of a complex mechanical product that is as close to optimal as possible in an efficient way, a multidisciplinary integrated design method is proposed, which integrates multidisciplinary design optimization (MDO) into the digital design process to design the virtual prototype (VP) efficiently. By combining MDO and multi-body system dynamics, an MDO integration platform with the VP at its core is constructed. Automated MDO design of the VP is thus realized, and changes to the mechanical design project can be expressed intuitively during the MDO design process. The proposed approach is demonstrated using the integrated analysis flow of a vehicle engineering design. The result shows that the method not only can feasibly realize the MDO of the VP, but can also solve the optimization problem of vehicle multi-body system dynamic performance. It can be applied to the digital design of other complex systems.

  6. Optimization of a serum-free culture medium for mouse embryonic stem cells using design of experiments (DoE) methodology

    OpenAIRE

    Knöspel, Fanny; Schindler, Rudolf K.; Lübberstedt, Marc; Petzolt, Stephanie; Gerlach, Jörg C.; Zeilinger, Katrin

    2010-01-01

    The in vitro culture behaviour of embryonic stem cells (ESC) is strongly influenced by the culture conditions. Current culture media for expansion of ESC contain some undefined substances. Considering potential clinical translation work with such cells, the use of defined media is desirable. We have used Design of Experiments (DoE) methods to investigate the composition of a serum-free chemically defined culture medium for expansion of mouse embryonic stem cells (mESC). Factor screening analy...

  7. A framework for efficient process development using optimal experimental designs

    NARCIS (Netherlands)

    Ven, P. van de; Bijlsma, S.; Gout, E.; Voort Maarschalk, K. van der; Thissen, U.

    2011-01-01

    Introduction: The aim of this study was to develop and demonstrate a framework assuring efficient process development using fewer experiments than standard experimental designs. Methods: A novel optimality criterion for experimental designs (Iw criterion) is defined that leads to more efficient proc

  8. Optimization design of electromagnetic shielding composites

    Science.gov (United States)

    Qu, Zhaoming; Wang, Qingguo; Qin, Siliang; Hu, Xiaofeng

    2013-03-01

    The physical model for the effective electromagnetic parameters of composites, together with prediction formulas for the shielding effectiveness and reflectivity of composites, was derived based on micromechanics, the variational principle, and electromagnetic wave transmission theory. The multi-objective optimization design of multilayer composites was carried out using a genetic algorithm. The optimized results indicate that the material parameter proportions giving the greatest absorption can be obtained while still satisfying a minimum shielding effectiveness in a given frequency band. The validity of the optimization design model was verified, and the scheme has theoretical value and practical significance for the design of high-efficiency shielding composites.

  9. Topology optimization design of space rectangular mirror

    Science.gov (United States)

    Qu, Yanjun; Wang, Wei; Liu, Bei; Li, Xupeng

    2016-10-01

    A conceptual lightweight rectangular mirror is designed based on the theory of topology optimization, and its specific structural dimensions are determined through sensitivity analysis and size optimization in this paper. Under the load condition of gravity along the optical axis, and compared with mirrors designed by the traditional finite-element-based method, the topology-optimized reflector supported at six peripheral points is superior in lightweight ratio, structural stiffness, and reflective surface accuracy. This suggests that the lightweighting method in this paper is effective and has potential value for the design of rectangular reflectors.

  10. Optimized design of low energy buildings

    DEFF Research Database (Denmark)

    Rudbeck, Claus Christian; Esbensen, Peter Kjær; Svendsen, Sv Aa Højgaard

    1999-01-01

    concern which can be seen during the construction of new buildings. People want energy-friendly solutions, but they should be economically optimized. An economically optimized building design with respect to energy consumption is the design with the lowest total cost (investment plus operational cost over its...... to evaluate different separate solutions when they interact in the building. When trying to optimize several parameters there is a need for a method which will show the correct price-performance of each part of a building under design. The problem with not having such a method will first be shown...

  11. Introduction to Statistically Designed Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced and finally a case study will be presented to demonstrate this methodology.
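
    One of the basic concepts mentioned above, the fractional factorial design, trades runs for aliasing. The sketch below builds a half-fraction of a 2^3 design from the generator C = AB; it is a textbook construction rather than anything specific to the presentation.

```python
import numpy as np
from itertools import product

# Full 2^2 design in factors A and B; the third factor is generated as C = A*B.
ab = np.array(list(product([-1, 1], repeat=2)), dtype=float)
design = np.column_stack([ab, ab[:, 0] * ab[:, 1]])   # columns: A, B, C = AB

print("run   A   B   C(=AB)")
for i, row in enumerate(design, 1):
    print(f"{i:3d} {row[0]:+4.0f}{row[1]:+4.0f}{row[2]:+7.0f}")

# Because C = AB, estimates of the C main effect and the AB interaction are
# confounded: with 4 runs we can screen 3 factors, at the price of aliasing.
```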

  12. Optimization methods applied to hybrid vehicle design

    Science.gov (United States)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported on in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one year period. Fourth, care must be taken in designing the cost and constraint expressions which are used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, the principal conclusion is that optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.

  13. Optimal covariate designs theory and applications

    CERN Document Server

    Das, Premadhis; Mandal, Nripes Kumar; Sinha, Bikas Kumar

    2015-01-01

    This book primarily addresses the optimality aspects of covariate designs. A covariate model is a combination of ANOVA and regression models. Optimal estimation of the parameters of the model using a suitable choice of designs is of great importance; as such choices allow experimenters to extract maximum information for the unknown model parameters. The main emphasis of this monograph is to start with an assumed covariate model in combination with some standard ANOVA set-ups such as CRD, RBD, BIBD, GDD, BTIBD, BPEBD, cross-over, multi-factor, split-plot and strip-plot designs, treatment control designs, etc. and discuss the nature and availability of optimal covariate designs. In some situations, optimal estimations of both ANOVA and the regression parameters are provided. Global optimality and D-optimality criteria are mainly used in selecting the design. The standard optimality results of both discrete and continuous set-ups have been adapted, and several novel combinatorial techniques have been applied for...

  14. Optimal screening designs for biomedical technology

    Energy Technology Data Exchange (ETDEWEB)

    Torney, D.C.; Bruno, W.J.; Knill, E. [and others

    1997-10-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Screening a large number of different types of molecules to isolate a few with desirable properties is essential in biomedical technology. For example, trying to find a particular gene in the Human genome could be akin to looking for a needle in a haystack. Fortunately, testing of mixtures, or pools, of molecules allows the desirable ones to be identified, using a number of experiments proportional only to the logarithm of the total number of types of molecules. We show how to capitalize upon this potential by using optimized pooling schemes, or designs. We propose efficient non-adaptive pooling designs, such as "random sets" designs and modified "row and column" designs. Our results have been applied in the pooling and unique-sequence screening of clone libraries used in the Human Genome Project and in the mapping of Human chromosome 16. This required the use of liquid-transferring robots and manifolds for the largest clone libraries. Finally, we developed an efficient technique for finding the posterior probability each molecule has the desirable property, given the pool assay results. This technique works well, in practice, even if there are substantial rates of errors in the pool assay data. Both our methods and our results are relevant to a broad spectrum of research in modern biology.
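
    The report above computes, for each molecule, the posterior probability of being positive given noisy pool assay results. For a small library this posterior can be obtained by direct enumeration, as in the hedged sketch below; the pooling matrix, prior, and error rates are illustrative assumptions, and real libraries need smarter inference than brute-force enumeration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(7)

n_items, n_pools = 10, 6
pools = (rng.random((n_pools, n_items)) < 0.4).astype(int)   # pool membership
prior = 0.1                                  # prior P(item is positive)
fp, fn = 0.05, 0.10                          # assay false-positive / false-negative rates

truth = (rng.random(n_items) < prior).astype(int)            # hidden ground truth
pool_pos = pools @ truth > 0
observed = np.where(pool_pos,
                    rng.random(n_pools) > fn,                # true positives read w.p. 1-fn
                    rng.random(n_pools) < fp)                # negatives misread w.p. fp

def likelihood(x):
    """P(observed pool readings | item status vector x)."""
    pos = pools @ x > 0
    p_read_pos = np.where(pos, 1 - fn, fp)
    return np.prod(np.where(observed, p_read_pos, 1 - p_read_pos))

# Exact posterior by enumerating all 2^n_items status vectors (small n only).
posts = np.zeros(n_items)
total = 0.0
for bits in product([0, 1], repeat=n_items):
    x = np.array(bits)
    w = likelihood(x) * prior**x.sum() * (1 - prior)**(n_items - x.sum())
    total += w
    posts += w * x
posts /= total

for i in range(n_items):
    print(f"item {i}: true={truth[i]}  P(positive | data)={posts[i]:.3f}")
```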

  15. Optimal Design of Tidal Power Generator Using Stochastic Optimization Techniques

    OpenAIRE

    2014-01-01

    Particle Swarm Optimization (PSO) and Genetic Algorithms (GA) are used to reduce the cost of a permanent magnet synchronous generator with concentrated windings for tidal power applications. Reducing the cost of the electrical machine is one way of making tidal energy more competitive compared to traditional sources of electricity. Hybrid optimization combining PSO or GA with gradient based algorithms seems to be suited for design of electrical machines. Results from optimization with Matlab indicat...
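
    Particle Swarm Optimization, one of the two methods named above, is simple to sketch. The snippet below is a minimal global-best PSO applied to a toy quadratic cost standing in for the machine-cost objective; the swarm parameters are common default choices, not values from the record.

```python
import numpy as np

rng = np.random.default_rng(8)

def cost(x):                 # toy stand-in for the machine-cost objective
    return np.sum((x - 1.5)**2, axis=-1) + 2.0

dim, n_particles, iters = 4, 30, 200
lo, hi = -5.0, 5.0
pos = rng.uniform(lo, hi, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = cost(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

w, c1, c2 = 0.7, 1.5, 1.5    # inertia and acceleration coefficients
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w*vel + c1*r1*(pbest - pos) + c2*r2*(gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    val = cost(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best position:", np.round(gbest, 3), " best cost:", round(float(cost(gbest)), 5))
```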

  16. Optimization in Data Cube System Design

    Institute of Scientific and Technical Information of China (English)

    YilongLiang; ShaoweiXia

    2004-01-01

    The design of an OLAP system for supporting real-time queries is one of the major research issues. One approach is to use data cubes, which are pre-computed multidimensional views of data in the data warehouse. An initial set of data cubes can be derived, from which the answer to each frequently asked query can be retrieved directly. However, there are two practical problems concerning the design of a cube-based system: 1) the maintenance cost of the data cubes, and 2) the query cost to answer a selected set of frequently asked queries. Maintaining a data cube requires disk storage and CPU computation, so the maintenance cost is related to the total size of the data cubes materialized, and thus keeping all data cubes is impractical. The total size of cubes may be reduced by merging some cubes. However, the resulting larger cubes will increase the query cost of answering some queries. If the bounds on maintenance cost and query cost are strict, some of the queries need to be sacrificed. An optimization problem in data cube system design has been defined. With a maintenance-cost bound and a query-cost bound given by the user, it is necessary to optimize the initial set of data cubes such that the system can answer a maximum number of queries and satisfy the bounds. This is an NP-complete problem. Approximate algorithms Greedy Removing (GR) and 2-Greedy Merging with Multiple paths (2GGM) are proposed. Experiments have been done on a census database and the results show that our approach is both effective and efficient.

  17. PARAMETRIC OPTIMIZATION AND STRUCTURAL DESIGN OF NLS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A physical and mathematical model is developed for describing a nitrogen launching system (NLS) based on the dynamics of pneumatics and mechanisms. The multi-objective optimization function for the pitching angle and velocity of a missile is proposed for the first time. Singularity detection by wavelet analysis was conducted to find the optimum iteration points in a new direct algorithm of nonlinear programming. Comparison between wavelet optimization and the complex method shows that the former is better for optimization design.

  18. Optimization of the Ion Source-Mass Spectrometry Parameters in Non-Steroidal Anti-Inflammatory and Analgesic Pharmaceuticals Analysis by a Design of Experiments Approach.

    Science.gov (United States)

    Paíga, Paula; Silva, Luís M S; Delerue-Matos, Cristina

    2016-10-01

    The flow rates of drying and nebulizing gas, heat block and desolvation line temperatures and interface voltage are potential electrospray ionization parameters as they may enhance sensitivity of the mass spectrometer. The conditions that give higher sensitivity of 13 pharmaceuticals were explored. First, a Plackett-Burman design was implemented to screen significant factors, and it was concluded that interface voltage and nebulizing gas flow were the only factors that influence the intensity signal for all pharmaceuticals. This fractionated factorial design was projected to set a full 2² factorial design with center points. The lack-of-fit test proved to be significant. Then, a central composite face-centered design was conducted. Finally, a stepwise multiple linear regression and subsequently an optimization problem solving were carried out. Two main drug clusters were found concerning the signal intensities of all runs of the augmented factorial design. p-Aminophenol, salicylic acid, and nimesulide constitute one cluster as a result of showing much higher sensitivity than the remaining drugs. The other cluster is more homogeneous with some sub-clusters comprising one pharmaceutical and its respective metabolite. It was observed that instrumental signal increased when both significant factors increased, with maximum signal occurring when both codified factors are set at level +1. It was also found that, for most of the pharmaceuticals, interface voltage influences the intensity of the instrument more than the nebulizing gas flowrate. The only exceptions refer to nimesulide, where the relative importance of the factors is reversed, and salicylic acid, where both factors equally influence the instrumental signal.
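
    Plackett-Burman screening, the first step above, uses a small orthogonal two-level array to separate the few influential factors from the many. The sketch below constructs the 12-run design from a cyclic generator row; the generator used is the one commonly tabulated for N = 12, but it should be checked against a published table before any real screening study.

```python
import numpy as np

# 12-run Plackett-Burman screening design for up to 11 two-level factors,
# built by cyclic shifts of a generator row plus a final all-minus row.
# The generator below is the one commonly tabulated for N = 12; verify it
# against a published table before using the design for real work.
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

rows = [np.roll(gen, k) for k in range(11)]
rows.append(-np.ones(11, dtype=int))
design = np.array(rows, dtype=int)

print(design)                    # 12 runs x 11 columns of +/- 1
# Quick orthogonality check: every pair of columns should have zero dot product.
print("max |column dot product|:",
      np.abs(design.T @ design - 12*np.eye(11)).max())
```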

  20. Optimality criteria solution strategies in multiple constraint design optimization

    Science.gov (United States)

    Levy, R.; Parzynski, W.

    1981-01-01

    Procedures and solution strategies are described to solve the conventional structural optimization problem using the Lagrange multiplier technique. The multipliers, obtained through solution of an auxiliary nonlinear optimization problem, lead to optimality criteria to determine the design variables. It is shown that this procedure is essentially equivalent to an alternative formulation using a dual method Lagrangian function objective. Although mathematical formulations are straightforward, successful applications and computational efficiency depend upon execution procedure strategies. Strategies examined, with application examples, include selection of active constraints, move limits, line search procedures, and side constraint boundaries.

  1. Interaction Prediction Optimization in Multidisciplinary Design Optimization Problems

    Directory of Open Access Journals (Sweden)

    Debiao Meng

    2014-01-01

    Full Text Available The distributed strategy of Collaborative Optimization (CO) is suitable for large-scale engineering systems. However, it is hard for CO to converge when the dimension of the coupled variables is high. Furthermore, the discipline objectives cannot be considered in each discipline optimization problem. In this paper, a large-scale systems control strategy, the interaction prediction method (IPM), is introduced to enhance CO. IPM was originally used to control subsystems and coordinate the production process in large-scale systems. We combine the strategy of IPM with CO and propose the Interaction Prediction Optimization (IPO) method to solve MDO problems. As a hierarchical strategy, IPO has a system level and a subsystem level. The interaction design variables (including shared design variables and linking design variables) are operated at the system level and assigned to the subsystem level as design parameters. Each discipline objective is considered and optimized at the subsystem level simultaneously. The values of the design variables are transferred between the system level and the subsystem level. The compatibility constraints are replaced with enhanced compatibility constraints to reduce the dimension of the design variables in the compatibility constraints. Two examples are presented to show the potential application of IPO for MDO.

  2. Optimal Bayesian Experimental Design for Combustion Kinetics

    KAUST Repository

    Huan, Xun

    2011-01-04

    Experimental diagnostics play an essential role in the development and refinement of chemical kinetic models, whether for the combustion of common complex hydrocarbons or of emerging alternative fuels. Questions of experimental design—e.g., which variables or species to interrogate, at what resolution and under what conditions—are extremely important in this context, particularly when experimental resources are limited. This paper attempts to answer such questions in a rigorous and systematic way. We propose a Bayesian framework for optimal experimental design with nonlinear simulation-based models. While the framework is broadly applicable, we use it to infer rate parameters in a combustion system with detailed kinetics. The framework introduces a utility function that reflects the expected information gain from a particular experiment. Straightforward evaluation (and maximization) of this utility function requires Monte Carlo sampling, which is infeasible with computationally intensive models. Instead, we construct a polynomial surrogate for the dependence of experimental observables on model parameters and design conditions, with the help of dimension-adaptive sparse quadrature. Results demonstrate the efficiency and accuracy of the surrogate, as well as the considerable effectiveness of the experimental design framework in choosing informative experimental conditions.
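
    The utility function described above is the expected information gain, which for nonlinear models is typically estimated by nested Monte Carlo before any surrogate acceleration. The sketch below implements that basic estimator for a toy one-parameter exponential model; the forward model, prior, and noise level are assumptions for illustration, not the combustion-kinetics setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

def simulate(theta, d):
    """Toy nonlinear forward model: observable depends on parameter and design."""
    return np.exp(-theta * d)

sigma = 0.05                        # observation noise std (assumed)

def log_lik(y, theta, d):
    return -0.5*((y - simulate(theta, d))/sigma)**2 - np.log(sigma*np.sqrt(2*np.pi))

def expected_information_gain(d, n_outer=500, n_inner=500):
    """Nested Monte Carlo estimate of the expected information gain at design d."""
    theta_out = rng.uniform(0.5, 2.0, n_outer)           # prior samples
    y = simulate(theta_out, d) + sigma*rng.normal(size=n_outer)
    theta_in = rng.uniform(0.5, 2.0, n_inner)
    eig = 0.0
    for yi, ti in zip(y, theta_out):
        evidence = np.mean(np.exp(log_lik(yi, theta_in, d)))
        eig += log_lik(yi, ti, d) - np.log(evidence)
    return eig / n_outer

for d in [0.1, 0.5, 1.0, 2.0, 4.0]:
    print(f"design d = {d:3.1f}  estimated EIG = {expected_information_gain(d):.3f}")
```

    Maximizing such an estimate over the design variable is what the surrogate-accelerated framework above does at much lower cost.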

  3. Switched reluctance motor optimal geometry design

    Directory of Open Access Journals (Sweden)

    Liviu Neamt

    2010-12-01

    Full Text Available This paper deals with the analysis of a Switched Reluctance Motor (SRM) using the Finite Element Method (FEM) for geometrical optimization in terms of the ratio of torque to rotor volume, the so-called specific torque. The optimization parameter is the pair of stator and rotor pole angles, which forms a crucial part of the design process.

  4. Optimal Design of Laminated Composite Beams

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral

    . Furthermore, the devised beam model is able to account for the different levels of anisotropic elastic couplings which depend on the laminate lay-up. An optimization model based on multi-material topology optimization techniques is described. The design variables represent the volume fractions of the different...

  5. Discrete design optimization accounting for practical constraints

    NARCIS (Netherlands)

    Schevenels, M.; McGinn, S.; Rolvink, A.; Coenders, J.L.

    2013-01-01

    This paper presents a heuristic algorithm for discrete design optimization, based on the optimality criteria method. Practical applicability is the first concern; special attention is therefore paid to the implementation of technological constraints. The method is generally applicable, but in order

  6. Strategies for Optimal Design of Structural Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1992-01-01

    Reliability-based design of structural systems is considered. Especially systems where the reliability model is a series system of parallel systems are analysed. A sensitivity analysis for this class of problems is presented. Direct and sequential optimization procedures to solve the optimization...... problems are described. Numerical tests indicate that a sequential technique called the bounds iteration method (BIM) is particularly fast and stable....

  7. A semi-custom design methodology for design performance optimization

    Institute of Scientific and Technical Information of China (English)

    Dong-ming LV; Pei-yong ZHANG; Dan-dan ZHENG; Xiao-lang YAN; Bo ZHANG; Li QUAN

    2008-01-01

    We present a semi-custom design methodology based on transistor tuning to optimize design performance. Compared with other transistor tuning approaches, our tuning process takes the cross-talk effect into account and markedly reduces the complexity of circuit simulation and analysis by decomposing the circuit network using graph theory. Furthermore, the incremental placement and routing required for transistor tuning in conventional approaches, which can alter the timing graph and induce additional iterations for design convergence, is not needed in our methodology. This methodology combines flexible automated circuit tuning and physical design tools to provide more opportunities for design optimization throughout the design cycle.

  8. Turbomachinery Airfoil Design Optimization Using Differential Evolution

    Science.gov (United States)

    Madavan, Nateri K.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    An aerodynamic design optimization procedure based on an evolutionary algorithm known as Differential Evolution is described. Differential Evolution is a simple, fast, and robust evolutionary strategy that has been proven effective in determining the global optimum for several difficult optimization problems, including highly nonlinear systems with discontinuities and multiple local optima. The method is combined with a Navier-Stokes solver that evaluates the various intermediate designs and provides inputs to the optimization procedure. An efficient constraint-handling mechanism is also incorporated. Results are presented for the inverse design of a turbine airfoil from a modern jet engine and compared to earlier methods. The capability of the method to search large design spaces and obtain optimal airfoils in an automatic fashion is demonstrated. Substantial reductions in the overall computing time requirements are achieved by using the algorithm in conjunction with neural networks.

  9. OPTIMAL DESIGN OF SMART ANTENNA ARRAY

    Institute of Scientific and Technical Information of China (English)

    Gao Feng; Liu Qizhong; Shan Runhong; Zhang Hou

    2004-01-01

    This letter investigates an efficient design procedure integrating the Genetic Algorithm (GA) with the Finite Difference Time Domain (FDTD) method for the fast optimal design of Smart Antenna Arrays (SAA). The FDTD is used to analyze the SAA with mutual coupling. Then, on the basis of the Maximal Signal to Noise Ratio (MSNR) criterion, the GA is applied to the optimization of the weighting elements and structure of the SAA. Finally, the effectiveness of the analysis is evaluated with experimental antenna arrays.

  10. Optimizing Design of UHVDC Converter Stations

    Institute of Scientific and Technical Information of China (English)

    MA Weimin; NIE Dingzhen; CAO Yanming

    2012-01-01

    Based on consultation and study for the Xiangjiaba-Shanghai ±800 kV UHVDC (ultra high voltage direct current) project, this paper presents an optimal design for key technical solutions. The DC system electrical scheme design, the DC filter design, the suppression of DC harmonic components, overvoltage and insulation coordination, the requirements for converter station equipment, the main technical parameters of the equipment (including thyristor valves, converter transformers, smoothing reactors and DC breakers), the configuration of measuring devices and the DC control and protection system, and the de-icing operation design are investigated. Drawing on the research conclusions on UHVDC technology and the progress of project construction, the UHVDC system design for the converter stations is brought to an optimal combination. The optimized design solves a number of technical problems of the world's first UHVDC project and has been applied in the project's construction. Under actual operating conditions, the optimized design has proved to be correct and superior. These optimal design conclusions are important for developing UHVDC techniques and equipment, and provide a reference for future UHVDC projects.

  11. Systematic design of microstructures by topology optimization

    DEFF Research Database (Denmark)

    Sigmund, Ole

    2003-01-01

    The topology optimization method can be used to determine the material distribution in a design domain such that an objective function is maximized and constraints are fulfilled. The method, which is based on finite element analysis, may be applied to all kinds of material distribution problems, such as extremal material design, sensor and actuator design, and MEMS synthesis. The state-of-the-art in topology optimization is reviewed, and older as well as new applications in phononic and photonic crystal design are presented.

  12. EXISTENCE OF OPTIMAL STRONG PARTIALLY BALANCED DESIGNS

    Institute of Scientific and Technical Information of China (English)

    Du Beiliang

    2007-01-01

    A strong partially balanced design SPBD(v, b, k; λ, 0) whose b is the maximum number of blocks among all SPBD(v, b, k; λ, 0), called an optimal strong partially balanced design and denoted briefly OSPBD(v, k, λ), is studied. In the investigation of authentication codes it has been found that strong partially balanced designs can be used to construct authentication codes. This note investigates the existence of optimal strong partially balanced designs OSPBD(v, k, 1) for k = 3 and 4, and shows that there exists an OSPBD(v, k, 1) for any v ≥ k.

  13. Design and analysis of experiments

    CERN Document Server

    Dean, Angela; Draguljić, Danel

    2017-01-01

    This textbook takes a strategic approach to the broad-reaching subject of experimental design by identifying the objectives behind an experiment and teaching practical considerations that govern design and implementation, concepts that serve as the basis for the analytical techniques covered. Rather than a collection of miscellaneous approaches, chapters build on the planning, running, and analyzing of simple experiments in an approach that results from decades of teaching the subject. In most experiments, the procedures can be reproduced by readers, thus giving them broad exposure to experiments that are simple enough to be followed through their entire course. Outlines of student and published experiments appear throughout the text and as exercises at the end of the chapters. The authors develop the theory of estimable functions and analysis of variance in detail, but at a mathematical level that remains approachable. Throughout the book, statistical aspects of analysis complement practical as...

  14. Performative Computation-aided Design Optimization

    Directory of Open Access Journals (Sweden)

    Ming Tang

    2012-12-01

    Full Text Available This article discusses a collaborative research and teaching project between the University of Cincinnati, Perkins+Will’s Tech Lab, and the University of North Carolina Greensboro. The primary investigation focuses on the simulation, optimization, and generation of architectural designs using performance-based computational design approaches. The projects examine various design methods, including the relationships between building form and performance, and the use of proprietary software tools for parametric design.

  15. Product model structure for generalized optimal design

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The framework of a generalized optimization product model, built around a combined network- and tree-hierarchical structure, is advanced to improve the characteristics of generalized optimal design. Based on the proposed node-repetition technique, the network-hierarchical structure is united with the tree-hierarchical structure to facilitate the modeling of serialized and combined products. The criteria for product decomposition are investigated. Seven tree nodes are defined for the construction of a general product model, and their modeling properties are studied in detail. The developed product modeling system is applied and examined successfully in the modeling practice of generalized optimal design for a hydraulic excavator.

  16. Optimized design of low energy buildings

    DEFF Research Database (Denmark)

    Rudbeck, Claus Christian; Esbensen, Peter Kjær; Svendsen, Sv Aa Højgaard

    1999-01-01

    by 33% compared to the current level and that CO2 emissions should be halved. This calls for sustainable development in the building sector, but at the same time it has to be economically efficient. People are conscious about savings in energy, but economic aspects are their primary concern, which can be seen during the construction of new buildings. People want energy-friendly solutions, but they should be economically optimized. An economically optimized building design with respect to energy consumption is the design with the lowest total cost (investment plus operational cost over its lifetime). The design and construction of buildings should take into account energy, environmental and economical aspects. The design of a building is very complex, and the work on optimizing the design raises several questions. Which criteria are decisive when choosing a solution? How...

  17. Design optimization for active twist rotor blades

    Science.gov (United States)

    Mok, Ji Won

    This dissertation introduces the process of optimizing active twist rotor blades in the presence of embedded anisotropic piezo-composite actuators. Optimum design of active twist blades is a complex task, since it involves a rich design space with tightly coupled design variables. The study presents the development of an optimization framework for active helicopter rotor blade cross-sectional design. This optimization framework allows for exploring a rich and highly nonlinear design space in order to optimize the active twist rotor blades. Different analytical components are combined in the framework: cross-sectional analysis (UM/VABS), an automated mesh generator, a beam solver (DYMORE), a three-dimensional local strain recovery module, and a gradient-based optimizer within MATLAB. Through the mathematical optimization problem, the static twist actuation performance of a blade is maximized while satisfying a series of blade constraints. These constraints are associated with the locations of the center of gravity and elastic axis, the blade mass per unit span, the fundamental rotating blade frequencies, and the blade strength based on local three-dimensional strain fields under worst loading conditions. Through pre-processing, limitations of the proposed process have been studied; when limitations were detected, resolution strategies were proposed. These include mesh overlapping, element distortion, trailing edge tab modeling, electrode modeling and foam implementation in the mesh generator, and the initial-point sensitivity of the current optimization scheme. Examples demonstrate the effectiveness of this process. Optimization studies were performed on the NASA/Army/MIT ATR blade case. Even though that design was built and showed a significant impact in vibration reduction, the proposed optimization process showed that the design could be improved significantly. The second example, based on a model scale of the AH-64D Apache blade, emphasized the capability of this framework to

  18. DESIGN OPTIMIZATION METHOD USED IN MECHANICAL ENGINEERING

    Directory of Open Access Journals (Sweden)

    SCURTU Iacob Liviu

    2016-11-01

    Full Text Available This paper presents an optimization study in mechanical engineering. The first part of the research describes the structural optimization method used, followed by the presentation of several optimization studies conducted in recent years. The second part of the paper presents the CAD modelling of an agricultural plough component. The beam of the plough is analysed using the finite element method. The plough component is meshed with solid elements, and a load case that mimics the working conditions of this agricultural equipment is created. After the FEA study, the model is prepared for finding the optimal structural design. Mass reduction of the part is the criterion applied in this optimization study. The end of this research presents the final results and the optimized shape of the model.

  19. MicroarrayDesigner: an online search tool and repository for near-optimal microarray experimental designs

    Directory of Open Access Journals (Sweden)

    Ferhatosmanoglu Nilgun

    2009-09-01

    Full Text Available Abstract Background Dual-channel microarray experiments are commonly employed for inference of differential gene expressions across varying organisms and experimental conditions. The design of dual-channel microarray experiments that can help minimize the errors in the resulting inferences has recently received increasing attention. However, a general and scalable search tool and a corresponding database of optimal designs were still missing. Description An efficient and scalable search method for finding near-optimal dual-channel microarray designs, based on a greedy hill-climbing optimization strategy, has been developed. It is empirically shown that this method can successfully and efficiently find near-optimal designs. Additionally, an improved interwoven loop design construction algorithm has been developed to provide an easily computable general class of near-optimal designs. Finally, in order to make the best results readily available to biologists, a continuously evolving catalog of near-optimal designs is provided. Conclusion A new search algorithm and database for near-optimal microarray designs have been developed. The search tool and the database are accessible via the World Wide Web at http://db.cse.ohio-state.edu/MicroarrayDesigner. Source code and binary distributions are available for academic use upon request.
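
    The greedy hill-climbing idea summarized above can be sketched in a few lines of Python. A dual-channel design is treated as a multigraph whose nodes are conditions and whose edges are arrays; the A-type criterion below (average variance of all pairwise contrasts, from the pseudo-inverse of the design-graph Laplacian) is a common stand-in and not necessarily the exact criterion used by MicroarrayDesigner, and the swap neighbourhood is likewise an assumption.

```python
import itertools
import numpy as np

def avg_contrast_variance(edges, n_cond):
    """Average variance of all pairwise log-ratio contrasts, computed from the
    pseudo-inverse of the design-graph Laplacian (lower is better)."""
    L = np.zeros((n_cond, n_cond))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    if np.linalg.matrix_rank(L) < n_cond - 1:      # disconnected design
        return np.inf
    Lp = np.linalg.pinv(L)
    pairs = itertools.combinations(range(n_cond), 2)
    return float(np.mean([Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for i, j in pairs]))

def greedy_hill_climb(n_cond=6, n_arrays=8, n_iter=2000, seed=1):
    rng = np.random.default_rng(seed)
    candidates = list(itertools.combinations(range(n_cond), 2))
    edges = [candidates[k] for k in rng.choice(len(candidates), n_arrays)]
    best = avg_contrast_variance(edges, n_cond)
    for _ in range(n_iter):
        trial = list(edges)
        trial[rng.integers(n_arrays)] = candidates[rng.integers(len(candidates))]
        score = avg_contrast_variance(trial, n_cond)
        if score < best:                           # keep only improving swaps
            edges, best = trial, score
    return edges, best

print(greedy_hill_climb())
```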

  20. Controller Design Automation for Aeroservoelastic Design Optimization of Wind Turbines

    NARCIS (Netherlands)

    Ashuri, T.; Van Bussel, G.J.W.; Zaayer, M.B.; Van Kuik, G.A.M.

    2010-01-01

    The purpose of this paper is to integrate the controller design of wind turbines with structure and aerodynamic analysis and use the final product in the design optimization process (DOP) of wind turbines. To do that, the controller design is automated and integrated with an aeroelastic simulation

  1. Design Buildings Optimally: A Lifecycle Assessment Approach

    KAUST Repository

    Hosny, Ossama

    2013-01-01

    This paper structures a generic framework to support optimum design for multi-buildings in a desert environment. The framework targets an environmentally friendly design with minimum lifecycle cost, using Genetic Algorithms (GAs). The GAs function through a set of success measures which evaluate the design, formulate a proper objective, and reflect possible tangible/intangible constraints. The framework optimizes the design and categorizes it under a certain environmental category at minimum Life Cycle Cost (LCC). It consists of three main modules: (1) a custom Building Information Model (BIM) for desert buildings with a compatibility checker as a central interactive database; (2) a system evaluator module to evaluate the proposed success measures for the design; and (3) a GA optimization module to ensure optimum design. The framework functions on three levels: the building-component, integrated-building, and multi-building levels. At the component level, the design team should be able to select components in a designed sequence to ensure compatibility among the various components, while at the building level the team can relatively locate and orient each individual building. Finally, at the multi-building (compound) level the whole design can be evaluated using success measures of natural light, site capacity, shading impact on natural lighting, thermal change, visual access and energy saving. Through genetic algorithms, the framework optimizes the design by determining proper types of building components and relative building locations and orientations which ensure categorizing the design under a specific category, or meeting certain preferences, at minimum lifecycle cost.

  2. Thermal Characterization of Functionally Graded Materials: Design of Optimum Experiments

    Science.gov (United States)

    Cole, Kevin D.

    2003-01-01

    This paper is a study of optimal experiment design applied to the measurement of thermal properties in functionally graded materials. As a first step, a material with linearly varying thermal properties is analyzed, and several different transient experimental designs are discussed. An optimality criterion, based on sensitivity coefficients, is used to identify the best experimental design. Simulated experimental results are analyzed to verify that the identified best experiment design has the smallest errors in the estimated parameters. This procedure is general and can be applied to the design of experiments for a variety of materials.

  3. Design optimization of shape memory alloy structures

    NARCIS (Netherlands)

    Langelaar, M.

    2006-01-01

    This thesis explores the possibilities of design optimization techniques for designing shape memory alloy structures. Shape memory alloys are materials which, after deformation, can recover their initial shape when heated. This effect can be used for actuation. Emerging applications for shape memory

  5. A design optimization methodology for Li+ batteries

    Science.gov (United States)

    Golmon, Stephanie; Maute, Kurt; Dunn, Martin L.

    2014-05-01

    Design optimization of functionally graded battery electrodes is shown to improve the usable energy capacity of Li batteries predicted by computational simulations, by numerically optimizing the electrode porosities and particle radii. A multi-scale battery model which accounts for nonlinear transient transport processes, electrochemical reactions, and mechanical deformations is used to predict the usable energy storage capacity of the battery over a range of discharge rates. A multi-objective formulation of the design problem is introduced to maximize the usable capacity over a range of discharge rates while limiting the mechanical stresses. The optimization problem is solved via gradient-based optimization. A LiMn2O4 cathode is simulated with a PEO-LiCF3SO3 electrolyte and both a Li foil (half cell) and a LiC6 anode. Studies were performed on both half- and full-cell configurations, resulting in distinctly different optimal electrode designs. The numerical results show that the highest-rate discharge drives the simulations and that the optimal designs are dominated by Li+ transport rates. The results also suggest that spatially varying electrode porosities and active particle sizes provide an efficient approach to improve the power-to-energy density of Li+ batteries. For the half-cell configuration, the optimal design improves the discharge capacity by 29%, while for the full cell the discharge capacity was improved by 61% relative to an initial design with a uniform electrode structure. Most of the improvement in capacity was due to the spatially varying porosity, with up to 5% of the gains attributed to the particle radii design variables.

  6. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  7. Efficiency Improvements of Antenna Optimization Using Orthogonal Fractional Experiments

    Directory of Open Access Journals (Sweden)

    Yen-Sheng Chen

    2015-01-01

    Full Text Available This paper presents an extremely efficient method for antenna design and optimization. Traditionally, antenna optimization relies on nature-inspired heuristic algorithms, which are time-consuming due to their blind-search nature. In contrast, design of experiments (DOE uses a completely different framework from heuristic algorithms, reducing the design cycle by formulating the surrogates of a design problem. However, the number of required simulations grows exponentially if a full factorial design is used. In this paper, a much more efficient technique is presented to achieve substantial time savings. By using orthogonal fractional experiments, only a small subset of the full factorial design is required, yet the resultant response surface models are still effective. The capability of orthogonal fractional experiments is demonstrated through three examples, including two tag antennas for radio-frequency identification (RFID applications and one internal antenna for long-term-evolution (LTE handheld devices. In these examples, orthogonal fractional experiments greatly improve the efficiency of DOE, thereby facilitating the antenna design with less simulation runs.
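
    The efficiency argument above can be made concrete with a small sketch: a Taguchi L8 orthogonal array covers seven two-level factors in 8 runs instead of the 128 runs of a full factorial, and its orthogonal columns let each main effect be estimated independently. The simulated return-loss response below is a made-up placeholder for an electromagnetic simulation; only the array and the effect estimation follow standard DOE practice.

```python
import numpy as np

# Taguchi L8 orthogonal array: 8 runs for up to seven two-level factors.
L8 = np.array([
    [-1, -1, -1, -1, -1, -1, -1],
    [-1, -1, -1, +1, +1, +1, +1],
    [-1, +1, +1, -1, -1, +1, +1],
    [-1, +1, +1, +1, +1, -1, -1],
    [+1, -1, +1, -1, +1, -1, +1],
    [+1, -1, +1, +1, -1, +1, -1],
    [+1, +1, -1, -1, +1, +1, -1],
    [+1, +1, -1, +1, -1, -1, +1],
])

def simulate_return_loss(x):
    # Placeholder for one full-wave simulation of a candidate antenna geometry.
    effects = np.array([2.0, -1.5, 0.8, 0.3, -0.2, 0.1, 0.05])
    return -15.0 + x @ effects

y = np.array([simulate_return_loss(run) for run in L8])

# Orthogonal, balanced columns: each least-squares coefficient is simply
# (column . response) / N; the classical "effect" is twice this value.
coefficients = L8.T @ y / len(L8)
print(np.round(coefficients, 3))
```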

  8. Design Process Optimization Based on Design Process Gene Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Bo; TONG Shu-rong

    2011-01-01

    The idea of genetic engineering is introduced into the area of product design to improve design efficiency. A method for design process optimization based on the design process gene is proposed by analyzing the correlation between the design process gene and the characteristics of the design process. The concept of the design process gene is analyzed and categorized into five categories, namely the task specification gene, the concept design gene, the overall design gene, the detailed design gene and the processing design gene, in light of the five design phases. The elements and their interactions involved in each kind of design process gene are analyzed, and the design process gene map is drawn, with its structure disclosed on the basis of the function of each process gene.

  9. Optimal Design of a Centrifugal Compressor Impeller Using Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Soo-Yong Cho

    2012-01-01

    Full Text Available An optimization study was conducted on a centrifugal compressor. Eight design variables were chosen from the control points for the Bezier curves which widely influenced the geometric variation; four design variables were selected to optimize the flow passage between the hub and the shroud, and the other four design variables were used to improve the performance of the impeller blade. As an optimization algorithm, an artificial neural network (ANN) was adopted. Initially, a design of experiments was applied to set up the initial data space of the ANN, which was improved during the optimization process using a genetic algorithm. Whenever the ANN predicted an improved result, that result was re-calculated by computational fluid dynamics (CFD) and used to update the ANN. The prediction difference between the ANN and CFD was consequently less than 1% after the 6th generation. Using this optimization technique, the computational time for the optimization was greatly reduced and the accuracy of the optimization algorithm was increased. The efficiency was improved by 1.4% without losing pressure ratio, and Pareto-optimal solutions of efficiency versus pressure ratio were obtained through the 21st generation.
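
    The DOE/surrogate/GA loop described in this record can be sketched generically as follows. A quadratic response surface and random search stand in for the ANN and the genetic algorithm, and the expensive_cfd function is a made-up placeholder for the CFD evaluation; the structure (initial DOE, fit the surrogate, exploit it, verify the promising design with the expensive model, augment the data and refit) is the point of the sketch.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def expensive_cfd(x):
    # Placeholder for a CFD evaluation of one candidate impeller design.
    return np.sum((x - 0.3) ** 2) + 0.05 * np.sum(np.sin(10 * x))

def quad_features(X):
    """Constant + linear + quadratic terms for a response-surface fit."""
    d = X.shape[1]
    cols = [np.ones(len(X))] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j]
             for i, j in itertools.combinations_with_replacement(range(d), 2)]
    return np.column_stack(cols)

# 1) initial design of experiments
X = rng.uniform(0.0, 1.0, (20, 4))
y = np.array([expensive_cfd(x) for x in X])

for generation in range(6):
    # 2) fit the surrogate (quadratic response surface standing in for the ANN)
    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    # 3) exploit it with an inner optimizer (random search standing in for the GA)
    cand = rng.uniform(0.0, 1.0, (5000, 4))
    best = cand[(quad_features(cand) @ beta).argmin()]
    # 4) verify the promising design with the expensive model and augment the data
    X = np.vstack([X, best])
    y = np.append(y, expensive_cfd(best))

print(X[y.argmin()], y.min())
```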

  10. Optimal Design of a Subsonic Submerged Inlet

    Science.gov (United States)

    Taskinoglu, Ezgi; Jovanovic, Vasilije; Elliott, Gregory; Knight, Doyle

    2003-11-01

    A multi-objective optimization study based on an epsilon-constraint method is conducted for the design optimization of a subsonic submerged air vehicle inlet. The multi-objective optimization problem is reformulated by minimizing one of the objectives and restricting the other objectives to within user-specified values. The figures of merit are the engine-face distortion and swirl, which determine inlet/engine compatibility. The distortion index is minimized while the feasible design space is bounded by the swirl index. The design variables are the geometrical parameters defining the surface alteration. The design algorithm is driven by a gradient-based optimizer and is constructed by integrating the optimizer with a solid modeller (Pro/Engineer), a mesh generator (Grid/Pro) and a flow solver (GASPex). The optimizer is CFSQP (C code for Feasible Sequential Quadratic Programming). Integration of the software packages is achieved with a Perl script. In order to verify the numerical results, an experimental setup for the same inlet geometry is prepared to run at the same flow conditions. The presentation will describe the numerical approach and summarize the results.
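
    The epsilon-constraint reformulation mentioned above (minimize one objective while bounding the others) can be illustrated with a small Python sketch. The distortion and swirl functions below are simple analytic stand-ins for the CFD-based indices, and SciPy's SLSQP replaces CFSQP; sweeping the bound traces an approximate Pareto front.

```python
import numpy as np
from scipy.optimize import minimize

# Analytic stand-ins for the engine-face distortion and swirl indices;
# x holds the surface-alteration parameters.
def distortion(x):
    return (x[0] - 1.0) ** 2 + 0.5 * (x[1] + 0.2) ** 2 + 0.1

def swirl(x):
    return (x[0] + x[1]) ** 2 + 0.05

def epsilon_constraint_front(eps_values, x0=(0.0, 0.0)):
    """Minimize distortion subject to swirl <= eps, for a sweep of eps values."""
    front = []
    for eps in eps_values:
        res = minimize(distortion, x0, method="SLSQP",
                       constraints=[{"type": "ineq",
                                     "fun": lambda x, e=eps: e - swirl(x)}])
        front.append((float(swirl(res.x)), float(distortion(res.x))))
    return front

print(epsilon_constraint_front(np.linspace(0.1, 1.0, 5)))
```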

  11. Information optimal compressive sensing: static measurement design.

    Science.gov (United States)

    Ashok, Amit; Huang, Liang-Chih; Neifeld, Mark A

    2013-05-01

    The compressive sensing paradigm exploits the inherent sparsity/compressibility of signals to reduce the number of measurements required for reliable reconstruction/recovery. In many applications additional prior information beyond signal sparsity, such as structure in sparsity, is available, and current efforts are mainly limited to exploiting that information exclusively in the signal reconstruction problem. In this work, we describe an information-theoretic framework that incorporates the additional prior information as well as appropriate measurement constraints in the design of compressive measurements. Using a Gaussian binomial mixture prior we design and analyze the performance of optimized projections relative to random projections under two specific design constraints and different operating measurement signal-to-noise ratio (SNR) regimes. We find that the information-optimized designs yield significant, in some cases nearly an order of magnitude, improvements in the reconstruction performance with respect to the random projections. These improvements are especially notable in the low measurement SNR regime where the energy-efficient design of optimized projections is most advantageous. In such cases, the optimized projection design departs significantly from random projections in terms of their incoherence with the representation basis. In fact, we find that maximizing the incoherence of projections with the representation basis is not necessarily optimal in the presence of additional prior information and finite measurement noise/error. We also apply the information-optimized projections to the compressive image formation problem for natural scenes, and the improved visual quality of reconstructed images with respect to random projections and other compressive measurement designs affirms the overall effectiveness of the information-theoretic design framework.

  12. Heat exchanger design based on economic optimization

    Energy Technology Data Exchange (ETDEWEB)

    Caputo, Antonio C.; Pelagagge, Marcello P.; Salini, Paolo [University of l' Aquila (Italy). Faculty of Engineering], e-mail: caputo@ing.inivaq.it, e-mail: pelmar@ing.inivaq.it, e-mail: salini@ing.inivaq.it

    2006-07-01

    Owing to the wide utilization of heat exchangers in industrial processes, their cost minimization is an important target for both designers and users. Traditional design approaches are based on iterative procedures which assume a configuration and gradually change design parameters until a satisfying solution is reached which meets the design specifications. However, such methods, besides being time consuming, do not guarantee that an optimal solution is reached. In this paper a procedure for the optimal design of shell-and-tube heat exchangers is proposed which utilizes a genetic algorithm to minimize the total discounted cost of the equipment, including the capital investment and the pumping-related annual energy expenditures. In order to verify the performance of the proposed method, four case studies are also presented, showing that total cost reductions greater than 15% are feasible with respect to traditionally designed exchangers. (author)

  13. Optimization-based controller design for rotorcraft

    Science.gov (United States)

    Tsing, N.-K.; Fan, M. K. H.; Barlow, J.; Tits, A. L.; Tischler, M. B.

    1993-01-01

    An optimization-based methodology for linear control system design is outlined by considering the design of a controller for a UH-60 rotorcraft in hover. A wide range of design specifications is taken into account: internal stability, decoupling between longitudinal and lateral motions, handling qualities, and rejection of wind gusts. These specifications are investigated while taking into account physical limitations in the swashplate displacements and rates of displacement. The methodology relies crucially on user-machine interaction for tradeoff exploration.

  14. Design of experiments to assess pre-treatment and co-digestion strategies that optimize biogas production from macroalgae Gracilaria vermiculophylla.

    Science.gov (United States)

    Oliveira, J V; Alves, M M; Costa, J C

    2014-06-01

    A design of experiments was applied to evaluate different strategies to enhance the methane yield of macroalgae Gracilaria vermiculophylla. Biochemical Methane Potential (BMP) of G. vermiculophylla after physical pre-treatment (washing and maceration) reached 481±9 L CH4 kg(-1) VS, corresponding to a methane yield of 79±2%. No significant effects were achieved in the BMP after thermochemical pre-treatment, although the seaweeds solubilisation increased up to 44%. Co-digestion with glycerol or sewage sludge has proved to be effective for increasing the methane production. Addition of 2% glycerol (w:w) increased the BMP by 18%, achieving almost complete methanation of the substrate (96±3%). Co-digestion of seaweed and secondary sludge (15:85%, TS/TS) increased the BMP by 25% (605±4 L CH4 kg(-1) VS) compared to the seaweed individual digestion.

  15. Design Analysis for Optimal Calibration of Diffusivity in Reactive Multilayers

    CERN Document Server

    Vohra, Manav; Weihs, Timothy P; Knio, Omar M

    2016-01-01

    Calibration of the uncertain Arrhenius diffusion parameters for quantifying mixing rates in Zr-Al nanolaminate foils was performed in a Bayesian setting [Vohra et al., 2014]. The parameters were inferred in a low temperature regime characterized by homogeneous ignition and a high temperature regime characterized by self-propagating reactions in the multilayers. In this work, we extend the analysis to find optimal experimental designs that would provide the best data for inference. We employ a rigorous framework that quantifies the expected information gain in an experiment, and find the optimal design conditions using numerical techniques of Monte Carlo, sparse quadrature, and polynomial chaos surrogates. For the low temperature regime, we find the optimal foil heating rate and pulse duration, and confirm through simulation that the optimal design indeed leads to sharper posterior distributions of the diffusion parameters. For the high temperature regime, we demonstrate potential for increase in the expecte...

  16. Construction of optimal supersaturated designs by the packing method

    Institute of Scientific and Technical Information of China (English)

    FANG; Kaitai; GE; Gennian; LIU; Minqian

    2004-01-01

    A supersaturated design is essentially a factorial design with the equal-occurrence-of-levels property and no fully aliased factors, in which the number of main effects is greater than the number of runs; it has received much recent interest because of its potential in factor screening experiments. A packing design is an important object in combinatorial design theory. In this paper, a strong link between the two apparently unrelated kinds of designs is shown. Several criteria for comparing supersaturated designs are proposed, and their properties and connections with other existing criteria are discussed. A combinatorial approach, called the packing method, for constructing optimal supersaturated designs is presented, and properties of the resulting designs are also investigated. Comparisons between the new designs and other existing designs are given, which show that the construction method and the newly constructed designs have good properties.
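
    The record proposes its own comparison criteria; a widely used existing benchmark for two-level supersaturated designs is the E(s^2) criterion, which the short sketch below computes for a randomly generated balanced design. The random construction is for illustration only and is unrelated to the packing method.

```python
import itertools
import numpy as np

def E_s2(X):
    """E(s^2) for a two-level supersaturated design X (runs x factors, entries +/-1):
    the average squared off-diagonal entry of X'X. Smaller is better; zero would
    mean full orthogonality, impossible once factors exceed runs - 1."""
    S = X.T @ X
    m = X.shape[1]
    off = [S[i, j] ** 2 for i, j in itertools.combinations(range(m), 2)]
    return float(np.mean(off))

rng = np.random.default_rng(0)
n, m = 8, 12                      # 12 two-level factors in 8 runs: supersaturated
base = np.array([+1] * (n // 2) + [-1] * (n // 2))
X = np.column_stack([rng.permutation(base) for _ in range(m)])  # balanced columns
print(E_s2(X))
```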

  17. Design analysis for grounding experiments

    NARCIS (Netherlands)

    Lemmen, P.P.M.; Vredeveldt, A.W.; Pinkster, J.A.

    1996-01-01

    In 1995 a series of six grounding experiments was carried out with a 600-tonne inland waterway tanker. At the bow of the vessel, test sections could be fitted, which were run into an artificial rock. The design of the support structures for the test sections and for the rock required the predic

  18. Computational Methods for Design, Control and Optimization

    Science.gov (United States)

    2007-10-01

    A "scenario" that applies to channel flows (Poiseuille flow, Couette flow) and pipe flows. Over the past 75 years many complex "transition theories" have ... other areas of flow control, optimization and aerodynamic design ... approximate sensitivity calculations and optimization codes. The effort was built on a ... for fluid flow problems. The improved robustness and computational efficiency of this approach makes it practical for a wide class of problems.

  19. Advanced Aerostructural Optimization Techniques for Aircraft Design

    OpenAIRE

    Yingtao Zuo; Pingjian Chen; Lin Fu; Zhenghong Gao; Gang Chen

    2015-01-01

    Traditional coupled aerostructural design optimization (ASDO) of aircraft based on high-fidelity models is computationally expensive and inefficient. To improve the efficiency, the key is to predict aerostructural performance of the aircraft efficiently. The cruise shape of the aircraft is parameterized and optimized in this paper, and a methodology named reverse iteration of structural model (RISM) is adopted to get the aerostructural performance of cruise shape efficiently. A new mathematic...

  20. An optimal design problem in wave propagation

    DEFF Research Database (Denmark)

    Bellido, J.C.; Donoso, Alberto

    2007-01-01

    We consider an optimal design problem in wave propagation proposed in Sigmund and Jensen (Roy. Soc. Lond. Philos. Trans. Ser. A 361:1001-1019, 2003) in the one-dimensional situation: given two materials at our disposal with different elastic Young modulus and different density, the problem consists of finding the best distributions of the two initial materials in a rod in order to minimize the vibration energy in the structure under periodic loading of driving frequency Omega. We comment on relaxation and optimality conditions, and perform numerical simulations of the optimal configurations. We prove...

  1. Dynamic optimization and adaptive controller design

    Science.gov (United States)

    Inamdar, S. R.

    2010-10-01

    In this work I present a new type of controller, an adaptive tracking controller that employs dynamic optimization of the current value of the controller action, for the temperature control of a nonisothermal continuously stirred tank reactor (CSTR). We begin with a two-state model of the nonisothermal CSTR, consisting of the mass and heat balance equations, and then add the cooling system dynamics to eliminate input multiplicity. The initial design value is obtained using local stability of steady states, where the approach temperature for the cooling action is specified as a steady state and as a design specification. Later we make a correction in the dynamics, where the material balance is manipulated to use the feed concentration as a system parameter, as an adaptive control measure, in order to avoid actuator saturation in the main control loop. The analysis leading to the design of the dynamic-optimization-based parameter-adaptive controller is presented. An important component of this mathematical framework is reference trajectory generation to form an adaptive control measure.
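
    The two-state nonisothermal CSTR (mass and energy balances with an Arrhenius reaction term) can be written down compactly as below. The parameter values are illustrative textbook numbers, not those used in the record, and the coolant temperature Tc is held constant here rather than driven by the adaptive controller.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative textbook-style parameters for a nonisothermal CSTR.
q, V, Cf, Tf = 100.0, 100.0, 1.0, 350.0          # L/min, L, mol/L, K
rho, cp, dH = 1000.0, 0.239, -5.0e4              # g/L, J/(g K), J/mol
ER, k0, UA = 8750.0, 7.2e10, 5.0e4               # K, 1/min, J/(min K)

def cstr(t, x, Tc):
    """Mass and heat balances; Tc is the coolant (manipulated) temperature."""
    C, T = x
    r = k0 * np.exp(-ER / T) * C                 # Arrhenius reaction rate
    dC = q / V * (Cf - C) - r
    dT = (q / V * (Tf - T)
          + (-dH) / (rho * cp) * r
          + UA / (V * rho * cp) * (Tc - T))
    return [dC, dT]

sol = solve_ivp(cstr, (0.0, 10.0), [0.5, 350.0], args=(300.0,), max_step=0.01)
print(sol.y[:, -1])                              # state after 10 min at Tc = 300 K
```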

  2. Optimal Design of Noisy Transmultiplexer Systems

    Directory of Open Access Journals (Sweden)

    Xie Lihua

    2006-01-01

    Full Text Available An optimal design method for noisy transmultiplexer systems is presented. For a transmultiplexer system with given transmitters and desired crosstalk attenuation, we address the problem of minimizing the reconstruction error while ensuring that the crosstalk of each band is below a prescribed level. By employing mixed optimization, we ensure that the system with suboptimal reconstruction error is more robust and less sensitive to changes in the input signals and channel noises. Due to the overlapping of adjacent subchannels, crosstalk between adjacent channels is expected, and the problem of crosstalk attenuation is formulated as an optimization problem solved in terms of linear matrix inequalities (LMIs). The simulation examples demonstrate that the proposed design performs better than existing design methods.

  3. Novel Optimized Designs for QCA Serial Adders

    Directory of Open Access Journals (Sweden)

    A. Mostafaee

    2017-02-01

    Full Text Available Quantum-dot Cellular Automata (QCA) is a new and efficient technology for implementing logic gates and digital circuits at the nanoscale. In comparison with conventional CMOS technology, QCA has many attractive features such as low power, extremely dense and high-speed structures. Adders are the most important part of an arithmetic logic unit (ALU). In this paper, four optimized designs of QCA serial adders are presented. One of the proposed designs is optimized in terms of the number of cells, area and delay without any wire-crossing methods. Also, two new designs of QCA serial adders and a QCA layout equivalent to the internal circuit of the TM4006 IC are presented. QCADesigner software is used to simulate the proposed designs. Finally, the proposed QCA designs are compared with previous QCA, CNTFET-based and CMOS technologies.

  4. The optimal design of standard gearsets

    Science.gov (United States)

    Savage, M.; Coy, J. J.; Townsend, D. P.

    1983-01-01

    A design procedure for sizing standard involute spur gearsets is presented. The procedure is applied to find the optimal design for two examples - an external gear mesh with a ratio of 5:1 and an internal gear mesh with a ratio of 5:1. In the procedure, the gear mesh is designed to minimize the center distance for a given gear ratio, pressure angle, pinion torque, and allowable tooth strengths. From the methodology presented, a design space may be formulated for either external gear contact or for internal contact. The design space includes kinematics considerations of involute interference, tip fouling, and contact ratio. Also included are design constraints based on bending fatigue in the pinion fillet and Hertzian contact pressure in the full load region and at the gear tip where scoring is possible. This design space is two dimensional, giving the gear mesh center distance as a function of diametral pitch and the number of pinion teeth. The constraint equations were identified for kinematic interference, fillet bending fatigue, pitting fatigue, and scoring pressure, which define the optimal design space for a given gear design. The locus of equal size optimum designs was identified as the straight line through the origin which has the least slope in the design region.

  5. Solid Rocket Motor Design Using Hybrid Optimization

    Directory of Open Access Journals (Sweden)

    Kevin Albarado

    2012-01-01

    Full Text Available A particle swarm/pattern search hybrid optimizer was used to drive a solid rocket motor modeling code to an optimal solution. The solid motor code models tapered motor geometries using analytical burn-back methods by slicing the grain into thin sections along the axial direction. Grains with circular perforated stars, wagon wheels, and dog bones can be considered, and multiple tapered sections can be constructed. The hybrid approach to optimization is capable of exploring large areas of the solution space through particle swarming, but is also able to climb "hills" of optimality through gradient-based pattern searching. A preliminary method for designing tapered internal geometry as well as tapered outer mold-line geometry is presented. A total of four optimization cases were performed. The first two case studies examine designing motors to match a given regressive-progressive-regressive burn profile. The third case study examines designing a neutrally burning right circular perforated grain (utilizing inner and external geometry tapering). The final case study examines designing a linearly regressive burn profile for right circular perforated (tapered) grains.
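
    A generic version of the particle swarm/pattern search hybrid is sketched below: a swarming phase explores the bounded design space, and a coordinate pattern search then polishes the best particle. The objective used in the example is a made-up multimodal stand-in for the burn-profile mismatch; the swarm coefficients and step-halving schedule are ordinary defaults, not those of the referenced work.

```python
import numpy as np

def hybrid_optimize(f, lb, ub, n_part=20, n_swarm=60, seed=0):
    """Particle-swarm global search followed by a pattern-search polish."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    x = rng.uniform(lb, ub, (n_part, len(lb)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()
    for _ in range(n_swarm):                              # swarming phase
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    step, fg = 0.1 * (ub - lb), f(g)
    while step.max() > 1e-6:                              # pattern-search polish
        moved = False
        for i in range(len(g)):
            for s in (+1, -1):
                trial = g.copy()
                trial[i] = np.clip(trial[i] + s * step[i], lb[i], ub[i])
                ft = f(trial)
                if ft < fg:
                    g, fg, moved = trial, ft, True
        if not moved:
            step *= 0.5                                   # refine the mesh
    return g, fg

def mismatch(x):
    # Multimodal stand-in for the burn-profile matching objective.
    return np.sum((x - 0.6) ** 2) + 0.1 * np.sum(np.cos(12 * x))

print(hybrid_optimize(mismatch, lb=[0, 0, 0], ub=[1, 1, 1]))
```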

  6. Optimal Design of Round Bottomed Triangle Channels

    Directory of Open Access Journals (Sweden)

    Ayman T. Hameed

    2013-05-01

    Full Text Available In the optimal design concept, the geometric dimensions of a channel cross-section are determined so as to minimize the total construction cost. A direct search optimization method implemented in MATLAB is used to solve the resulting channel optimization models for a specified flow rate, roughness coefficient and longitudinal slope. The developed optimization models are applied to design round-bottomed triangle channels and trapezoidal channels conveying a given design flow under various design scenarios; however, the approach can also be extended to other channel shapes. The method minimizes the total construction cost by minimizing the cross-sectional area and wetted perimeter per unit length of the channel. In the present study, it is shown that for all values of side slope, the total construction cost of the round-bottomed triangle cross-section is less than that of the trapezoidal cross-section for the same discharge. This indicates that less excavation and lining are involved and therefore implies that the round-bottomed triangle cross-section is more economical than the trapezoidal cross-section.
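
    The direct-search idea can be illustrated with a coarse grid search in Python. A trapezoidal section is used here because its area and wetted-perimeter expressions are compact; the round-bottomed triangle of the record needs its own geometric expressions. The cost weights, Manning roughness, slope and design discharge are illustrative assumptions.

```python
import numpy as np

def trapezoid_cost(b, y, m, c_exc=1.0, c_line=0.8):
    A = y * (b + m * y)                        # flow area
    P = b + 2 * y * np.sqrt(1 + m**2)          # wetted (lined) perimeter
    return c_exc * A + c_line * P, A, P        # cost per unit channel length

def manning_Q(A, P, n=0.015, S=0.001):
    R = A / P                                  # hydraulic radius
    return A * R ** (2.0 / 3.0) * np.sqrt(S) / n

def best_section(Q_design=10.0, m=1.5):
    """Grid search over bottom width b and depth y, keeping feasible sections."""
    best = None
    for b in np.linspace(0.1, 8.0, 200):
        for y in np.linspace(0.1, 5.0, 200):
            cost, A, P = trapezoid_cost(b, y, m)
            if manning_Q(A, P) >= Q_design and (best is None or cost < best[0]):
                best = (cost, b, y)
    return best

print(best_section())
```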

  7. Purification of pre-miR-29 by a new O-phospho-l-tyrosine affinity chromatographic strategy optimized using design of experiments.

    Science.gov (United States)

    Afonso, Adriana; Pereira, Patrícia; Queiroz, João A; Sousa, Ângela; Sousa, Fani

    2014-05-23

    MicroRNAs are the most studied small non-coding RNA molecules that are involved in post-transcriptional regulation of target genes. Their role in Alzheimer's disease is being studied and explored in order to develop a new therapeutic strategy based on specific gene silencing. This disease is characterized by protein deposits, mainly deposits of extracellular Aβ plaques, produced upon endoproteolytic cleavage of APP by ß-site APP-cleaving enzyme 1 (BACE1). Recent studies have shown that the miR-29 cluster in particular can be involved in the decrease of Aβ plaque production, by acting on BACE1 expression silencing. In order to use this microRNA as a potential therapeutic it is essential to guarantee its purity, stability and integrity. Hence, the main purpose of this study was the development of a new affinity chromatographic strategy using an O-phospho-l-tyrosine matrix and applying a Box-Behnken design (BBD) to obtain pre-miR-29 with a high purity degree and yield, envisioning its application in gene therapy. After process optimization, the best results were achieved with a decreasing ammonium sulfate gradient in 10 mM Tris buffer, pH 8 (1.6 M (NH4)2SO4, 1.11 M (NH4)2SO4 and 0 M (NH4)2SO4), at 16 °C. These experimental conditions allowed the recovery of pre-miR-29 with 52% purity and 71% recovery yield. The O-phospho-l-tyrosine matrix was initially chosen to mimic the natural interactions that occur inside the cell, and it indeed proved to have satisfactory selectivity for pre-miR-29. The innovative application of the BBD to this strategy was also efficient (R(2)=0.98 for % relative recovery and R(2)=0.93 for % relative purity) and essential to achieve the best purification results in a short time, saving lab resources.
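
    For reference, a three-factor Box-Behnken design consists of the twelve edge-midpoint runs plus centre points, as generated by the sketch below. The factor names and ranges passed in the example are hypothetical placeholders, not the levels actually screened in the record.

```python
import itertools

def box_behnken(levels, n_center=3):
    """Box-Behnken design for the factors in `levels` = {name: (low, high)}."""
    names = list(levels)
    k = len(names)
    coded = []
    for i, j in itertools.combinations(range(k), 2):      # vary factors pairwise
        for a, b in itertools.product((-1, +1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            coded.append(row)
    coded += [[0] * k for _ in range(n_center)]           # centre points
    # decode -1 / 0 / +1 into physical settings
    return [{n: levels[n][0] + (c + 1) / 2 * (levels[n][1] - levels[n][0])
             for n, c in zip(names, row)} for row in coded]

# Hypothetical factor ranges for the elution step (illustration only).
design = box_behnken({"ammonium_sulfate_M": (1.0, 2.2),
                      "pH": (7.0, 9.0),
                      "temperature_C": (5.0, 25.0)})
print(len(design), "runs")
```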

  8. OPTIMAL DESIGN OF QUADRATIC SANDWICH PLATE

    Directory of Open Access Journals (Sweden)

    TIMAR Dr. Imre

    2016-05-01

    Full Text Available In this paper, we present the optimal design of three-layered sandwich plates. The objective function contains the material and fabrication costs. The design constraints are the maximal stresses, the deflection of the plates and the damping of vibrations. The unknown is the thickness of the filling foam. Using a mathematical method, we determine the minimum of the cost function and the optimal thickness of the foam filling layer. The active constraint is the deflection, so we compare the costs of the sandwich plate with those of the homogeneous plate.

  9. Experience-Centered Design Designers, Users, and Communities in Dialogue

    CERN Document Server

    Wright, Peter

    2010-01-01

    Experience-centered design, experience-based design, experience design, designing for experience, user experience design. All of these terms have emerged and gained acceptance in the Human-Computer Interaction (HCI) and Interaction Design relatively recently. In this book, we set out our understanding of experience-centered design as a humanistic approach to designing digital technologies and media that enhance lived experience. The book is divided into three sections. In Section 1, we outline the historical origins and basic concepts that led into and flow out from our understanding of experi

  10. Optimal Experimental Design for Large-Scale Bayesian Inverse Problems

    KAUST Repository

    Ghattas, Omar

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial based surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP based approach is used to estimate the expected information gain in the proposed experiments, and to select the best experimental set-ups yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and extending experimental design methodology to the cases where the control parameters are noisy.

  11. Optimization of Lead and Silver Extraction from Zinc Plant Residues in the Presence of Calcium Hypochlorite Using Statistical Design of Experiments

    Science.gov (United States)

    Behnajady, Bahram; Moghaddam, Javad

    2014-12-01

    In this work, a chloride/hypochlorite leaching process was applied to zinc plant residues. Sodium chloride and calcium hypochlorite were used as the leaching and oxidizing agents, respectively. A fractional factorial method was used to test main effects and to investigate interactions among factors. The statistical software Design-Expert 7 was utilized to design the experiments and perform the subsequent analysis. The parameters and their levels were reaction time (t = 16 and 120 minutes), reaction temperature [T = 303 K and 343 K (30 °C and 70 °C)], solid-to-liquid ratio (S/L = 1/6 and 1/38), pH (pH = 0.5 and 2), and Ca(OCl)2 concentration (C = 0.6 and 3 g/L). Analysis of variance was also employed to determine the relationship between experimental conditions and yield levels. Results showed that reaction temperature and pH were significant parameters for both lead and silver extraction, but the solid-to-liquid ratio had a significant effect only on lead extraction. Increasing pH reduced the leaching efficiency of lead and silver, whereas increasing the reaction temperature promoted the extraction of lead and silver. The ultimate optimum conditions from this study were t1: 16 min, T2: 343 K (70 °C), (S/L)2: 1/38, pH1: 0.5, and C1: 0.6 g/L. Under these conditions, the extractions of lead and silver were 93.60 and 49.21 pct, respectively.
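
    A two-level half-fraction illustrates the kind of plan such software generates. The sketch below builds a 2^(5-1) design with generator E = ABCD for the five factors listed in the record; the particular fraction and generator are an assumption for illustration, since the record does not state which fraction was run.

```python
import itertools

factors = ["time_min", "temperature_C", "solid_to_liquid", "pH", "hypochlorite_g_L"]
level_minus = dict(time_min=16, temperature_C=30, solid_to_liquid=1/6,
                   pH=0.5, hypochlorite_g_L=0.6)
level_plus = dict(time_min=120, temperature_C=70, solid_to_liquid=1/38,
                  pH=2.0, hypochlorite_g_L=3.0)

# Half-fraction 2^(5-1): 16 runs instead of 32, with generator E = ABCD.
runs = []
for a, b, c, d in itertools.product((-1, +1), repeat=4):
    e = a * b * c * d
    coded = dict(zip(factors, (a, b, c, d, e)))
    runs.append({k: (level_plus[k] if v > 0 else level_minus[k])
                 for k, v in coded.items()})

for run in runs:
    print(run)
```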

  12. Shape optimization techniques for musical instrument design

    Science.gov (United States)

    Henrique, Luis; Antunes, Jose; Carvalho, Joao S.

    2002-11-01

    The design of musical instruments is still mostly based on empirical knowledge and costly experimentation. One interesting improvement is the shape optimization of resonating components, given a number of constraints (allowed parameter ranges, shape smoothness, etc.), so that vibrations occur at specified modal frequencies. Each admissible geometrical configuration generates an error between the computed eigenfrequencies and the target set. Typically, error surfaces present many local minima, corresponding to suboptimal designs. This difficulty can be overcome using global optimization techniques, such as simulated annealing. However, these methods are greedy concerning the number of function evaluations required, and the computational effort can be unacceptable if complex problems, such as bell optimization, are tackled. Those issues are addressed in this paper, and a method for improving optimization procedures is proposed. Instead of using the local geometric parameters as the searched variables, the system geometry is modeled in terms of truncated series of orthogonal space functions, and optimization is performed on their amplitude coefficients. Fourier series and orthogonal polynomials are typical such functions. This technique reduces considerably the number of searched variables and has a potential for significant computational savings in complex problems. It is illustrated by optimizing the shapes of both current and uncommon marimba bars.
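
    A minimal version of the proposed parameterization is sketched below for a one-dimensional bar: the cross-section profile is expanded in a short cosine series, eigenfrequencies are computed from a small finite element model, and simulated annealing searches the series amplitudes so that the first frequencies approach a target set. The bar model, cooling schedule and target values are illustrative assumptions, far simpler than a marimba bar or a bell.

```python
import numpy as np

def eigenfrequencies(coeffs, n_modes=3, n_el=40):
    """First longitudinal eigenfrequencies of a clamped-clamped bar whose
    cross-section profile is 1 plus a truncated cosine series with amplitudes coeffs."""
    x = np.linspace(0.0, 1.0, n_el + 1)
    xc = 0.5 * (x[:-1] + x[1:])                         # element midpoints
    area = 1.0 + sum(a * np.cos((k + 1) * np.pi * xc) for k, a in enumerate(coeffs))
    area = np.clip(area, 0.1, None)                     # keep the profile physical
    h = 1.0 / n_el
    K = np.zeros((n_el + 1, n_el + 1))
    M = np.zeros((n_el + 1, n_el + 1))
    for e in range(n_el):
        K[e:e+2, e:e+2] += area[e] / h * np.array([[1, -1], [-1, 1]])
        M[e:e+2, e:e+2] += area[e] * h / 6 * np.array([[2, 1], [1, 2]])
    K, M = K[1:-1, 1:-1], M[1:-1, 1:-1]                 # clamp both ends
    lam = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
    return np.sqrt(lam[:n_modes]) / (2 * np.pi)

def anneal(target, n_coef=4, iters=1500, seed=0):
    """Simulated annealing over the series amplitudes instead of local geometry."""
    rng = np.random.default_rng(seed)
    c = np.zeros(n_coef)
    err = np.sum((eigenfrequencies(c) - target) ** 2)
    T = 0.01
    for _ in range(iters):
        trial = c + 0.05 * rng.standard_normal(n_coef)
        e = np.sum((eigenfrequencies(trial) - target) ** 2)
        if e < err or rng.random() < np.exp((err - e) / T):
            c, err = trial, e
        T *= 0.995                                      # geometric cooling
    return c, err

print(anneal(np.array([0.55, 1.05, 1.60])))
```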

  13. Integrated structural-aerodynamic design optimization

    Science.gov (United States)

    Haftka, R. T.; Kao, P. J.; Grossman, B.; Polen, D.; Sobieszczanski-Sobieski, J.

    1988-01-01

    This paper focuses on the processes of simultaneous aerodynamic and structural wing design as a prototype for design integration, with emphasis on the major difficulty associated with multidisciplinary design optimization processes, their enormous computational costs. Methods are presented for reducing this computational burden through the development of efficient methods for cross-sensitivity calculations and the implementation of approximate optimization procedures. Utilizing a modular sensitivity analysis approach, it is shown that the sensitivities can be computed without the expensive calculation of the derivatives of the aerodynamic influence coefficient matrix, and the derivatives of the structural flexibility matrix. The same process is used to efficiently evaluate the sensitivities of the wing divergence constraint, which should be particularly useful, not only in problems of complete integrated aircraft design, but also in aeroelastic tailoring applications.

  14. Heuristic Algorithm in Optimal Discrete Structural Designs

    Directory of Open Access Journals (Sweden)

    Alongkorn Lamom

    2008-01-01

    Full Text Available This study proposes a Heuristic Algorithm for Material Size Selection (HAMSS). It is developed to handle discrete structural optimization problems. The proposed algorithm (HAMSS), the Simulated Annealing algorithm (SA) and the conventional design algorithm obtained from structural steel design software are studied with three selected examples. The HAMSS is, in fact, an adaptation of the traditional SA. Although the SA is one of the easiest optimization algorithms available, the huge number of function evaluations deters its use in structural optimization. To obtain the optimum answers by the SA, possible answers are first generated randomly. Many of these possible answers are rejected because they do not pass the constraints. To handle this problem effectively, the behavior of optimal structural design problems is incorporated into the algorithm. The new proposed algorithm is called the HAMSS. The efficiency comparison between the SA and the HAMSS is illustrated in terms of the number of finite element analysis cycles. Results from the study show that HAMSS can significantly reduce the number of structural analysis cycles while the quality of the optimized results is not compromised.

  15. Stented artery biomechanics and device design optimization.

    Science.gov (United States)

    Timmins, Lucas H; Moreno, Michael R; Meyer, Clark A; Criscione, John C; Rachev, Alexander; Moore, James E

    2007-05-01

    The deployment of a vascular stent aims to increase lumen diameter for the restoration of blood flow, but the accompanied alterations in the mechanical environment possibly affect the long-term patency of these devices. The primary aim of this investigation was to develop an algorithm to optimize stent design, allowing for consideration of competing solid mechanical concerns (wall stress, lumen gain, and cyclic deflection). Finite element modeling (FEM) was used to estimate artery wall stress and systolic/diastolic geometries, from which single parameter outputs were derived expressing stress, lumen gain, and cyclic artery wall deflection. An optimization scheme was developed using Lagrangian interpolation elements that sought to minimize the sum of these outputs, with weighting coefficients. Varying the weighting coefficients results in stent designs that prioritize one output over another. The accuracy of the algorithm was confirmed by evaluating the resulting outputs of the optimized geometries using FEM. The capacity of the optimization algorithm to identify optimal geometries and their resulting mechanical measures was retained over a wide range of weighting coefficients. The variety of stent designs identified provides general guidelines that have potential clinical use (i.e., lesion-specific stenting).

  16. Optimizing Your K-5 Engineering Design Challenge

    Science.gov (United States)

    Coppola, Matthew Perkins; Merz, Alice H.

    2017-01-01

    Today, elementary school teachers continue to revisit old lessons and seek out new ones, especially in engineering. Optimization is the process by which an existing product or procedure is revised and refined. Drawn from the authors' experiences working directly with students in grades K-5 and their teachers and preservice teachers, the…

  18. Optimal design of stiffened composite underwater hulls

    OpenAIRE

    Messager, Tanguy; Chauchot, Pierre; Bigourdan, Benoit

    2006-01-01

    This numerical study deals with the stiffened composite underwater vessel design. The structures under investigation are laminated cylinders with rigid end-closures and internal circumferential and longitudinal unidirectional composite stiffeners. Structural buckling induced by the high external hydrostatic pressure is considered as the major failure risk. An optimization design tool has been developed to obtain the reinforcement definition which maximizes the limit of stability: an analytic...

  19. An Optimization Framework for Product Design

    OpenAIRE

    Leyuan Shi; Sigurdur Ólafsson; Qun Chen

    2001-01-01

    An important problem in the product design and development process is to use the part-worths preferences of potential customers to design a new product such that market share is maximized. The authors present a new optimization framework for this problem, the nested partitions (NP) method. This method is globally convergent and may utilize existing heuristic methods to speed its convergence. We incorporate several known heuristics into this framework and demonstrate through numerical experime...

  20. On Optimal Designs of Some Censoring Schemes

    Directory of Open Access Journals (Sweden)

    Dr. Adnan Mohammad Awad

    2016-03-01

    Full Text Available The main objective of this paper is to explore the suitability of some entropy-information measures for introducing a new optimality censoring criterion and to apply it to some censoring schemes from some underlying lifetime models. In addition, the paper investigates four related issues, namely: the effect of the parameter of the parent distribution on the optimal scheme, the equivalence of schemes based on Shannon and Awad sup-entropy measures, the conjecture that the optimal scheme is a one-stage scheme, and a conjecture by Cramer and Bagh (2011) about Shannon minimum and maximum schemes when the parent distribution is reflected power. Guidelines for designing an optimal censoring plan are reported together with theoretical and numerical results and illustrations.

  1. Synthesis and design of optimal biorefinery

    DEFF Research Database (Denmark)

    Cheali, Peam

    of a large number of alternatives at their optimality. The result is the identification of the optimal raw material, the product (single vs multi) portfolio and the corresponding process technology selection for a given market scenario. The economic risk of investment due to market uncertainties is further...... products from bio-based feedstock. Since there are several bio-based feedstock sources, this has motivated development of different conversion concepts producing various desired products. This results in a number of challenges for the synthesis and design of the optimal biorefinery concept at the early...... process feasibility analysis is of a multidisciplinary nature, often limited and uncertain; (iii) Complexity challenge: this problem is complex requiring multi-criteria evaluation (technical, economic, sustainability). This PhD project aims to develop a decision support tool for identifying optimal...

  2. Advanced Aerostructural Optimization Techniques for Aircraft Design

    Directory of Open Access Journals (Sweden)

    Yingtao Zuo

    2015-01-01

    Full Text Available Traditional coupled aerostructural design optimization (ASDO of aircraft based on high-fidelity models is computationally expensive and inefficient. To improve the efficiency, the key is to predict aerostructural performance of the aircraft efficiently. The cruise shape of the aircraft is parameterized and optimized in this paper, and a methodology named reverse iteration of structural model (RISM is adopted to get the aerostructural performance of cruise shape efficiently. A new mathematical explanation of RISM is presented in this paper. The efficiency of RISM can be improved by four times compared with traditional static aeroelastic analysis. General purpose computing on graphical processing units (GPGPU is adopted to accelerate the RISM further, and GPU-accelerated RISM is constructed. The efficiency of GPU-accelerated RISM can be raised by about 239 times compared with that of the loosely coupled aeroelastic analysis. Test shows that the fidelity of GPU-accelerated RISM is high enough for optimization. Optimization framework based on Kriging model is constructed. The efficiency of the proposed optimization system can be improved greatly with the aid of GPU-accelerated RISM. An unmanned aerial vehicle (UAV is optimized using this framework and the range is improved by 4.67% after optimization, which shows effectiveness and efficiency of this framework.

  3. Optimizing the integrated design of boilers - simulation

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus M. S.; Condra, Thomas Joseph

    2004-01-01

    .) it is important to see the 3 components as an integrated unit and optimize these as such. This means that the burner must be designed and optimized exactly to the pressure part where it is utilized, the control system must have a configuration optimal for the pressure part and burner where it is utilized etc....... Traditionally boiler control systems have been designed in a rather simple manner consisting of a feed water controller and a pressure controller; two controllers which, in principle, operated without any interaction - for more details on boiler control see [4]. During the last year Aalborg Industries A/S has...... that are difficult to estimate/calculate have (on the basis of the tests) been determined by means of a least-square data fitting, the minimums have been found by means of a Gauss-Newton algorithm and physically verified afterwards. The dynamic boiler model will be applied for developing controllers and adapting...

  4. Design and volume optimization of space structures

    KAUST Repository

    Jiang, Caigui

    2017-07-21

    We study the design and optimization of statically sound and materially efficient space structures constructed by connected beams. We propose a systematic computational framework for the design of space structures that incorporates static soundness, approximation of reference surfaces, boundary alignment, and geometric regularity. To tackle this challenging problem, we first jointly optimize node positions and connectivity through a nonlinear continuous optimization algorithm. Next, with fixed nodes and connectivity, we formulate the assignment of beam cross sections as a mixed-integer programming problem with a bilinear objective function and quadratic constraints. We solve this problem with a novel and practical alternating direction method based on linear programming relaxation. The capability and efficiency of the algorithms and the computational framework are validated by a variety of examples and comparisons.

  5. Design of Multiphase Flow Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Urkedal, Hege

    1998-12-31

    This thesis proposes an experimental design procedure for multiphase experiments. The two-phase functions can be determined using data from a single experiment, while the three-phase relative permeabilities must be determined using data from multiple experiments. Various three-phase experimental designs have been investigated and the accuracy with which the flow functions may be determined using the corresponding data has been computed. Analytical sensitivity coefficients were developed from two-phase to three-phase flow. Sensitivity coefficients are the derivative of the model output with respect to the model parameters. They are obtained by a direct method that takes advantage of the fact that the model equations are solved using the Newton-Raphson method, and some of the results from this solution can be used directly when solving the sensitivity equation. Numerical derivatives are avoided, which improves accuracy. The thesis uses an inverse methodology for determination of two- and three-phase relative permeability and capillary pressure functions. The main work has been the development of analytical sensitivity coefficients for two- and three-phase flow. This technical contribution has improved the accuracy both in parameter estimation and accuracy assessment of the estimates and reduced the computer time requirements. The proposed experimental design is also dependent on accurate sensitivity coefficients to give the right guidelines for how two- and three-phase experiments should be conducted. Following the proposed experimental design, three-phase relative permeability and capillary pressure functions have been estimated when multiple sets of experimental data have been reconciled by simulations. 74 refs., 69 figs., 18 tabs.
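
    The sensitivity coefficients mentioned above are simply derivatives of model outputs with respect to model parameters. The toy sketch below contrasts an analytical sensitivity with a finite-difference check for a hypothetical one-parameter model; the thesis computes these quantities inside a multiphase flow simulator via the direct method, which this sketch does not attempt to reproduce.

```python
# Minimal illustration of a sensitivity coefficient: the derivative of a
# model output with respect to a parameter, computed analytically and
# checked against a finite difference. The exponential "recovery" model
# is a toy example, not the multiphase flow simulator from the thesis.
import math

def model_output(k, t=1.0):
    # Toy model: recovery fraction after time t for rate parameter k.
    return 1.0 - math.exp(-k * t)

def analytical_sensitivity(k, t=1.0):
    return t * math.exp(-k * t)             # d(output)/dk

def numerical_sensitivity(k, t=1.0, h=1e-6):
    return (model_output(k + h, t) - model_output(k - h, t)) / (2.0 * h)

k = 0.8
print(analytical_sensitivity(k), numerical_sensitivity(k))
```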

  6. Optimization of preservatives in a topical formulation using experimental design.

    Science.gov (United States)

    Rahali, Y; Pensé-Lhéritier, A-M; Mielcarek, C; Bensouda, Y

    2009-12-01

    Optimizing the preservative regime for a preparation requires the antimicrobial effectiveness of several preservative combinations to be determined. In this study, three preservatives were tested: benzoic acid, sorbic acid and benzylic alcohol. Their preservative effects were evaluated using the antimicrobial preservative efficacy test (challenge-test) of the European Pharmacopeia (EP). A D-optimal mixture design was used to provide a maximum of information from a limited number of experiments. The results of this study were analysed with the help of the Design Expert software and enabled us to formulate emulsions satisfying both requirements A and B of the EP.
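
    A D-optimal design of the kind used here selects the runs that maximize the determinant of the information matrix X'X for the assumed model. The sketch below applies that criterion to a hypothetical three-component mixture grid with a Scheffé linear model; the component names in the comment and the run budget are illustrative assumptions, not the actual design generated with the Design Expert software.

```python
# A minimal sketch of the D-optimality criterion behind a D-optimal
# mixture design: pick the subset of candidate blends that maximizes
# det(X'X) for the assumed model. The three-component grid and the
# Scheffe linear model are illustrative assumptions.
import itertools
import numpy as np

# Candidate mixtures (e.g., fractions of three preservatives summing to 1).
candidates = [(i / 4, j / 4, 1 - i / 4 - j / 4)
              for i in range(5) for j in range(5 - i)]

def d_criterion(points):
    X = np.array(points)          # Scheffe linear mixture model: columns are the fractions
    return np.linalg.det(X.T @ X)

n_runs = 6                        # limited number of experiments
best = max(itertools.combinations(candidates, n_runs), key=d_criterion)
print("D-optimal subset:", best)
print("det(X'X) =", round(d_criterion(best), 4))
```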

  7. MDO can help resolve the designer's dilemma. [multidisciplinary design optimization

    Science.gov (United States)

    Sobieszczanski-Sobieski, Jaroslaw; Tulinius, Jan R.

    1991-01-01

    Multidisciplinary design optimization (MDO) is presented as a rapidly growing body of methods, algorithms, and techniques that will provide a quantum jump in the effectiveness and efficiency of the quantitative side of design, and will turn that side into an environment in which the qualitative side can thrive. MDO borrows from CAD/CAM for graphic visualization of geometrical and numerical data, from database technology, and from computer software and hardware. Expected benefits from this methodology are a rational, mathematically consistent approach to hypersonic aircraft designs, designs pushed closer to the optimum, and a design process either shortened or leaving time available for different concepts to be explored.

  8. FPGA adders: performance evaluation and optimal design

    OpenAIRE

    Xing, S.; Yu, WWH

    1998-01-01

    Delay models and cost analyses developed for ASIC technology are not useful in designing and implementing FPGA devices. The authors discuss costs and operational delays of fixed-point adders on Xilinx 4000 series devices and propose timing models and optimization schemes for carry-skip and carry-select adders.

  9. Design Optimization of Structural Health Monitoring Systems

    Energy Technology Data Exchange (ETDEWEB)

    Flynn, Eric B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-06

    Sensor networks drive decisions. Approach: Design networks to minimize the expected total cost (in a statistical sense, i.e. Bayes risk) associated with making wrong decisions and with installing, maintaining, and running the sensor network itself. Search for optimal solutions using a Monte-Carlo-sampling-adapted genetic algorithm. Applications include structural health monitoring and surveillance.

  10. Non-probabilistic Robust Optimal Design Method

    Institute of Scientific and Technical Information of China (English)

    SUN Wei; XU Huanwei; ZHANG Xu

    2009-01-01

    To deal with uncertainty factors in engineering optimization problems, this paper presents a new non-probabilistic robust optimal design method based on maximum variation estimation. The method analyzes the effect of the uncertain factors on the objective and constraint functions, and then calculates the maximal variations at a given solution. To guarantee robust feasibility, the maximal variations of the constraints are added to the original constraints as penalty terms; the maximal variation of the objective function is taken as a robustness index of a solution; linear physical programming is used to adjust the values of the quality characteristic and the quality variation, and a bi-level mathematical robust optimization model is constructed. The method requires neither a presumed probability distribution of the uncertain factors nor continuity and differentiability of the objective and constraint functions. To demonstrate the proposed method, the design of a two-bar structure subjected to a concentrated load is presented. In the example, the robustness of the normal stress and the feasibility of the total volume and the buckling stress are studied. The results show that, while maintaining feasibility robustness, the proposed approach can obtain a robust solution whose objective value and variation satisfy the designer.
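
    A minimal sketch of the maximum-variation idea is given below: the worst-case change of the objective and constraint over an interval uncertainty box is estimated (here by vertex enumeration), the constraint's worst case is penalized, and the objective's variation is reported as a robustness index. The two-variable toy functions are assumptions for illustration, not the two-bar structure of the paper.

```python
# Sketch of "maximum variation" robustness under interval uncertainty:
# estimate the worst-case change of objective and constraint over an
# uncertainty box by vertex enumeration, penalize the constraint's worst
# case, and report the objective variation as a robustness index.
import itertools

def objective(x, u):
    return (x[0] + u[0]) ** 2 + (x[1] + u[1]) ** 2

def constraint(x, u):                       # feasible when <= 0
    return 1.0 - (x[0] + u[0]) - (x[1] + u[1])

def max_variation(func, x, half_width=0.1):
    nominal = func(x, (0.0, 0.0))
    corners = itertools.product((-half_width, half_width), repeat=2)
    return max(abs(func(x, u) - nominal) for u in corners)

def robust_penalized(x):
    g_worst = constraint(x, (0.0, 0.0)) + max_variation(constraint, x)
    penalty = 1000.0 * max(0.0, g_worst)    # shift constraint by its maximal variation
    return objective(x, (0.0, 0.0)) + penalty, max_variation(objective, x)

print(robust_penalized((0.7, 0.7)))         # (penalized objective, robustness index)
```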

  11. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    Science.gov (United States)

    Livne, Eli

    1999-01-01

    The main goal for research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry are essential in conceptual design of airplanes, as well as the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACS= was delivered to NASA Ames.

  12. Optimal design of smart panel using admittance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heung Soo; Kim, Jae Hwan [Inha University, Incheon (Korea, Republic of); Zhao, Lijie [Shenyang Institute of Aeronautical Engineering, Shenyang (China)

    2007-04-15

    Optimal configuration of piezoelectric shunt structures is obtained by analyzing admittance of the system. The dissipated energy in the shunt circuit is a function of admittance. Therefore, admittance was selected as the cost function in the process of optimization. Taguchi method was used to determine the optimal configuration of piezoceramic patch bonded on the host structure. Full three dimensional finite element models were analyzed to simulate vibration modes of smart panel and to obtain the admittances of the system. Numerical admittance was validated by experiment. After optimizing process using admittance, the optimal configuration of piezoceramic patch was obtained. It is observed that the performance of smart panel can be predicted by analyzing admittance of piezoelectric structure and admittance can be used as a design index of smart panel.

  13. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    Science.gov (United States)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design

  14. Circular neighbor-balanced designs universally optimal for total effects

    Institute of Scientific and Technical Information of China (English)

    Ming-yao AI; Gen-nian GE; Ling-yau CHAN

    2007-01-01

    In many experiments, the performance of a subject may be affected by some previous treatments applied to it apart from the current treatment. This motivates the studies of the residual effects of the treatments in a block design. This paper shows that a circular block design neighbor-balanced at distances up to γ ≤ k − 1, where k is the block size, is universally optimal for total effects under the linear models containing the neighbor effects at distances up to γ among the class of all circular binary block designs. Some combinatorial approaches to constructing these circular block designs neighbor-balanced at distances up to k − 1 are provided.

  16. Improved Quantum-Inspired Evolutionary Algorithm for Engineering Design Optimization

    Directory of Open Access Journals (Sweden)

    Jinn-Tsong Tsai

    2012-01-01

    Full Text Available An improved quantum-inspired evolutionary algorithm is proposed for solving mixed discrete-continuous nonlinear problems in engineering design. The proposed Latin square quantum-inspired evolutionary algorithm (LSQEA combines Latin squares and quantum-inspired genetic algorithm (QGA. The novel contribution of the proposed LSQEA is the use of a QGA to explore the optimal feasible region in macrospace and the use of a systematic reasoning mechanism of the Latin square to exploit the better solution in microspace. By combining the advantages of exploration and exploitation, the LSQEA provides higher computational efficiency and robustness compared to QGA and real-coded GA when solving global numerical optimization problems with continuous variables. Additionally, the proposed LSQEA approach effectively solves mixed discrete-continuous nonlinear design optimization problems in which the design variables are integers, discrete values, and continuous values. The computational experiments show that the proposed LSQEA approach obtains better results compared to existing methods reported in the literature.

  17. Integrated design by optimization of electrical energy systems

    CERN Document Server

    Roboam, Xavier

    2013-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular integrated optimal design with modeling and optimization methods and tools. It is made up of six chapters dedicated to integrated optimal design. First, the signal processing of mission profiles and system environment variables is discussed. Then, optimization-oriented analytical models, methods and tools (design frameworks) are proposed. A "multi-level optimization" smartly coupling several optimization processes is the subject of one chapter. Finally, a technico-economic optimization…

  18. Use of statistical design of experiments in the optimization of Ar-O2 low-pressure plasma treatment conditions of polydimethylsiloxane (PDMS) for increasing polarity and adhesion, and inhibiting hydrophobic recovery

    Science.gov (United States)

    Butrón-García, María Isabel; Jofre-Reche, José Antonio; Martín-Martínez, José Miguel

    2015-03-01

    Polydimethylsiloxane (PDMS) film was treated with RF low-pressure plasmas (LPPs) made of mixtures of oxygen and argon for increasing surface polarity, minimizing hydrophobic recovery (i.e. retarding ageing) and increasing adhesion to acrylic adhesive tape for medical use. Statistical design of experiments has been used for determining the most influential experimental parameters of the LPP treatment of PDMS. Water contact angle values (measured 24 h after treatment) and the O/C ratio obtained from XPS experiments were used as response variables. Working pressure was the most influential parameter in LPP treatment of PDMS, and the duration of the treatment, the power and the oxygen-argon mixture composition noticeably determined its effectiveness. The optimal surface properties in PDMS and inhibited hydrophobic recovery were achieved by treatment with 93 vol% oxygen + 7 vol% argon LPP at low working pressure (300 mTorr), low power (25 W) and long duration of treatment (120 s).

  19. Chaotic Inertia Weight Particle Swarm Optimization for PCR Primer Design

    Directory of Open Access Journals (Sweden)

    Cheng-Huei Yang

    2013-06-01

    Full Text Available In order to provide feasible primer sets for performing a polymerase chain reaction (PCR experiment, many primer design methods have been proposed. However, the majority of these methods require a long time to obtain an optimal solution since large quantities of template DNA need to be analyzed, and the designed primer sets usually do not provide a specific PCR product size. In recent years, particle swarm optimization (PSO has been applied to solve many problems and yielded good results. In this paper, a logistic map is proposed to determine the value of inertia weight of PSO (CIWPSO to design feasible primers. Accuracies for the primer design of the Homo sapiens RNA binding motif protein 11 (RBM11, mRNA (NM_144770, and the Homo sapiens G protein-coupled receptor 78 (GPR78, mRNA (NM_080819 were calculated. Five hundred runs of PSO and the CIWPSO primer design method were performed on different PCR product lengths and the different methods of calculating the melting temperature. A comparison of the accuracy results for PSO and CIWPSO primer design showed that CIWPSO is superior to the PSO for primer design. The proposed method could effectively find optimal or near-optimal primer sets.
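
    The sketch below illustrates the chaotic-inertia-weight idea on a generic minimization problem: a logistic map drives the inertia weight of an otherwise standard PSO. The sphere objective, the mapping of the chaotic variable into a typical inertia range, and all parameter values are assumptions for illustration; the paper's actual objective is primer quality for PCR.

```python
# Minimal sketch of a PSO whose inertia weight is driven by a logistic
# map, in the spirit of the CIWPSO idea above. The sphere function and
# the scaling of the chaotic variable into [0.4, 0.9] are assumptions.
import random

def sphere(x):
    return sum(v * v for v in x)

def chaotic_pso(dim=5, n_particles=20, iters=100, c1=2.0, c2=2.0):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sphere)[:]
    z = 0.7                                   # logistic-map state in (0, 1)
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)               # logistic map update
        w = 0.4 + 0.5 * z                     # chaotic inertia weight in a typical range (assumed scaling)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest, sphere(gbest)

print(chaotic_pso())
```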

  20. Network inference via adaptive optimal design

    NARCIS (Netherlands)

    Stigter, J.D.; Molenaar, J.

    2012-01-01

    Background Current research in network reverse engineering for genetic or metabolic networks very often does not include a proper experimental and/or input design. In this paper we address this issue in more detail and suggest a method that includes an iterative design of experiments based on the m…

  1. Aircraft family design using enhanced collaborative optimization

    Science.gov (United States)

    Roth, Brian Douglas

    Significant progress has been made toward the development of multidisciplinary design optimization (MDO) methods that are well-suited to practical large-scale design problems. However, opportunities exist for further progress. This thesis describes the development of enhanced collaborative optimization (ECO), a new decomposition-based MDO method. To support the development effort, the thesis offers a detailed comparison of two existing MDO methods: collaborative optimization (CO) and analytical target cascading (ATC). This aids in clarifying their function and capabilities, and it provides inspiration for the development of ECO. The ECO method offers several significant contributions. First, it enhances communication between disciplinary design teams while retaining the low-order coupling between them. Second, it provides disciplinary design teams with more authority over the design process. Third, it resolves several troubling computational inefficiencies that are associated with CO. As a result, ECO provides significant computational savings (relative to CO) for the test cases and practical design problems described in this thesis. New aircraft development projects seldom focus on a single set of mission requirements. Rather, a family of aircraft is designed, with each family member tailored to a different set of requirements. This thesis illustrates the application of decomposition-based MDO methods to aircraft family design. This represents a new application area, since MDO methods have traditionally been applied to multidisciplinary problems. ECO offers aircraft family design the same benefits that it affords to multidisciplinary design problems. Namely, it simplifies analysis integration, it provides a means to manage problem complexity, and it enables concurrent design of all family members. In support of aircraft family design, this thesis introduces a new wing structural model with sufficient fidelity to capture the tradeoffs associated with component

  2. Designing Effective Undergraduate Research Experiences

    Science.gov (United States)

    Severson, S.

    2010-12-01

    I present a model for designing student research internships that is informed by the best practices of the Center for Adaptive Optics (CfAO) Professional Development Program. The dual strands of the CfAO education program include: the preparation of early-career scientists and engineers in effective teaching; and changing the learning experiences of students (e.g., undergraduate interns) through inquiry-based "teaching laboratories." This paper will focus on the carry-over of these ideas into the design of laboratory research internships such as the CfAO Mainland internship program as well as NSF REU (Research Experiences for Undergraduates) and senior-thesis or "capstone" research programs. Key ideas in maximizing student learning outcomes and generating productive research during internships include: defining explicit content, scientific process, and attitudinal goals for the project; assessment of student prior knowledge and experience, then following up with formative assessment throughout the project; setting reasonable goals with timetables and addressing motivation; and giving students ownership of the research by implementing aspects of the inquiry process within the internship.

  3. Optimized and Automated design of Plasma Diagnostics for Additive Manufacture

    Science.gov (United States)

    Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon

    2016-10-01

    Despite having mature designs, diagnostics are usually custom designed for each experiment. Most of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics and outline the process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to provide a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, baffle and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.

  4. Development and design of experiments optimization of a high temperature proton exchange membrane fuel cell auxiliary power unit with onboard fuel processor

    Science.gov (United States)

    Karstedt, Jörg; Ogrzewalla, Jürgen; Severin, Christopher; Pischinger, Stefan

    In this work, the concept development, system layout, component simulation and the overall DOE system optimization of an HT-PEM fuel cell APU with a net electric power output of 4.5 kW and an onboard methane fuel processor are presented. A highly integrated system layout has been developed that enables fast startup within 7.5 min, a closed-system water balance and high fuel processor efficiencies of up to 85% due to the recuperation of the anode offgas burner heat. The integration of the system battery into the load management enhances the transient electric performance and the maximum electric power output of the APU system. Simulation models of the carbon monoxide influence on HT-PEM cell voltage, the concentration and temperature profiles within the autothermal reformer (ATR) and the CO conversion rates within the water-gas shift stages (WGSs) have been developed. They enable the optimization of the CO concentration in the anode gas of the fuel cell in order to achieve maximum system efficiencies and an optimized dimensioning of the ATR and WGS reactors. Furthermore, a DOE optimization of the global system parameters (cathode stoichiometry, anode stoichiometry, air/fuel ratio and steam/carbon ratio of the fuel processing system) has been performed in order to achieve maximum system efficiencies for all system operating points under given boundary conditions.

  5. Optimization of straight-sided spline design

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard

    2011-01-01

    and the subject of improving the design. The present paper concentrates on the optimization of splines and the predictions of stress concentrations, which are determined by finite element analysis (FEA). Using different design modifications that do not change the spline load-carrying capacity, it is shown......Spline connection of shaft and hub is commonly applied when large torque capacity is needed together with the possibility of disassembly. The designs of these splines are generally controlled by different standards. In view of the common use of splines, it seems that few papers deal with splines...... that large reductions in the maximum stress are possible. Fatigue life of a spline can be greatly improved with up to a 25% reduction in the maximum stress level. Design modifications are given as simple analytical functions (modified super elliptical shape) with only two active design parameters...

  6. Optimal Control Design with Limited Model Information

    CERN Document Server

    Farokhi, F; Johansson, K H

    2011-01-01

    We introduce the family of limited model information control design methods, which construct controllers by accessing the plant's model in a constrained way, according to a given design graph. We investigate the achievable closed-loop performance of discrete-time linear time-invariant plants under a separable quadratic cost performance measure with structured static state-feedback controllers. We find the optimal control design strategy (in terms of the competitive ratio and domination metrics) when the control designer has access to the local model information and the global interconnection structure of the plant-to-be-controlled. At last, we study the trade-off between the amount of model information exploited by a control design method and the best closed-loop performance (in terms of the competitive ratio) of controllers it can produce.

  7. Automation enhancements in multidisciplinary design optimization

    Science.gov (United States)

    Wujek, Brett Alan

    The process of designing complex systems has necessarily evolved into one which includes the contributions and interactions of multiple disciplines. To date, the Multidisciplinary Design Optimization (MDO) process has been addressed mainly from the standpoint of algorithm development, with the primary concerns being effective and efficient coordination of disciplinary activities, modification of conventional optimization methods, and the utility of approximation techniques toward this goal. The focus of this dissertation is on improving the efficiency of MDO algorithms through the automation of common procedures and the development of improved methods to carry out these procedures. In this research, automation enhancements are made to the MDO process in three different areas: execution, sensitivity analysis and utility, and design variable move-limit management. A framework is developed along with a graphical user interface called NDOPT to automate the setup and execution of MDO algorithms in a research environment. The technology of automatic differentiation (AD) is utilized within various modules of MDO algorithms for fast and accurate sensitivity calculation, allowing for the frequent use of updated sensitivity information. With the use of AD, efficiency improvements are observed in the convergence of system analyses and in certain optimization procedures since gradient-based methods, traditionally considered cost-prohibitive, can be employed at a more reasonable expense. Finally, a method is developed to automatically monitor and adjust design variable move-limits for the approximate optimization process commonly used in MDO algorithms. With its basis in the well-established and provably convergent trust-region approach, the Trust region Ratio Approximation method (TRAM) developed in this research accounts for approximation accuracy and the sensitivity of the model error to the design space in providing a flexible move-limit adjustment factor. Favorable results

  8. Design Methods and Optimization for Morphing Aircraft

    Science.gov (United States)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features that uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal based upon two competing aircraft performance objectives. The second area has been titled morphing as an independent variable and formulates the sizing of a morphing aircraft as an optimization problem in which the amount of geometric morphing for various aircraft parameters is included as design variables. This second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing to possibly enable transatlantic point-to-point passenger service.

  9. Speed Optimization in Liner Shipping Network Design

    DEFF Research Database (Denmark)

    Brouer, Berit Dangaard; Karsten, Christian Vad; Pisinger, David

    In the Liner Shipping Network Design Problem (LSNDP) services sail at a given speed throughout a round trip. In reality most services operate with a speed-differentiated head- and back-haul, or even individual speeds on every sailing between two ports. The speed of a service is decisive...... for the bunker consumption in the network as well as the transit time of cargo. Speed optimization has been considered for tramp shipping showing significant reductions in fuel consumption. However, variable speeds have not been considered for post-optimization of the LSNDP, where speed optimization could result...... in changes to the cargo flow due to transit time restrictions as well as significant savings in fuel consumption and required vessel deployment due to a weekly frequency requirement. We present a heuristic method to calculate variable speed on a service and present computational results for improving...

  10. Optimizing the integrated design of boilers - simulation

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus M. S.; Condra, Thomas Joseph

    2004-01-01

    .) it is important to see the 3 components as an integrated unit and optimize these as such. This means that the burner must be designed and optimized exactly to the pressure part where it is utilized, the control system must have a configuration optimal for the pressure part and burner where it is utilized etc...... together with Aalborg University and The Technical University of Denmark carried out a project to develop the Model based Multivariable Control System. This is foreseen to be a control system utilizing the continuously increasing computational possibilities to take all the important operation parameters...... formulated as Differential-Algebraic-Equation (DAE) systems. For integration in SIMULINK the models have been index-reduced to Ordinary-Differential-Equation (ODE) systems. The simulations have been carried out by means of the MATLAB/SIMULINK integration routines. For verifying the models developed...

  11. Design of satellite flexibility experiments

    Science.gov (United States)

    Kaplan, M. H.; Hillard, S. E.

    1977-01-01

    A preliminary study has been completed to begin development of a flight experiment to measure spacecraft control/flexible structure interaction. The work reported consists of two phases: identification of appropriate structural parameters which can be associated with flexibility phenomena, and suggestions for the development of an experiment for a satellite configuration typical of near-future vehicles which are sensitive to such effects. Recommendations are made with respect to the type of data to be collected and instrumentation associated with these data. The approach consists of developing the equations of motion for a vehicle possessing a flexible solar array, then linearizing about some nominal motion of the craft. A set of solutions are assumed for array deflection using a continuous normal mode method and important parameters are exposed. Inflight and ground based measurements are distinguished. Interrelationships between these parameters, measurement techniques, and input requirements are discussed which assure minimization of special vehicle maneuvers and optimization of data to be obtained during the normal flight sequence.

  12. Sequential ensemble-based optimal design for parameter estimation

    Energy Technology Data Exchange (ETDEWEB)

    Man, Jun [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
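
    One ingredient of such ensemble-based design can be sketched very simply: rank candidate measurement locations by the entropy of the ensemble's predicted observation (under a Gaussian approximation) and sample where the prediction is most uncertain. The toy forward model and candidate locations below are assumptions for illustration, not the unsaturated flow models of the study, and the full SEOD method couples this step with EnKF updating.

```python
# Sketch of ranking candidate measurement locations by the entropy of
# the ensemble's predicted observation (Gaussian approximation). The
# forward model and candidate locations are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
ensemble = rng.normal(loc=[1.0, 0.5], scale=[0.3, 0.1], size=(200, 2))   # parameter ensemble

def forward(params, x):
    # Hypothetical forward model: predicted observation at location x.
    a, b = params
    return a * np.exp(-b * x)

candidate_locations = np.linspace(0.0, 10.0, 21)

def predictive_entropy(x):
    preds = np.array([forward(p, x) for p in ensemble])
    var = preds.var(ddof=1)
    return 0.5 * np.log(2.0 * np.pi * np.e * var)     # entropy of a Gaussian with this variance

best_x = max(candidate_locations, key=predictive_entropy)
print("next measurement location:", best_x)
```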

  13. OPTIMIZATION STUDIES FOR THE ADVANCED PHOTOINJECTOR EXPERIMENT (APEX)

    Energy Technology Data Exchange (ETDEWEB)

    Lidia, S.M.

    2009-04-30

    The Advanced Photoinjector Experiment (APEX) seeks to validate the design of a proposed high-brightness, normal conducting RF photoinjector gun and bunching cavity feeding a superconducting RF linac to produce nC-scale electron bunches with sub-micron normalized emittances at MHz-scale repetition rates. The beamline design seeks to optimize the slice averaged 6D brightness of the beam prior to injection into a high gradient linac for further manipulation and delivery to an FEL undulator. Details of the proposed beamline layout and electron beam dynamics studies are presented.

  14. Parallel kinematics type, kinematics, and optimal design

    CERN Document Server

    Liu, Xin-Jun

    2014-01-01

    Parallel Kinematics- Type, Kinematics, and Optimal Design presents the results of 15 year's research on parallel mechanisms and parallel kinematics machines. This book covers the systematic classification of parallel mechanisms (PMs) as well as providing a large number of mechanical architectures of PMs available for use in practical applications. It focuses on the kinematic design of parallel robots. One successful application of parallel mechanisms in the field of machine tools, which is also called parallel kinematics machines, has been the emerging trend in advanced machine tools. The book describes not only the main aspects and important topics in parallel kinematics, but also references novel concepts and approaches, i.e. type synthesis based on evolution, performance evaluation and optimization based on screw theory, singularity model taking into account motion and force transmissibility, and others.   This book is intended for researchers, scientists, engineers and postgraduates or above with interes...

  15. Optimized design for an electrothermal microactuator

    Science.gov (United States)

    Cǎlimǎnescu, Ioan; Stan, Liviu-Constantin; Popa, Viorica

    2015-02-01

    In micromechanical structures, electrothermal actuators are known to be capable of providing larger force and reasonable tip deflection compared to electrostatic ones. Many studies have been devoted to the analysis of flexure actuators. One of the most popular electrothermal actuators is the `U-shaped' actuator. The device is composed of two suspended beams with variable cross sections joined at the free end, which constrains the tip to move in an arcing motion while current is passed through the actuator. The goal of this research is to determine via FEA the best-fitted geometry of the microactuator (the optimization input parameters) in order to drive some of the output parameters, such as thermal strain or total deformation, to their maximum values. The CAD geometry was generated in SolidWorks 2010 and all the FEA analysis was conducted with Ansys 13. The optimized model has smaller values of the geometric input parameters, that is, a more compact geometry; the maximum temperature reached a smaller value for the optimized model; the calculated heat flux is 13% larger for the optimized model, and likewise for Joule heat (26%), total deformation (1.2%) and thermal strain (8%). By simply optimizing the design, the microactuator became more compact and more efficient.

  16. Design of optimal cyclers using solar sails

    OpenAIRE

    2002-01-01

    Approved for public release; distribution is unlimited. Ongoing interest in establishing a base on Mars has spurred a need for regular and repeated visits to the red planet using a cycling shuttle to transport supplies and equipment and to retrieve surface samples. This thesis presents an approach to designing an optimal heliocentric cycling orbit, or cycler, using solar sails. Results show that solar sails can be used to significantly reduce ΔV at Mars and Earth. For example, using a rea...

  17. Database Design and Management in Engineering Optimization.

    Science.gov (United States)

    1988-02-01

    Steekanta Murthy, T., Shyy, Y.-K. and Arora, J. S., MIDAS: Management of Information for... The types of information to be retained and presented depend on the user of the system... Though the design of MIDAS is directly influenced by the current structural optimization applications, it possesses

  18. Design Optimization of Marine Reduction Gears.

    Science.gov (United States)

    1983-09-01

    ...unconstrained problems. Direct methods are popular constrained optimization algorithms. One well-known direct method is the method of... various popular tooth forms, and Appendix A contains a descriptive figure of gear tooth design variables. However, the following equations are a good

  19. Application of Optimal Sinter Burden Design

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The application of optimal sinter burden design in the sinter shop of the No. 1 Iron-making Plant of Tangshan Iron & Steel Corp. is reported. By using burden calculation and simulating production under different situations, it is demonstrated that the technology can provide decision-makers with relevant information on product quality, cost, etc. The technology has been used to guide production in the sinter shop since 2000, and remarkable results have been obtained.

  20. Optimal experimental design strategies for detecting hormesis

    OpenAIRE

    2010-01-01

    Hormesis is a widely observed phenomenon in many branches of life sciences ranging from toxicology studies to agronomy with obvious public health and risk assessment implications. We address optimal experimental design strategies for determining the presence of hormesis in a controlled environment using the recently proposed Hunt-Bowman model. We propose alternative models that have an implicit hormetic threshold, discuss their advantages over current models, construct and study properties of...

  1. Efficient global optimization of a limited parameter antenna design

    Science.gov (United States)

    O'Donnell, Teresa H.; Southall, Hugh L.; Kaanta, Bryan

    2008-04-01

    Efficient Global Optimization (EGO) is a competent evolutionary algorithm suited for problems with limited design parameters and expensive cost functions. Many electromagnetics problems, including some antenna designs, fall into this class, as complex electromagnetics simulations can take substantial computational effort. This makes simple evolutionary algorithms such as genetic algorithms or particle swarms very time-consuming for design optimization, as many iterations of large populations are usually required. When physical experiments are necessary to perform tradeoffs or determine effects which may not be simulated, use of these algorithms is simply not practical at all due to the large numbers of measurements required. In this paper we first present a brief introduction to the EGO algorithm. We then present the parasitic superdirective two-element array design problem and results obtained by applying EGO to obtain the optimal element separation and operating frequency to maximize the array directivity. We compare these results to both the optimal solution and results obtained by performing a similar optimization using the Nelder-Mead downhill simplex method. Our results indicate that, unlike the Nelder-Mead algorithm, the EGO algorithm did not become stuck in local minima but rather found the area of the correct global minimum. However, our implementation did not always drill down into the precise minimum and the addition of a local search technique seems to be indicated.
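
    At the core of EGO is an acquisition rule such as expected improvement, evaluated on a surrogate's predictive mean and standard deviation. The sketch below shows that calculation for a few made-up candidate designs; in EGO proper the means and standard deviations come from a kriging model fitted to the expensive simulations or measurements.

```python
# Minimal sketch of the expected-improvement acquisition used in EGO:
# given a surrogate's predictive mean and standard deviation at each
# candidate design, pick the candidate with the largest expected
# improvement over the best cost observed so far. Candidate values are
# made-up placeholders, not outputs of a fitted kriging model.
import math

def expected_improvement(mu, sigma, best_so_far):
    if sigma <= 0.0:
        return 0.0
    z = (best_so_far - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal PDF
    return (best_so_far - mu) * cdf + sigma * pdf

best_so_far = 1.0                               # best (lowest) cost evaluated so far
candidates = {                                  # design -> (predicted mean, predicted std)
    "A": (1.2, 0.05),
    "B": (1.1, 0.40),
    "C": (0.9, 0.10),
}
scores = {k: expected_improvement(m, s, best_so_far) for k, (m, s) in candidates.items()}
print(max(scores, key=scores.get), scores)
```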

  2. Robust Structured Control Design via LMI Optimization

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher; Stoustrup, Jakob

    2011-01-01

    This paper presents a new procedure for discrete-time robust structured control design. Parameter-dependent nonconvex conditions for stabilizable and induced L2-norm performance controllers are solved by an iterative linear matrix inequality (LMI) optimization. A wide class of controller...... structures including decentralized controllers of any order, fixed-order dynamic output feedback, and static output feedback can be designed robust to polytopic uncertainties. Stability is proven by a parameter-dependent Lyapunov function. Numerical examples on robust stability margins show that the proposed procedure can...

  3. General purpose optimization software for engineering design

    Science.gov (United States)

    Vanderplaats, G. N.

    1990-01-01

    The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial-grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  4. RESEARCH ON THE PRECISE OPTIMIZATION DESIGN OF APP INTERFACE FOR THE USER'S MICRO EXPERIENCE

    Institute of Scientific and Technical Information of China (English)

    于炜; 陶悦

    2016-01-01

    This study suggests that before thinking about how to optimize the user experience, one should begin with qualitative research: first consider, from the users' perspective and psychology, what conditions the product must satisfy. An excellent APP not only meets users' demands in basic and practical functions, but also requires precise design of the fine details of the interface experience, impressing users and strengthening the positive feedback of interactive actions and the subtle feel of direct operation. Based on user experience theory, this paper approaches APP interface design systematically across multiple dimensions and details, and raises higher experience requirements for visual, interaction, text and psychological design details that are often ignored. Attention is paid to the ultimate experience brought by precise optimization of interface details; in particular, to improve the APP user's micro experience, the micro-experience design of the APP interface must take the user's comfortable experience and satisfactory use as its basic demand.

  5. Temperature Rise Optimization Design and Thermal Vacuum Experiment on Coil of Space Manipulator Electromagnetic Brake

    Institute of Scientific and Technical Information of China (English)

    孙敬颋; 史士财; 陈泓; 刘宏

    2012-01-01

    Temperature rise optimization design of the space manipulator electromagnetic brake coil is carried out using a new method that combines a genetic algorithm with a penalty function. First, an optimization model is derived with temperature rise as the objective, taking into account the restrictions of the space manipulator electromagnetic brake on electromagnetic force, current and magnetic field strength. Then, to handle the nonlinear constraints of the optimization model, a method combining a penalty function with a genetic algorithm is proposed. The method is suitable for global optimization and guarantees that the solution remains feasible throughout the calculation. The optimization result shows that the coil temperature rise is greatly reduced. Finally, the brake is placed in thermal vacuum environment simulation equipment and the coil temperature rise curve is measured. The temperature rise measured in the experiment closely matches the objective value obtained by the optimization, verifying the correctness of the method and the design.
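
    A minimal sketch of the genetic-algorithm-plus-penalty idea follows: constraint violations are added to the objective so that infeasible candidates can still be ranked, and a simple GA then minimizes the penalized fitness. The toy objective (standing in for coil temperature rise) and the single constraint are illustrative assumptions, not the brake-coil model of the paper.

```python
# Sketch of a genetic algorithm with a penalty function: constraint
# violations are added to the objective so every candidate can be
# ranked even when infeasible. Objective and constraint are toy
# stand-ins for the temperature-rise model and its restrictions.
import random

def objective(x):                       # stand-in for temperature rise T(x)
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def constraint_violation(x):            # stand-in for e.g. a minimum-force requirement
    return max(0.0, 3.0 - (x[0] + x[1]))     # requires x0 + x1 >= 3

def penalized_fitness(x, penalty=100.0):
    return objective(x) + penalty * constraint_violation(x)

def genetic_search(pop_size=40, generations=60, bounds=(-5.0, 5.0)):
    pop = [[random.uniform(*bounds) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=penalized_fitness)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(ai + bi) / 2.0 + random.gauss(0.0, 0.1) for ai, bi in zip(a, b)]
            children.append(child)               # arithmetic crossover + Gaussian mutation
        pop = parents + children
    return min(pop, key=penalized_fitness)

best = genetic_search()
print(best, objective(best), constraint_violation(best))
```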

  6. APPROACH ON INTELLIGENT OPTIMIZATION DESIGN BASED ON COMPOUND KNOWLEDGE

    Institute of Scientific and Technical Information of China (English)

    Yao Jianchu; Zhou Ji; Yu Jun

    2003-01-01

    A concept for an intelligent optimal design approach is proposed, organized around a compound knowledge model. The compound knowledge consists of modularized quantitative knowledge, inclusive experience knowledge and case-based sample knowledge. By using this compound knowledge model, the rich quantitative information of mathematical programming and the symbolic knowledge of artificial intelligence can be united in one model. The intelligent optimal design model based on such compound knowledge, and the automatically generated decomposition principles based on it, are also presented. In practice, it has been applied to production planning, process scheduling and optimization of the production process of a refining and chemical works, and a substantial profit has been achieved. In particular, the methods and principles are applicable not only to the continuous process industry but also to discrete manufacturing.

  7. Pareto Optimal Design for Synthetic Biology.

    Science.gov (United States)

    Patanè, Andrea; Santoro, Andrea; Costanza, Jole; Carapezza, Giovanni; Nicosia, Giuseppe

    2015-08-01

    Recent advances in synthetic biology call for robust, flexible and efficient in silico optimization methodologies. We present a Pareto design approach for the bi-level optimization problem associated with the overproduction of specific metabolites in Escherichia coli. Our method efficiently explores the high-dimensional genetic manipulation space, finding a number of trade-offs between synthetic and biological objectives, hence furnishing a deeper biological insight into the addressed problem and important results for industrial purposes. We demonstrate the computational capabilities of our Pareto-oriented approach by comparing it with state-of-the-art heuristics in the overproduction problems of i) 1,4-butanediol, ii) myristoyl-CoA, iii) malonyl-CoA, iv) acetate and v) succinate. We show that our algorithms are able to gracefully adapt and scale to more complex models and more biologically relevant simulations of the genetic manipulations allowed. The results obtained for 1,4-butanediol overproduction significantly outperform results previously obtained, in terms of 1,4-butanediol to biomass formation ratio and knock-out costs. In particular, the overproduction improvement is +662.7%, from 1.425 mmolh⁻¹gDW⁻¹ (wild type) to 10.869 mmolh⁻¹gDW⁻¹, with a knockout cost of 6. Moreover, the Pareto-optimal designs we found in the fatty acid optimizations strictly dominate those obtained by the other methodologies, e.g., biomass and myristoyl-CoA exportation improvements of +21.43% (0.17 h⁻¹) and +5.19% (1.62 mmolh⁻¹gDW⁻¹), respectively. Furthermore, the CPU time required by our heuristic approach is more than halved. Finally, we implement pathway-oriented sensitivity analysis, epsilon-dominance analysis and robustness analysis to enhance our biological understanding of the problem and to improve the optimization algorithm capabilities.
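
    The Pareto reasoning behind such designs can be sketched with a plain dominance filter: a design is kept only if no other design is at least as good in every objective and strictly better in one. The candidate tuples below (biomass, product flux, knockout cost) are made-up numbers for illustration, not results from the cited optimizations.

```python
# Minimal sketch of Pareto-dominance filtering for multi-objective
# strain design: maximize biomass and product flux, minimize knockout
# cost, and keep only the non-dominated candidates. The numbers are
# illustrative placeholders.
def dominates(a, b):
    # a, b = (biomass, product_flux, knockout_cost); higher, higher, lower is better.
    not_worse = a[0] >= b[0] and a[1] >= b[1] and a[2] <= b[2]
    strictly_better = a[0] > b[0] or a[1] > b[1] or a[2] < b[2]
    return not_worse and strictly_better

candidates = {
    "design1": (0.17, 10.9, 6),
    "design2": (0.15, 11.2, 9),
    "design3": (0.16, 9.8, 6),     # dominated by design1
    "design4": (0.20, 8.5, 4),
}
pareto = [name for name, a in candidates.items()
          if not any(dominates(b, a) for b in candidates.values() if b != a)]
print("Pareto-optimal designs:", pareto)
```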

  8. Precision Optics Optimization for GMp Experiment

    Science.gov (United States)

    Wang, Yang; Allada, Kalyan; Averett, Todd; Christy, Eric; Gu, Chao; Huang, Min; Wojtsekhowski, Bogdan; GMp Collaboration

    2015-04-01

    The GMp experiment aims to improve the precision of the elastic e-p cross section measurement to 2%, up to a factor of 5 better than previous measurements, at four-momentum transfer up to 14 GeV², using the High Resolution Spectrometers (HRS) of Hall A at Jefferson Lab. These measurements will be an important benchmark for many other cross section measurements in hadron physics. To reach this goal, it is necessary to improve the precision of many instrument systems. Knowledge of the magnetic optics of the HRS is critically important for precise reconstruction of the momentum and coordinates of the scattered particles at the interaction vertex. In this talk, an improved optics optimization method will be presented in detail and the results of a study based on recent commissioning data from 2014 will be discussed.

  9. Social Design Experiments: Toward Equity by Design

    Science.gov (United States)

    Gutiérrez, Kris D.; Jurow, A. Susan

    2016-01-01

    In this article, we advance an approach to design research that is organized around a commitment to transforming the educational and social circumstances of members of non-dominant communities as a means of promoting social equity and learning. We refer to this approach as social design experimentation. The goals of social design experiments…

  10. Selecting the best design for nonstandard toxicology experiments.

    Science.gov (United States)

    Webb, Jennifer M; Smucker, Byran J; Bailer, A John

    2014-10-01

    Although many experiments in environmental toxicology use standard statistical experimental designs, there are situations that arise where no such standard design is natural or applicable because of logistical constraints. For example, the layout of a laboratory may suggest that each shelf serve as a block, with the number of experimental units per shelf either greater than or less than the number of treatments in a way that precludes the use of a typical block design. In such cases, an effective and powerful alternative is to employ optimal experimental design principles, a strategy that produces designs with precise statistical estimates. Here, a D-optimal design was generated for an experiment in environmental toxicology that has 2 factors, 16 treatments, and constraints similar to those described above. After initial consideration of a randomized complete block design and an intuitive cyclic design, it was decided to compare a D-optimal design and a slightly more complicated version of the cyclic design. Simulations were conducted generating random responses under a variety of scenarios that reflect conditions motivated by a similar toxicology study, and the designs were evaluated via D-efficiency as well as by a power analysis. The cyclic design performed well compared to the D-optimal design.
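    The study's own 2-factor, 16-treatment layout is not reproduced here; the sketch below only illustrates, under simplified assumptions (two continuous factors, an interaction model, a small candidate grid), how a D-optimal design can be searched for by point exchange, i.e. by maximizing log|X'X| of the model matrix.

```python
import itertools
import numpy as np

def model_matrix(points):
    # Intercept + main effects + two-factor interaction (illustrative model).
    x1, x2 = points[:, 0], points[:, 1]
    return np.column_stack([np.ones(len(points)), x1, x2, x1 * x2])

def log_det_info(points):
    X = model_matrix(points)
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet if sign > 0 else -np.inf

# Candidate runs: all combinations of two factors on a 4-level grid.
levels = np.linspace(-1, 1, 4)
candidates = np.array(list(itertools.product(levels, levels)))

def d_optimal(n_runs=8, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    design = candidates[rng.choice(len(candidates), size=n_runs, replace=True)]
    for _ in range(iters):                        # naive random point-exchange search
        i = rng.integers(n_runs)
        trial = design.copy()
        trial[i] = candidates[rng.integers(len(candidates))]
        if log_det_info(trial) > log_det_info(design):
            design = trial
    return design

design = d_optimal()
print("selected runs:\n", design)
print("log|X'X| of found design:", round(log_det_info(design), 3))
```

    Comparing two competing designs by D-efficiency, as the study does, amounts to comparing (|X₁'X₁| / |X₂'X₂|)^(1/p) for the p model parameters.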

  11. Machine Learning Techniques in Optimal Design

    Science.gov (United States)

    Cerbone, Giuseppe

    1992-01-01

    Many important applications can be formalized as constrained optimization tasks. For example, we are studying the engineering domain of two-dimensional (2-D) structural design. In this task, the goal is to design a structure of minimum weight that bears a set of loads. A solution to a design problem with a single load (L) and two stationary support points (S1 and S2), consisting of four members, E1, E2, E3, and E4, that connect the load to the support points, is discussed. In principle, optimal solutions to problems of this kind can be found by numerical optimization techniques. However, in practice [Vanderplaats, 1984] these methods are slow and they can produce different local solutions whose quality (ratio to the global optimum) varies with the choice of starting points. Hence, their applicability to real-world problems is severely restricted. To overcome these limitations, we propose to augment numerical optimization by first performing a symbolic compilation stage to produce: (a) objective functions that are faster to evaluate and that depend less on the choice of the starting point and (b) selection rules that associate problem instances with a set of recommended solutions. These goals are accomplished by successive specializations of the problem class and of the associated objective functions. In the end, this process reduces the problem to a collection of independent functions that are fast to evaluate, that can be differentiated symbolically, and that represent smaller regions of the overall search space. However, the specialization process can produce a large number of sub-problems. This is overcome by inductively deriving selection rules which associate problems with small sets of specialized independent sub-problems. Each set of candidate solutions is chosen to minimize a cost function which expresses the tradeoff between the quality of the solution that can be obtained from the sub-problem and the time it takes to produce it. The overall solution

  12. Installation and first operation of the negative ion optimization experiment

    Energy Technology Data Exchange (ETDEWEB)

    De Muri, Michela, E-mail: michela.demuri@igi.cnr.it [INFN-LNL, v.le dell’Università 2, I-35020 Legnaro, PD (Italy); Consorzio RFX, CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA – Corso Stati Uniti 4, 35127 Padova (Italy); Cavenago, Marco [INFN-LNL, v.le dell’Università 2, I-35020 Legnaro, PD (Italy); Serianni, Gianluigi; Veltri, Pierluigi; Bigi, Marco; Pasqualotto, Roberto; Barbisan, Marco; Recchia, Mauro; Zaniol, Barbara [Consorzio RFX, CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA – Corso Stati Uniti 4, 35127 Padova (Italy); Kulevoy, Timour; Petrenko, Sergey [ITEP, B. Cheremushkinskaya 25, 117218 Moscow (Russian Federation); Baseggio, Lucio; Cervaro, Vannino; Agostini, Fabio Degli; Franchin, Luca; Laterza, Bruno [Consorzio RFX, CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA – Corso Stati Uniti 4, 35127 Padova (Italy); Minarello, Alessandro [INFN-LNL, v.le dell’Università 2, I-35020 Legnaro, PD (Italy); Rossetto, Federico [Consorzio RFX, CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA – Corso Stati Uniti 4, 35127 Padova (Italy); Sattin, Manuele [INFN-LNL, v.le dell’Università 2, I-35020 Legnaro, PD (Italy); Zucchetti, Simone [Consorzio RFX, CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA – Corso Stati Uniti 4, 35127 Padova (Italy)]

    2015-10-15

    Highlights: • Negative ion sources are key components of the neutral beam injectors. • The NIO1 experiment is a RF ion source, 60 kV–135 mA hydrogen negative ion beam. • NIO1 can contribute to beam extraction and optics thanks to quick replacement and upgrading of parts. • This work presents installation, status and first experiments results of NIO1. - Abstract: Negative ion sources are key components of the neutral beam injectors for thermonuclear fusion experiments. The NIO1 experiment is a radio frequency ion source generating a 60 kV–135 mA hydrogen negative ion beam. The beam is composed of nine beamlets over an area of about 40 × 40 mm². This experiment is jointly developed by Consorzio RFX and INFN-LNL, with the purpose of providing and optimizing a test ion source, capable of working in continuous mode and in conditions similar to those foreseen for the larger ion sources of the ITER neutral beam injectors. At present research and development activities on these ion sources still address several important issues related to beam extraction and optics optimization, to which the NIO1 test facility can contribute thanks to its modular design, which allows for quick replacement and upgrading of components. This contribution presents the installation phases, the status of the test facility and the results of the first experiments, which have demonstrated that the source can operate in continuous mode.

  13. Design and optimization of tidal turbine airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Grasso, F. [ECN Wind Energy, Petten (Netherlands)

    2012-03-15

    To increase the ratio of energy capture to the loading and, thereby, to reduce cost of energy, the use of specially tailored airfoils is needed. This work is focused on the design of an airfoil for marine application. Firstly, the requirements for this class of airfoils are illustrated and discussed with reference to the requirements for wind turbine airfoils. Then, the design approach is presented. This is a numerical optimization scheme in which a gradient-based algorithm is used, coupled with the RFOIL solver and a composite Bezier geometrical parameterization. A particularly sensitive point is the choice and implementation of constraints. A section of the present work is dedicated to address this point; particular importance is given to the cavitation phenomenon. Finally, a numerical example regarding the design of a high-efficiency hydrofoil is illustrated, and the results are compared with existing turbine airfoils, considering also the effect on turbine performance due to different airfoils.

  14. Strength optimized designs of thermoelastic structures

    DEFF Research Database (Denmark)

    Pedersen, Pauli; Pedersen, Niels Leergaard

    2010-01-01

    For thermoelastic structures the same optimal design does not simultaneously lead to minimum compliance and maximum strength. Compliance may be a questionable objective and focus for the present paper is on the important aspect of strength, quantified as minimization of the maximum von Mises stress...... to mathematical programming, which with a large number of both design variables and strength constraints, is found non-practical, we choose simple recursive iterations to obtain uniform energy density and find by examples that the obtained designs are close to fulfilling also strength maximization. In compliance...... minimization it may be advantageous to decrease the total volume, but for strength maximization it is argued that it is advantageous to keep the total permissible volume. With the thermoelastic analysis presented directly in a finite element formulation, simple explicit formulas for equivalent thermoelastic...

  15. Designing conjoint choice experiments using managers' prior beliefs

    NARCIS (Netherlands)

    Sandor, Z; Wedel, M

    2001-01-01

    The authors provide more efficient designs for conjoint choice experiments based on prior information elicited from managers about the parameters and their associated uncertainty. The authors use a Bayesian design procedure that assumes a prior distribution of likely parameter values and optimizes

  16. Multidisciplinary design optimization for sonic boom mitigation

    Science.gov (United States)

    Ozcer, Isik A.

    product design. The simulation tools are used to optimize three geometries for sonic boom mitigation. The first is a simple axisymmetric shape to be used as a generic nose component, the second is a delta wing with lift, and the third is a real aircraft with nose and wing optimization. The objectives are to minimize the pressure impulse or the peak pressure in the sonic boom signal, while keeping the drag penalty under feasible limits. The design parameters for the meridian profile of the nose shape are the lengths and the half-cone angles of the linear segments that make up the profile. The design parameters for the lifting wing are the dihedral angle, angle of attack, non-linear span-wise twist and camber distribution. The test-bed aircraft is the modified F-5E aircraft built by Northrop Grumman, designated the Shaped Sonic Boom Demonstrator. This aircraft is fitted with an optimized axisymmetric nose, and the wings are optimized to demonstrate optimization for sonic boom mitigation for a real aircraft. The final results predict a 42% reduction in bow shock strength, 17% reduction in peak Δp, 22% reduction in pressure impulse, 10% reduction in footprint size, 24% reduction in inviscid drag, and no loss in lift for the optimized aircraft. Optimization is carried out using response surface methodology, and the design matrices are determined using standard DoE techniques for quadratic response modeling.

  17. Bionic optimization research of soil cultivating component design

    Institute of Scientific and Technical Information of China (English)

    GUO ZhiJun; ZHOU ZhiLi; ZHANG Yi; LI ZhongLi

    2009-01-01

    The basic biomechanical laws that apply to the clawed toes of animals with powerful digging abilities, and the optimal bionic design of curved soil cultivating components with an analogous contour, were researched in a novel way. First, the curvature and profile of the inside contour line of a field mouse's clawed toe were analyzed. The finite element method (FEM) was then used to simulate the working process in order to study the changing characteristics of the working resistance of bionic soil-engaging surfaces and the stress field of the processed soil. A straight-line cultivating component was used for comparative analysis. In accordance with the simulation results, a series of soil cultivating components of varying design were manufactured. An indoor soil bin experiment was carried out to measure their working resistance and validate the results of the FEM analysis. The results of this research have important value for the design optimization of cultivating components for energy and cost savings.

  18. Optimal tariff design under consumer self-selection

    Energy Technology Data Exchange (ETDEWEB)

    Raesaenen, M.; Ruusunen, J.; Haemaelaeinen, R.

    1995-12-31

    This report considers the design of electricity tariffs which guide an individual consumer to select the tariff designed for his consumption pattern. In the model the utility maximizes the weighted sum of individual consumers' benefits of electricity consumption subject to the utility's revenue requirement constraints. The consumers' free choice of tariffs is ensured with the so-called self-selection constraints. The relationship between the consumers' optimal choice of tariffs and the weights in the aggregated consumers' benefit function is analyzed. If such weights exist, they will guarantee both the consumers' optimal choice of tariffs and the efficient consumption patterns. Also the welfare effects are analyzed by using demand parameters estimated from a Finnish dynamic pricing experiment. The results indicate that it is possible to design an efficient tariff menu with the welfare losses caused by the self-selection constraints being small compared with the costs created when some consumers choose tariffs other than those assigned to them. (author)

  19. Optimized Design of LED Daylight Lamp Lighting System

    Institute of Scientific and Technical Information of China (English)

    TIAN Da-lei; GUAN Rong-feng; WANG Xing

    2008-01-01

    In order to meet the requirements of indoor illumination, an LED daylight lamp model was designed that can replace a traditional fluorescent lamp without requiring additional power supply equipment. The optical properties of the model were simulated using optical analysis software. Its luminous efficiency is about 30.4 lm/W, and the illuminance is about 38 lux at a distance of 1.5 m between the center of the model and the measured spot. Based on the theoretically optimized design, laboratory experiments following the simulation results were conducted to verify the performance of the proposed LED model; it reaches a power factor of about 0.8 at 6 W. The simulation results agree closely with the measured values, confirming that simulation is an effective tool for LED lighting optical design.

  20. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2015-01-01

    in topological optimization: Interactive control and continuous visualization; embedding flexible voids within the design space; consideration of distinct tension / compression properties; and optimization of dual material systems. In extension, optimization procedures for skeletal structures such as trusses...

  1. Numerical design optimization of compressor blade based on ADOP

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    An aerodynamic design optimization platform (ADOP) has been developed. The numerical optimization method is based on genetic algorithm (GA), Pareto ranking and fitness sharing technique. The platform was used for design optimization of the stator of an advanced transonic stage to seek high adiabatic efficiency. The compressor stage efficiency is increased by 0.502% at optimal point and the stall margin is enlarged by nearly 1.0% at design rotating speed. The flow fields of the transonic stage were simulated with FINE/Turbo software package. The optimization result indicates that the optimization platform is effective in 3D numerical design optimization problems.

  2. Investigation of Navier-Stokes Code Verification and Design Optimization

    Science.gov (United States)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization

  3. Design optimization of a linear actuator

    DEFF Research Database (Denmark)

    Rechenbach, B.; Willatzen, Morten; Preisler, K. Lorenzen

    2013-01-01

    The mechanical contacting of a dielectric elastomer actuator is investigated. The actuator is constructed by coiling the dielectric elastomer around two parallel metal rods, similar to a rubber band stretched by two index fingers. The goal of this paper is to design the geometry and the mechanical...... properties of a polymeric interlayer between the elastomer and the rods, gluing all materials together, so as to optimize the mechanical durability of the system. Finite element analysis is employed for the theoretical study which is linked up to experimental results performed by Danfoss PolyPower A/S....

  4. Multidisciplinary design optimization in computational mechanics

    CERN Document Server

    Breitkopf, Piotr

    2013-01-01

    This book provides a comprehensive introduction to the mathematical and algorithmic methods for the Multidisciplinary Design Optimization (MDO) of complex mechanical systems such as aircraft or car engines. We have focused on the presentation of strategies efficiently and economically managing the different levels of complexity in coupled disciplines (e.g. structure, fluid, thermal, acoustics, etc.), ranging from Reduced Order Models (ROM) to full-scale Finite Element (FE) or Finite Volume (FV) simulations. Particular focus is given to the uncertainty quantification and its impact on the robus

  5. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    Science.gov (United States)

    Deivanayagam, Arumugam

    This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user defined constitutive model (UMAT) for fabric based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models. Then the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, in a deterministic optimization problem, even though the structures are cost effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered. This part of the research starts with an introduction to reliability analysis such as first order reliability analysis, second order reliability analysis followed by simulation technique that

  6. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-08

    In this study, we optimize the experimental setup computationally by optimal experimental design (OED) in a Bayesian framework. We approximate the posterior probability density functions (pdf) using truncated Gaussian distributions in order to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method, more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate, and the covariance is chosen as the negative inverse of the Hessian of the misfit function at the MAP estimate. The model related entities are obtained from a polynomial surrogate. The optimality, quantified by the information gain measures, can be estimated efficiently by a rejection sampling algorithm against the underlying Gaussian probability distribution, rather than against the true posterior. This approach offers a significant error reduction when the magnitude of the invariants of the posterior covariance are comparable to the size of the bounded domain of the prior. We demonstrate the accuracy and superior computational efficiency of our method for shock-tube experiments aiming to measure the model parameters of a key reaction which is part of the complex kinetic network describing the hydrocarbon oxidation. In the experiments, the initial temperature and fuel concentration are optimized with respect to the expected information gain in the estimation of the parameters of the target reaction rate. We show that the expected information gain surface can change its shape dramatically according to the level of noise introduced into the synthetic data. The information that can be extracted from the data saturates as a logarithmic function of the number of experiments, and few experiments are needed when they are conducted at the optimal experimental design conditions.
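    A full implementation with truncated Gaussians and polynomial surrogates is beyond a short sketch; the Python snippet below only illustrates the underlying Laplace idea for a single parameter with a uniform prior: approximate the posterior by a Gaussian at the MAP point (curvature taken from the misfit) and take the information gain as roughly the prior entropy minus the Gaussian entropy. The forward model, noise level and bounds are hypothetical, and the truncation correction that is the paper's focus is ignored.

```python
import numpy as np

lo, hi = 0.5, 2.0          # bounds of the uniform prior on the parameter theta
sigma = 0.05               # measurement noise standard deviation

def g(theta, d):           # hypothetical forward model, d is the design variable
    return np.exp(-theta * d)

def misfit(theta, d, y_obs):
    return 0.5 * ((y_obs - g(theta, d)) / sigma) ** 2

def laplace_information_gain(d, theta_true=1.2, n_grid=2001):
    y_obs = g(theta_true, d)                         # synthetic noiseless datum
    grid = np.linspace(lo, hi, n_grid)
    J = misfit(grid, d, y_obs)
    i = np.argmin(J)                                 # MAP estimate on the grid
    h = grid[1] - grid[0]
    hess = (J[i - 1] - 2 * J[i] + J[i + 1]) / h**2   # curvature of the misfit at the MAP
    var = 1.0 / max(hess, 1e-12)                     # Laplace (Gaussian) posterior variance
    gaussian_entropy = 0.5 * np.log(2 * np.pi * np.e * var)
    prior_entropy = np.log(hi - lo)
    return max(0.0, prior_entropy - gaussian_entropy)  # clip when the data are uninformative

for d in (0.5, 1.0, 5.0):
    print(f"design d={d}: approximate information gain = {laplace_information_gain(d):.2f}")
```

    With this toy exponential model the intermediate design (d = 1.0) is the most informative of the three, which mirrors the record's point that the expected-information-gain surface depends strongly on the experimental conditions.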

  7. Handling Qualities Optimization for Rotorcraft Conceptual Design

    Science.gov (United States)

    Lawrence, Ben; Theodore, Colin R.; Berger, Tom

    2016-01-01

    Over the past decade, NASA, under a succession of rotary-wing programs has been moving towards coupling multiple discipline analyses in a rigorous consistent manner to evaluate rotorcraft conceptual designs. Handling qualities is one of the component analyses to be included in a future NASA Multidisciplinary Analysis and Optimization framework for conceptual design of VTOL aircraft. Similarly, the future vision for the capability of the Concept Design and Assessment Technology Area (CD&A-TA) of the U.S Army Aviation Development Directorate also includes a handling qualities component. SIMPLI-FLYD is a tool jointly developed by NASA and the U.S. Army to perform modeling and analysis for the assessment of flight dynamics and control aspects of the handling qualities of rotorcraft conceptual designs. An exploration of handling qualities analysis has been carried out using SIMPLI-FLYD in illustrative scenarios of a tiltrotor in forward flight and single-main rotor helicopter at hover. Using SIMPLI-FLYD and the conceptual design tool NDARC integrated into a single process, the effects of variations of design parameters such as tail or rotor size were evaluated in the form of margins to fixed- and rotary-wing handling qualities metrics as well as the vehicle empty weight. The handling qualities design margins are shown to vary across the flight envelope due to both changing flight dynamic and control characteristics and changing handling qualities specification requirements. The current SIMPLI-FLYD capability and future developments are discussed in the context of an overall rotorcraft conceptual design process.

  8. Design and analysis of experiments with SAS

    CERN Document Server

    Lawson, John

    2010-01-01

    Introduction; Statistics and Data Collection; Beginnings of Statistically Planned Experiments; Definitions and Preliminaries; Purposes of Experimental Design; Types of Experimental Designs; Planning Experiments; Performing the Experiments; Use of SAS Software; Completely Randomized Designs with One Factor; Introduction; Replication and Randomization; A Historical Example; Linear Model for Completely Randomized Design (CRD); Verifying Assumptions of the Linear Model; Analysis Strategies When Assumptions Are Violated; Determining the Number of Replicates; Comparison of Treatments after the F-Test; Factorial Designs

  9. CFD-Based Design Optimization for Single Element Rocket Injector

    Science.gov (United States)

    Vaidyanathan, Rajkumar; Tucker, Kevin; Papila, Nilay; Shyy, Wei

    2003-01-01

    To develop future Reusable Launch Vehicle concepts, we have conducted design optimization for a single element rocket injector, with overall goals of improving reliability and performance while reducing cost. Computational solutions based on the Navier-Stokes equations, finite rate chemistry, and the k-ε turbulence closure are generated with design of experiment techniques, and the response surface method is employed as the optimization tool. The design considerations are guided by four design objectives motivated by the consideration in both performance and life, namely, the maximum temperature on the oxidizer post tip, the maximum temperature on the injector face, the adiabatic wall temperature, and the length of the combustion zone. Four design variables are selected, namely, H2 flow angle, H2 and O2 flow areas with fixed flow rates, and O2 post tip thickness. In addition to establishing optimum designs by varying emphasis on the individual objectives, better insight into the interplay between design variables and their impact on the design objectives is gained. The investigation indicates that improvement in performance or life comes at the cost of the other. Best compromise is obtained when improvements in both performance and life are given equal importance.
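    The response surface method named in this record is, at its core, a polynomial surrogate fitted to a set of expensive samples and then searched in place of the solver. The sketch below fits a quadratic surface to synthetic data standing in for CFD results (two design variables, made-up coefficients and noise) and locates its minimum on a grid; it is not the study's four-variable injector model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for CFD samples: (x1, x2) design points -> objective value.
X = rng.uniform(-1, 1, size=(20, 2))
y = (3.0 + 0.8 * X[:, 0] - 1.1 * X[:, 1] + 2.0 * X[:, 0] ** 2
     + 1.5 * X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 1]
     + rng.normal(0, 0.05, 20))

def quadratic_basis(P):
    x1, x2 = P[:, 0], P[:, 1]
    return np.column_stack([np.ones(len(P)), x1, x2, x1**2, x2**2, x1 * x2])

# Least-squares fit of the quadratic response surface.
beta, *_ = np.linalg.lstsq(quadratic_basis(X), y, rcond=None)

# Search the cheap surrogate instead of the expensive solver (a grid is fine in 2-D).
g = np.linspace(-1, 1, 201)
G1, G2 = np.meshgrid(g, g)
P = np.column_stack([G1.ravel(), G2.ravel()])
pred = quadratic_basis(P) @ beta
best = P[np.argmin(pred)]
print(f"surrogate optimum near ({best[0]:.2f}, {best[1]:.2f}), predicted value {pred.min():.2f}")
```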

  10. Modal identification experiment design for large space structures

    Science.gov (United States)

    Kim, Hyoung M.; Doiron, Harold H.

    1991-01-01

    This paper describes an on-orbit modal identification experiment design for large space structures. Space Station Freedom (SSF) systems design definition and structural dynamic models were used as representative large space structures for optimizing experiment design. Important structural modes of study models were selected to provide a guide for experiment design and used to assess the design performance. A pulsed random excitation technique using propulsion jets was developed to identify closely-spaced modes. A measurement location selection approach was developed to estimate accurate mode shapes as well as frequencies and damping factors. The data acquisition system and operational scenarios were designed to have minimal impacts on the SSF. A comprehensive simulation was conducted to assess the overall performance of the experiment design.

  11. Optimal patch code design via device characterization

    Science.gov (United States)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The tradeoff between decoding robustness and the number of available code levels is optimized in terms of printing and measurement efforts, and decoding robustness against noises from the printing and measurement devices. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.

  12. Global optimization framework for solar building design

    Science.gov (United States)

    Silva, N.; Alves, N.; Pascoal-Faria, P.

    2017-07-01

    The generative modeling paradigm is a shift from static models to flexible models. It describes a modeling process using functions, methods and operators. The result is an algorithmic description of the construction process. Each evaluation of such an algorithm creates a model instance, which depends on its input parameters (width, height, volume, roof angle, orientation, location). These values are normally chosen according to aesthetic aspects and style. In this study, the model's parameters are automatically generated according to an objective function. A generative model can be optimized according to its parameters; in this way, the best solution for a constrained problem is determined. Besides the establishment of an overall framework design, this work consists of the identification of different building shapes and their main parameters, the creation of an algorithmic description for these main shapes and the formulation of the objective function, with respect to the building's energy consumption (solar energy, heating and insulation). Additionally, the conception of an optimization pipeline, combining an energy calculation tool with a geometric scripting engine, is presented. The methods developed lead to an automated and optimized 3D shape generation for the projected building (based on the desired conditions and according to specific constraints). The proposed approach will help in the construction of real buildings that consume less energy and contribute to a more sustainable world.

  13. Orthogonal design for scale invariant feature transform optimization

    Science.gov (United States)

    Ding, Xintao; Luo, Yonglong; Yi, Yunyun; Jie, Biao; Wang, Taochun; Bian, Weixin

    2016-09-01

    To improve object recognition capabilities in applications, we used orthogonal design (OD) to choose a group of optimal parameters in the parameter space of scale invariant feature transform (SIFT). In the case of global optimization (GOP) and local optimization (LOP) objectives, our aim is to show the operation of OD on the SIFT method. The GOP aims to increase the number of correctly detected true matches (NoCDTM) and the ratio of NoCDTM to all matches. In contrast, the LOP mainly aims to increase the performance of recall-precision. In detail, we first abstracted the SIFT method to a 9-way fixed-effect model with an interaction. Second, we designed a mixed orthogonal array, MA(64,23420,2), and its header table to optimize the SIFT parameters. Finally, two groups of parameters were obtained for GOP and LOP after orthogonal experiments and statistical analyses were implemented. Our experiments on four groups of data demonstrate that compared with the state-of-the-art methods, GOP can access more correct matches and is more effective against object recognition. In addition, LOP is favorable in terms of the recall-precision.

  14. Optimization of permeability for quality improvement by using factorial design

    Science.gov (United States)

    Said, Rahaini Mohd; Miswan, Nor Hamizah; Juan, Ng Shu; Hussin, Nor Hafizah; Ahmad, Aminah; Kamal, Mohamad Ridzuan Mohamad

    2017-05-01

    Sand casting is used worldwide in the metal casting industry, with green sand being the most commonly used sand mould type. Surface defects on the casting product are one of the main problems in sand casting; defects related to the green sand composition include blowholes, pinholes, shrinkage and porosity. Our objective is to optimize the composition of the green sand in order to minimize the occurrence of defects. Sand specimens with different parameters (bentonite, green sand, coal dust and water) were designed and prepared to undergo permeability testing. A 2⁴ factorial design experiment with four factors at different compositions was run, for a total of 16 experimental runs. The necessary models based on the experimental design were then obtained. The model has a high coefficient of determination (R² = 0.9841), and the predicted and actual values fitted the experimental data well. Using the Design-Expert analysis software, we identified that bentonite and water give the main interaction effect in the experiments. The optimal settings for the green sand composition are 100 g silica sand, 21 g bentonite, 6.5 g water and 6 g coal dust. This composition gives a permeability number of 598.3 GP.
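    For readers unfamiliar with the 2⁴ layout described above, the snippet below builds the 16-run full factorial in coded units and computes main effects and one interaction; the response values are placeholders, not the measured permeability numbers.

```python
import itertools
import numpy as np

factors = ["bentonite", "green_sand", "coal_dust", "water"]
# Full 2^4 factorial in coded units (-1 = low level, +1 = high level): 16 runs.
design = np.array(list(itertools.product([-1, 1], repeat=4)))

# Placeholder permeability responses for the 16 runs (not the study's data).
y = np.array([520, 540, 555, 610, 500, 525, 545, 598,
              515, 538, 560, 605, 498, 522, 548, 596], dtype=float)

# Main effect of each factor: mean response at the high level minus at the low level.
for j, name in enumerate(factors):
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"{name:>10s} main effect: {effect:+.1f}")

# A two-factor interaction, e.g. bentonite x water, is computed the same way
# from the product column of the two coded factors.
bw = design[:, 0] * design[:, 3]
print(f"bentonite x water interaction: {y[bw == 1].mean() - y[bw == -1].mean():+.1f}")
```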

  15. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Issaid, Chaouki Ben

    2015-01-07

    Experimental design is very important since experiments are often resource-exhaustive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information, which can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of the nested high dimensional integral. The advantages are twofold. First, the Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes less assumptions, such as the concentration of measures, required by Laplace method. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by one dimensional Laplace equation. We also compare the performance of the Multilevel Monte Carlo, Laplace approximation and direct double loop Monte Carlo.
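    The multilevel estimator itself is involved; the sketch below only shows the baseline it accelerates, a direct double-loop (nested) Monte Carlo estimate of the expected information gain, with a hypothetical one-dimensional forward model and Gaussian noise. The likelihood's normalizing constant cancels between the two terms and is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1                                    # measurement noise standard deviation

def g(theta, d):                               # hypothetical forward model
    return np.sin(theta * d)

def expected_information_gain(d, n_outer=500, n_inner=500):
    theta_out = rng.uniform(0, 1, n_outer)                     # outer prior samples
    y = g(theta_out, d) + sigma * rng.standard_normal(n_outer) # synthetic data
    log_like = -0.5 * ((y - g(theta_out, d)) / sigma) ** 2     # log-likelihood at the true theta
    theta_in = rng.uniform(0, 1, n_inner)                      # inner samples for the evidence
    inner = np.exp(-0.5 * ((y[:, None] - g(theta_in[None, :], d)) / sigma) ** 2)
    log_evidence = np.log(inner.mean(axis=1))                  # Monte Carlo evidence p(y|d)
    return np.mean(log_like - log_evidence)

for d in (0.5, 2.0, 6.0):
    print(f"design d={d}: EIG estimate = {expected_information_gain(d):.3f}")
```

    The nested structure (an inner evidence estimate for every outer sample) is what makes the direct estimator expensive and motivates the multilevel or Laplace-based acceleration discussed in the record.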

  16. A surrogate based multistage-multilevel optimization procedure for multidisciplinary design optimization

    NARCIS (Netherlands)

    Yao, W.; Chen, X.; Ouyang, Q.; Van Tooren, M.

    2011-01-01

    Optimization procedure is one of the key techniques to address the computational and organizational complexities of multidisciplinary design optimization (MDO). Motivated by the idea of synthetically exploiting the advantage of multiple existing optimization procedures and meanwhile complying with

  17. A stepwise optimal design of water network

    Institute of Scientific and Technical Information of China (English)

    Ying Li; Jintao Guan

    2016-01-01

    In order to take full advantage of the regeneration process to reduce fresh water consumption and avoid the accumulation of trace contaminants, regeneration reuse and regeneration recycle should be distinguished. A stepwise optimal design for water networks is developed to simplify the solution procedure for the formulated MINLP problem. In this paper, a feasible water reuse network framework is generated. Some heuristic rules from the water reuse network are used to guide the placement of the regeneration process. Then the outlet stream of the regeneration process is considered as a new water source. The regeneration reuse network structure is obtained through an iterative optimal procedure by taking the insights from the reuse water network structure. Furthermore, regeneration recycle is only utilized to eliminate fresh water usage for processes in which regeneration reuse is impossible. Compared with the results obtained in related research for the same example, the present method not only provides an appropriate regeneration reuse water network with minimum fresh water and regenerated water flow rate but also suggests a water network involving regeneration recycle with minimum recycle water flow rate. The design can utilize reuse, regeneration reuse and regeneration recycle step by step with minor water network structure change to achieve better flexibility. It can satisfy different demands for new plants and the modernization of existing plants. © 2016 The Chemical Industry and Engineering Society of China, and Chemical Industry Press. All rights reserved.

  18. Design optimization of functionally graded dental implant.

    Science.gov (United States)

    Hedia, H S; Mahmoud, Nemat-Alla

    2004-01-01

    The continuous increase in the human life span and the growing confidence in using artificial materials inside the human body necessitate the introduction of more effective prosthesis and implant materials. However, no artificial implant has biomechanical properties equivalent to the original tissue. Recently, titanium and bioceramic materials, such as hydroxyapatite, have been extensively used as fabrication materials for dental implants due to their high compatibility with hard tissue and living bone. Titanium has reasonable stiffness and strength, while hydroxyapatite has low stiffness, low strength and a high ability to reach full integration with living bone. In order to obtain good dental implantation of the biomaterial, full integration of the implant with living bone should be satisfied. Minimum stresses in the implant and the bone must be achieved to increase the life of the implant and prevent bone resorption. Therefore, the aim of the current investigation is to design an implant made from a functionally graded material (FGM) to achieve the above advantages. The finite element method and an optimization technique are used to reach the required implant design. The optimal materials for the FGM dental implant are found to be hydroxyapatite/titanium. The investigations have shown that the maximum stress in the bone for the hydroxyapatite/titanium FGM implant is reduced by about 22% and 28% compared to currently used titanium and stainless steel dental implants, respectively.

  19. Introducing Experience Goals into Packaging Design

    OpenAIRE

    Joutsela, Markus; Roto, Virpi; Lloyd, Peter; Bohemia, Erik

    2016-01-01

    Consumer experiences are an increasingly important driving force for commerce, affecting also packaging design. Yet, experience design for packages is rarely studied. Specifically, there is a gap in research regarding the integration of experiential goals, Xgoals, into the packaging design process. Open questions include how to describe Xgoals in design briefs when package design is outsourced, how to deal with changes during the design process, and how to evaluate whether the delivered desig...

  20. Adaptive designs for sequential experiments

    Institute of Scientific and Technical Information of China (English)

    林正炎; 张立新

    2003-01-01

    Various adaptive designs have been proposed and applied to clinical trials, bioassay, psychophysics, etc. Adaptive designs are also useful in high-cost engineering trials. More and more people have been paying attention to these design methods. This paper introduces several broad families of designs, such as the play-the-winner rule, the randomized play-the-winner rule and its generalization to the multi-arm case, the doubly biased coin adaptive design, and the Markov chain model.

  1. Optimality criteria: A basis for multidisciplinary design optimization

    Science.gov (United States)

    Venkayya, V. B.

    1989-01-01

    This paper presents a generalization of what is frequently referred to in the literature as the optimality criteria approach in structural optimization. This generalization includes a unified presentation of the optimality conditions, the Lagrangian multipliers, and the resizing and scaling algorithms in terms of the sensitivity derivatives of the constraint and objective functions. The by-product of this generalization is the derivation of a set of simple nondimensional parameters which provides significant insight into the behavior of the structure as well as the optimization algorithm. A number of important issues, such as, active and passive variables, constraints and three types of linking are discussed in the context of the present derivation of the optimality criteria approach. The formulation as presented in this paper brings multidisciplinary optimization within the purview of this extremely efficient optimality criteria approach.

  2. A Model for Designing Adaptive Laboratory Evolution Experiments.

    Science.gov (United States)

    LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M

    2017-04-15

    The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^(-6.9) to 10^(-8.4) mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required, and will lower the barriers to entry for this experimental technique. IMPORTANCE: ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized
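    ALEsim itself is not reproduced here; the toy model below (all rates and sizes invented) only illustrates the passage-size effect the record describes: small serial-passage bottlenecks tend to lose a rare beneficial mutant to drift before selection can fix it.

```python
import math
import numpy as np

# Toy serial-passage model (not ALEsim; all numbers invented): within each batch the
# culture grows from `passage_size` cells to `final_pop`, a beneficial mutant grows a
# little faster per generation, and a binomial bottleneck seeds the next batch.
def serial_passage(passage_size, final_pop=1e8, fitness_gain=0.05,
                   init_mutant_frac=1e-5, n_passages=60, seed=0):
    rng = np.random.default_rng(seed)
    frac = init_mutant_frac
    gens = math.log2(final_pop / passage_size)        # generations grown per batch
    advantage = (1 + fitness_gain) ** gens            # mutant growth advantage per batch
    for _ in range(n_passages):
        frac = frac * advantage / (frac * advantage + (1 - frac))   # selection during growth
        mutants = rng.binomial(int(passage_size), frac)             # drift at the bottleneck
        if mutants == 0:
            return 0.0                                # beneficial mutation lost
        frac = mutants / passage_size
    return frac

for size in (1e2, 1e4, 1e6):
    print(f"passage size {size:.0e}: final mutant fraction = {serial_passage(size):.3g}")
```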

  3. Performance enhancement of a pump impeller using optimal design method

    Science.gov (United States)

    Jeon, Seok-Yun; Kim, Chul-Kyu; Lee, Sang-Moon; Yoon, Joon-Yong; Jang, Choon-Man

    2017-04-01

    This paper presents the performance evaluation of a regenerative pump to increase its efficiency using an optimal design method. Two design parameters, which define the shape of the pump impeller, are introduced and analyzed. Pump performance is evaluated by numerical simulation and design of experiments (DOE). To analyze the three-dimensional flow field in the pump, the general analysis code CFX is used in the present work. A shear stress turbulence model is employed to estimate the eddy viscosity. An experimental apparatus with an open-loop facility is set up for measuring the pump performance. Pump performance (efficiency and pressure) obtained from the numerical simulation is validated by comparison with the results of experiments. Through the shape optimization of the pump impeller at the operating flow condition, the pump efficiency is successfully increased by 3 percent compared to the reference pump. It is noted that the pressure increase of the optimum pump is mainly caused by the higher momentum force generated inside the blade passage due to the optimal blade shape. Comparisons of the internal flow in the reference and optimum pumps are also investigated and discussed in detail.

  4. Multidisciplinary design optimization of adaptive wing leading edge

    Institute of Scientific and Technical Information of China (English)

    SUN RuJie; CHEN GuoPing; ZHOU Chen; ZHOU LanWei; JIANG JinHui

    2013-01-01

    Adaptive wing can significantly enhance aircraft aerodynamic performance, which refers to aerodynamic and structural optimization designs. This paper introduces a two-step approach to solve the interrelated problems of the adaptive leading edge. In the first step, the procedure of airfoil optimization is carried out with an initial configuration of NACA 0006. On the basis of the combination of design of experiment (DOE), response surface method (RSM) and genetic algorithm (GA), an adaptive airfoil can be obtained whose lift-to-drag ratio is larger than the baseline airfoil's at the given angle of attack and subsonic speed. The next step is to design a compliant structure to achieve the target airfoil shape, which is the optimization result of the previous step. In order to minimize the deviation of the deformed shape from the target shape, the load path representation topology method is presented. This method is developed by way of GA, with size and shape optimization incorporated in it simultaneously. Finally, a comparison study with the Solid Isotropic Material with Penalization (SIMP) method in Altair OptiStruct is conducted, and the results demonstrate the validity and effectiveness of the proposed approach.

  5. Design analysis for optimal calibration of diffusivity in reactive multilayers

    KAUST Repository

    Vohra, Manav

    2017-05-29

    Calibration of the uncertain Arrhenius diffusion parameters for quantifying mixing rates in Zr–Al nanolaminate foils have been previously performed in a Bayesian setting [M. Vohra, J. Winokur, K.R. Overdeep, P. Marcello, T.P. Weihs, and O.M. Knio, Development of a reduced model of formation reactions in Zr–Al nanolaminates, J. Appl. Phys. 116(23) (2014): Article No. 233501]. The parameters were inferred in a low-temperature, homogeneous ignition regime, and a high-temperature self-propagating reaction regime. In this work, we extend the analysis to determine optimal experimental designs that would provide the best data for inference. We employ a rigorous framework that quantifies the expected information gain in an experiment, and find the optimal design conditions using Monte Carlo techniques, sparse quadrature, and polynomial chaos surrogates. For the low-temperature regime, we find the optimal foil heating rate and pulse duration, and confirm through simulation that the optimal design indeed leads to sharp posterior distributions of the diffusion parameters. For the high-temperature regime, we demonstrate the potential for increasing the expected information gain concerning the posteriors by increasing the sample size and reducing the uncertainty in measurements. Moreover, posterior marginals are also obtained to verify favourable experimental scenarios.

  6. Electrostatic afocal-zoom lens design using computer optimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Sise, Omer, E-mail: omersise@gmail.com

    2014-12-15

    Highlights: • We describe the detailed design of a five-element electrostatic afocal-zoom lens. • The simplex optimization is used to optimize lens voltages. • The method can be applied to multi-element electrostatic lenses. - Abstract: Electron optics is the key to the successful operation of electron collision experiments where well designed electrostatic lenses are needed to drive electron beam before and after the collision. In this work, the imaging properties and aberration analysis of an electrostatic afocal-zoom lens design were investigated using a computer optimization technique. We have found a whole new range of voltage combinations that has gone unnoticed until now. A full range of voltage ratios and spherical and chromatic aberration coefficients were systematically analyzed with a range of magnifications between 0.3 and 3.2. The grid-shadow evaluation was also employed to show the effect of spherical aberration. The technique is found to be useful for searching the optimal configuration in a multi-element lens system.

  7. Optimal Design of Automotive Thermoelectric Air Conditioner (TEAC)

    Science.gov (United States)

    Attar, Alaa; Lee, HoSung; Weera, Sean

    2014-06-01

    The present work is an analytical study of the optimal design of an automotive thermoelectric air conditioner (TEAC) using a new optimal design method with dimensional analysis that has been recently developed by our research group. The optimal design gives not only the optimal current but also the optimal geometry (i.e., the number of thermocouples, the geometric factor, or the hot fluid parameters). The optimal design for the TEAC is carried out with two configurations: air-to-liquid and air-to-air heat exchangers.

  8. Fast Bayesian optimal experimental design and its applications

    KAUST Repository

    Long, Quan

    2015-01-07

    We summarize our Laplace method and multilevel method of accelerating the computation of the expected information gain in a Bayesian Optimal Experimental Design (OED). Laplace method is a widely-used method to approximate an integration in statistics. We analyze this method in the context of optimal Bayesian experimental design and extend this method from the classical scenario, where a single dominant mode of the parameters can be completely-determined by the experiment, to the scenarios where a non-informative parametric manifold exists. We show that by carrying out this approximation the estimation of the expected Kullback-Leibler divergence can be significantly accelerated. While Laplace method requires a concentration of measure, multi-level Monte Carlo method can be used to tackle the problem when there is a lack of measure concentration. We show some initial results on this approach. The developed methodologies have been applied to various sensor deployment problems, e.g., impedance tomography and seismic source inversion.

  9. An overview of the design and analysis of simulation experiments for sensitivity analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys 'classic' and 'modern' designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs

  10. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    Science.gov (United States)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  11. A quality by design approach to optimization of emulsions for electrospinning using factorial and D-optimal designs.

    Science.gov (United States)

    Badawi, Mariam A; El-Khordagui, Labiba K

    2014-07-16

    Emulsion electrospinning is a multifactorial process used to generate nanofibers loaded with hydrophilic drugs or macromolecules for diverse biomedical applications. Emulsion electrospinnability is greatly impacted by the emulsion pharmaceutical attributes. The aim of this study was to apply a quality by design (QbD) approach based on design of experiments as a risk-based proactive approach to achieve predictable critical quality attributes (CQAs) in w/o emulsions for electrospinning. Polycaprolactone (PCL)-thickened w/o emulsions containing doxycycline HCl were formulated using a Span 60/sodium lauryl sulfate (SLS) emulsifier blend. The identified emulsion CQAs (stability, viscosity and conductivity) were linked with electrospinnability using a 3³ factorial design to optimize emulsion composition for phase stability and a D-optimal design to optimize stable emulsions for viscosity and conductivity after shifting the design space. The three independent variables, emulsifier blend composition, organic:aqueous phase ratio and polymer concentration, had a significant effect (pquality in electrospinnable emulsions, allowing development of hydrophilic drug-loaded nanofibers with desired morphological characteristics.

  12. Optimality criteria for the design of 2-color microarray studies.

    Science.gov (United States)

    Kerr, Kathleen F

    2012-01-13

    We discuss the definition and application of design criteria for evaluating the efficiency of 2-color microarray designs. First, we point out that design optimality criteria are defined differently for the regression and block design settings. This has caused some confusion in the literature and warrants clarification. Linear models for microarray data analysis have equivalent formulations as ANOVA or regression models. However, this equivalence does not extend to design criteria. We discuss optimality criteria and argue against applying regression-style D-optimality to the microarray design problem. We further disfavor E- and D-optimality (as defined in block design) because they are not attuned to scientific questions of interest.

  13. CFD-Based Design Optimization Tool Developed for Subsonic Inlet

    Science.gov (United States)

    1995-01-01

    The traditional approach to the design of engine inlets for commercial transport aircraft is a tedious process that ends with a less-than-optimum design. With the advent of high-speed computers and the availability of more accurate and reliable computational fluid dynamics (CFD) solvers, numerical optimization processes can effectively be used to design an aerodynamic inlet lip that enhances engine performance. The designers' experience at Boeing Corporation showed that for a peak Mach number on the inlet surface beyond some upper limit, the performance of the engine degrades excessively. Thus, our objective was to optimize efficiency (minimize the peak Mach number) at maximum cruise without compromising performance at other operating conditions. Using the CFD code NPARC and the numerical optimization code ADS, the NASA Lewis Research Center, in collaboration with Boeing, developed an integrated procedure at Lewis to find the optimum shape of a subsonic inlet lip. We used a GRAPE-based three-dimensional grid generator to help automate the optimization procedure. The inlet lip shape at the crown and the keel was described as a superellipse, and the superellipse exponents and radii ratios were considered as design variables. Three operating conditions (cruise, takeoff, and rolling takeoff) were considered in this study. Three-dimensional Euler computations were carried out to obtain the flow field. At the initial design, the peak Mach numbers for maximum cruise, takeoff, and rolling takeoff conditions were 0.88, 1.772, and 1.61, respectively. The acceptable upper limits on the takeoff and rolling takeoff Mach numbers were 1.55 and 1.45. Since the initial design provided by Boeing was found to be optimum with respect to the maximum cruise condition, the sum of the peak Mach numbers at takeoff and rolling takeoff was minimized in the current study while the maximum cruise Mach number was constrained to be close to that at the existing design. With this objective, the

  14. Optimal Ground Source Heat Pump System Design

    Energy Technology Data Exchange (ETDEWEB)

    Ozbek, Metin [ENVIRON]; Yavuzturk, Cy [University of Hartford]; Pinder, George [University of Vermont]

    2015-04-15

    Despite the fact that GSHPs first gained popularity as early as the 1940s and can achieve 30 to 60 percent energy savings and carbon emission reductions relative to conventional HVAC systems, the use of geothermal energy in the U.S. has been less than 1 percent of total energy consumption. The key barriers preventing this technically mature technology from reaching its full commercial potential have been its high installation cost and limited consumer knowledge of, and trust in, GSHP systems to deliver the technology cost-effectively in the marketplace. Led by ENVIRON, with support from the University of Hartford and the University of Vermont, the team developed and tested a software-based decision-making tool (‘OptGSHP’) for the least-cost design of ground-source heat pump (‘GSHP’) systems. OptGSHP combines state-of-the-art optimization algorithms with GSHP-specific HVAC and groundwater flow and heat transport simulation. The particular strength of OptGSHP is in integrating heat transport due to groundwater flow into the design, an effect that most GSHP designs do not take credit for and are therefore overdesigned.

  16. Chiral metamaterial design using optimized pixelated inclusions with genetic algorithm

    Science.gov (United States)

    Akturk, Cemal; Karaaslan, Muharrem; Ozdemir, Ersin; Ozkaner, Vedat; Dincer, Furkan; Bakir, Mehmet; Ozer, Zafer

    2015-03-01

    Chiral metamaterials have attracted many researchers due to their ability to rotate the polarization of electromagnetic waves. However, most of the proposed chiral metamaterials are designed based on experience or on time-consuming, inefficient simulations. A method is investigated for designing a chiral metamaterial with strong and natural chirality admittance by optimizing a grid of metallic pixels on both sides of a dielectric sheet placed perpendicular to the incident wave, using a genetic algorithm (GA) technique based on a finite element method solver. The effective medium parameters are obtained by using constitutive equations and S parameters. The proposed methodology is very efficient for designing a chiral metamaterial with the desired effective medium parameters. By using the GA-based topology, it is proven that a chiral metamaterial can be designed and manufactured more easily and at low cost.

  17. Design of Experiment Using Simulation of a Discrete Dynamical System

    Directory of Open Access Journals (Sweden)

    Mašek Jan

    2016-12-01

    Full Text Available The topic of the presented paper is a promising approach to achieve an optimal Design of Experiment (DoE), i.e. a spreading of points within a design domain, using a simulation of a discrete dynamical system of interacting particles within an n-dimensional design space. The system of mutually repelling particles represents a physical analogy of the Audze-Eglājs (AE) optimization criterion and its periodical modification (PAE), respectively. The paper compares the performance of two approaches to implementation: a single-thread process using the JAVA language environment and a massively parallel solution employing the nVidia CUDA platform.
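
    The AE criterion itself is easy to state: treating the design points as repelling particles, it sums the pairwise inverse squared distances, so lower values correspond to more evenly spread designs. The sketch below evaluates it for a random design and a simple stratified design; both designs are illustrative assumptions, and neither the periodic (PAE) variant nor the CUDA implementation from the record is reproduced.

```python
# Audze-Eglajs (AE) criterion: treat the n design points as mutually repelling
# particles and sum the pairwise "potential energy" 1/d_ij^2; a lower value
# means the points are spread more evenly over the unit hypercube.
import numpy as np

def audze_eglajs(points):
    diff = points[:, None, :] - points[None, :, :]
    d2 = np.sum(diff ** 2, axis=-1)
    iu = np.triu_indices(len(points), k=1)   # each pair counted once
    return np.sum(1.0 / d2[iu])

rng = np.random.default_rng(1)
n, dim = 20, 2
random_design = rng.random((n, dim))
# Simple stratified (Latin-hypercube-style) design for comparison (assumed).
lhs_design = (np.stack([rng.permutation(n) for _ in range(dim)], axis=1)
              + rng.random((n, dim))) / n

print("AE(random)     =", audze_eglajs(random_design))
print("AE(stratified) =", audze_eglajs(lhs_design))
```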

  18. Optimization of model parameters and experimental designs with the Optimal Experimental Design Toolbox (v1.0) exemplified by sedimentation in salt marshes

    Science.gov (United States)

    Reimer, J.; Schuerch, M.; Slawig, T.

    2015-03-01

    The geosciences are a highly suitable field of application for optimizing model parameters and experimental designs, especially because large amounts of data are collected. In this paper, the weighted least squares estimator for optimizing model parameters is presented together with its asymptotic properties. A popular approach to optimizing experimental designs, called local optimal experimental designs, is described together with a lesser-known approach that takes into account the potential nonlinearity of the model parameters. These two approaches have been combined with two methods to solve their underlying discrete optimization problem. All presented methods were implemented in an open-source MATLAB toolbox called the Optimal Experimental Design Toolbox, whose structure and application are described. In numerical experiments, the model parameters and experimental design were optimized using this toolbox. Two existing models for sediment concentration in seawater and sediment accretion on salt marshes of different complexity served as an application example. The advantages and disadvantages of these approaches were compared based on these models. Thanks to optimized experimental designs, the parameters of these models could be determined very accurately with significantly fewer measurements compared to unoptimized experimental designs. The chosen optimization approach played a minor role in the accuracy; therefore, the approach with the least computational effort is recommended.
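
    To make the estimator and its link to design optimization concrete, the sketch below fits a small nonlinear model by weighted least squares (Gauss-Newton) and reports the approximate parameter covariance, whose determinant is the kind of functional a local D-optimal design would minimize over candidate measurement times. The exponential decay model, the noise level, and the measurement times are assumed toy values, not the sediment models of the toolbox.

```python
# Weighted least squares fit of a nonlinear model by Gauss-Newton, plus the
# approximate parameter covariance C = (J^T W J)^-1 whose functionals (e.g.
# det or trace) local optimal experimental design criteria act on.
import numpy as np

def model(p, t):
    a, k = p
    return a * np.exp(-k * t)

def jacobian(p, t):
    a, k = p
    return np.column_stack([np.exp(-k * t), -a * t * np.exp(-k * t)])

def wls_fit(t, y, w, p0, n_iter=50):
    p = np.asarray(p0, dtype=float)
    W = np.diag(w)
    for _ in range(n_iter):
        r = y - model(p, t)
        J = jacobian(p, t)
        p = p + np.linalg.solve(J.T @ W @ J, J.T @ W @ r)   # Gauss-Newton step
    J = jacobian(p, t)
    cov = np.linalg.inv(J.T @ W @ J)                        # asymptotic covariance
    return p, cov

rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 12)                  # measurement times (the "design")
sigma = 0.05
y = model([2.0, 0.7], t) + sigma * rng.standard_normal(t.size)
p_hat, cov = wls_fit(t, y, w=np.full(t.size, 1.0 / sigma**2), p0=[1.0, 1.0])
print("estimate:", p_hat)
print("D-criterion of this design:", np.linalg.det(cov))    # smaller is better
```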

  19. Research on Multidisciplinary Optimization Design of Bridge Crane

    Directory of Open Access Journals (Sweden)

    Tong Yifei

    2013-01-01

    Full Text Available Bridge cranes are among the most widely used cranes in our country and are indispensable equipment for material conveying in modern production. In this paper, a framework of multidisciplinary optimization for bridge cranes is proposed. The presented research on crane multidisciplinary design technology for energy saving covers three levels: the metal structure level, the transmission design level, and the electrical system design level. Shape, size, and topology optimization models of the crane metal structure are established for the shape and topology optimization design at the metal structure level. Finally, system-level multidisciplinary energy-saving optimization design of the bridge crane is carried out, with the energy-saving transmission design results fed back into the energy-saving optimization of the metal structure. The optimization results show that, premised on design requirements such as stiffness and strength, structural optimization design using finite element analysis and multidisciplinary optimization technology can greatly reduce the total mass of the crane; thus, an energy-saving design can be achieved.

  20. Earth Observing Satellite Orbit Design Via Particle Swarm Optimization

    Science.gov (United States)

    2014-08-01

    Sharon Vtipil and John G. Warner, US Naval Research Laboratory, Washington ... number of passes per day given a satellite's orbital altitude and inclination. These are used along with particle swarm optimization to determine optimal ... well suited to use within a meta-heuristic optimization method such as the Particle Swarm Optimizer (PSO). This method seeks to find the optimal set ...

  1. Adaptive designs for sequential experiments

    Institute of Scientific and Technical Information of China (English)

    林正炎; 张立新

    2003-01-01

    Various adaptive designs have been proposed and applied to clinical trials, bioassay, psychophysics, etc. Adaptive designs are also useful in high-cost engineering trials. More and more people have been paying attention to these design methods. This paper introduces several broad families of designs, such as the play-the-winner rule, the randomized play-the-winner rule and its generalization to the multi-arm case, the doubly biased coin adaptive design, and the Markov chain model.

  2. Electromagnetic sunscreen model: design of experiments on particle specifications.

    Science.gov (United States)

    Lécureux, Marie; Deumié, Carole; Enoch, Stefan; Sergent, Michelle

    2015-10-01

    We report a numerical study on sunscreen design and optimization. Thanks to the combined use of electromagnetic modeling and design of experiments, we are able to screen the most relevant parameters of mineral filters and to optimize sunscreens. Several electromagnetic modeling methods are used depending on the type of particles, density of particles, etc. Both the sun protection factor (SPF) and the UVB/UVA ratio are considered. We show that the design of experiments' model should include interactions between materials and other parameters. We conclude that the material of the particles is a key parameter for the SPF and the UVB/UVA ratio. Among the materials considered, none is optimal for both. The SPF is also highly dependent on the size of the particles.
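
    The point about interactions can be made concrete with a two-level full factorial model matrix that carries interaction columns alongside the main effects; fitting it shows how, for example, a material-by-size interaction would surface. The factor names, coded levels, and synthetic response below are illustrative assumptions, not the sunscreen data from the record.

```python
# Two-level full factorial design with two-factor interaction columns, the kind
# of model matrix needed when interactions between the particle material and
# the other parameters matter. Factor names and the fake response are
# illustrative only.
import itertools
import numpy as np

factors = ["material", "particle_size", "concentration"]     # coded -1 / +1
runs = np.array(list(itertools.product([-1, 1], repeat=len(factors))), float)

# Model matrix: intercept, main effects, and all two-factor interactions.
columns = ["1"] + factors + [f"{a}*{b}" for a, b in itertools.combinations(factors, 2)]
X = np.column_stack(
    [np.ones(len(runs))]
    + [runs[:, i] for i in range(len(factors))]
    + [runs[:, i] * runs[:, j] for i, j in itertools.combinations(range(len(factors)), 2)]
)

rng = np.random.default_rng(3)
y = (30 + 8 * runs[:, 0] + 3 * runs[:, 1] - 5 * runs[:, 0] * runs[:, 1]
     + rng.standard_normal(len(runs)))                        # synthetic "SPF"

beta, *_ = np.linalg.lstsq(X, y, rcond=None)                  # effect estimates
for name, b in zip(columns, beta):
    print(f"{name:>28s}: {b:+.2f}")
```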

  3. Novel Hyper-Redundant Manipulator: Design, Study and Experiment

    Institute of Scientific and Technical Information of China (English)

    李彦明; 马培荪; 秦昌俊; 曹志奎; 王建滨; 朱海鸿

    2003-01-01

    A novel hyper-redundant manipulator named RT1 is designed and studied. The unique feature of RT1 is that all degrees of freedom (DOF) are actuated with only one motor via specially designed hinge-bar universal joints. The mechanisms of RT1 are introduced in detail. Some experiments are carried out in order to test the movability and adaptability of the manipulator. RT1 is actuated by pulse strings and acts discretely. The discrete working space of RT1 is described, and the parameter optimization for kinematic redundancy resolution is also studied. The optimization criterion is to alter the design parameters as little as possible during the manipulator's motion from the initial position to the expected position. An optimization example is given that is realized with the MATLAB Optimization Toolbox.

  4. Parameter optimization design and experiment of an agricultural ultrasonic atomization transducer

    Institute of Scientific and Technical Information of China (English)

    张建桃; 李晟华; 文晟; 兰玉彬; 廖贻泳; 张铁民

    2015-01-01

    diameter of the transducer within the setting range and maximize the atomization flow. In the atomization process, we chose the electrode diameter and the thickness of the ultrasonic vibrator as the design variables, the vibration amplitude of the ultrasonic vibrator as the objective function, and the driving frequency as the constraint condition. Secondly, a penalty function was used to solve the optimization problem with inequality constraints. Meanwhile, the modal assurance criterion (MAC) was adopted to recognize the target modes intelligently with the ANSYS finite element software. If the value of MAC was close to 1, the target mode was similar to the reference mode. This indicated that the vibration along the axial direction was concentrated on the surface of the ultrasonic vibrator and that the vibration amplitude was larger than in other modes. Thirdly, a prototype was manufactured based on the optimization results to conduct the atomization flow measurement experiment and the droplet diameter measurement experiment. The measured resonant frequency of the optimized transducer was 1.53 MHz, which was very close to the value simulated with the ANSYS finite element software (1.62 MHz); the error was 5.9%. The measured resonant frequencies of the transducer before and after optimization were 1.56 and 1.53 MHz respectively. When the excitation frequency was at the resonant frequency of 1.53 MHz, the atomization flow rate of the optimized agricultural ultrasonic atomization transducer reached its maximum. If the excitation frequency of the ultrasonic atomization transducer was lower or higher than the resonant frequency, the atomization flow rate was reduced, which illustrated that the ultrasonic vibrator should work at the resonant frequency to make the transducer produce the largest amount of aerial fog. The maximal atomization flow rate of the agricultural ultrasonic atomization transducer increased from 1.20 to 1.29 g/min when applying an AC sine-wave voltage whose peak

  5. SPEED design optimization via Fresnel propagation analysis

    Science.gov (United States)

    Beaulieu, Mathilde; Abe, Lyu; Martinez, Patrice; Gouvret, Carole; Dejonghe, Julien; Preis, Oliver; Vakili, Farrokh

    2016-08-01

    Future extremely large telescopes will open a niche for exoplanet direct imaging, at the expense of using a segmented primary mirror, which is known to hamper high-contrast imaging capabilities. The focal plane diffraction pattern is dominated by bright structures, and reducing them is not straightforward since one has to deal with strong amplitude discontinuities in this kind of unfriendly pupil (segment gaps and secondary support). The SPEED experiment developed at the Lagrange laboratory is designed to address this specific topic, along with high contrast at very small separations. The baseline design of SPEED will combine a coronagraph and two deformable mirrors to create dark zones at the focal plane. A first step in this project was to identify under which circumstances deep contrast at small separation is achievable. In particular, the location of the DMs is among the critical aspects to consider and is the topic covered by this paper.

  6. Designing Experiments for Nonlinear Models - An Introduction

    OpenAIRE

    Johnson, Rachel T.; Montgomery, Douglas C.

    2009-01-01

    The article of record as published may be found at http://dx.doi.org/10.1002/qre.1063 We illustrate the construction of Bayesian D-optimal designs for nonlinear models and compare the relative efficiency of standard designs with these designs for several models and prior distributions on the parameters. Through a relative efficiency analysis, we show that standard designs can perform well in situations where the nonlinear model is intrinsically linear. However, if the model is non...

  7. Chip Design Process Optimization Based on Design Quality Assessment

    Science.gov (United States)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  8. Repair Optimal Erasure Codes through Hadamard Designs

    CERN Document Server

    Papailiopoulos, Dimitris S; Cadambe, Viveck R

    2011-01-01

    In distributed storage systems that employ erasure coding, the issue of minimizing the total communication required to exactly rebuild a storage node after a failure arises. This repair bandwidth depends on the structure of the storage code and the repair strategies used to restore the lost data. Designing high-rate maximum-distance separable (MDS) codes that achieve the optimum repair communication has been a well-known open problem. In this work, we use Hadamard matrices to construct the first explicit 2-parity MDS storage code with optimal repair properties for all single node failures, including the parities. Our construction relies on a novel method of achieving perfect interference alignment over finite fields with a finite file size, or number of extensions. We generalize this construction to design $m$-parity MDS codes that achieve the optimum repair communication for single systematic node failures and show that there is an interesting connection between our $m$-parity codes and the systematic-...

  9. Particle Swarm Optimization for Structural Design Problems

    Directory of Open Access Journals (Sweden)

    Hamit SARUHAN

    2010-02-01

    Full Text Available The aim of this paper is to employ the Particle Swarm Optimization (PSO) technique on a mechanical engineering design problem: minimizing the volume of a cantilevered beam subject to bending strength constraints. Mechanical engineering design problems are complex activities for which more and more computing capability is required. Most of these problems are solved by conventional mathematical programming techniques that require gradient information. These techniques have several drawbacks, the main one being the tendency to become trapped in local optima. As an alternative to gradient-based techniques, the PSO does not require the evaluation of gradients of the objective function. The PSO algorithm employs the generation of guided random positions when searching for the global optimum point. The PSO, a nature-inspired heuristic search technique, imitates the social behavior of bird flocking. The results obtained by the PSO are compared with Mathematical Programming (MP). It is demonstrated that the PSO performed better and obtained better convergence reliability to the global optimum point than the MP. Using the MP, a volume of 2961000 mm3 was obtained, while a beam volume of 2945345 mm3 was obtained by the PSO.
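
    A minimal PSO loop in the same spirit is sketched below: it minimizes the volume of a rectangular cantilever cross-section subject to a bending-stress limit enforced through a penalty term. The load, length, bounds, allowable stress, and swarm settings are assumed values for illustration, not those of the study.

```python
# Minimal particle swarm optimization of a cantilever cross-section:
# minimize V = b*h*L for a rectangular section subject to the bending stress
# limit 6*F*L/(b*h^2) <= sigma_allow, handled with a penalty. All numbers are
# assumed, not the ones from the study above.
import numpy as np

F, L, sigma_allow = 5_000.0, 1_000.0, 160.0                       # N, mm, MPa (assumed)
lower, upper = np.array([10.0, 10.0]), np.array([100.0, 300.0])   # b, h bounds in mm

def objective(x):
    b, h = x
    volume = b * h * L
    stress = 6.0 * F * L / (b * h ** 2)
    penalty = 1e6 * max(0.0, stress - sigma_allow) ** 2
    return volume + penalty

rng = np.random.default_rng(4)
n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
x = rng.uniform(lower, upper, size=(n_particles, 2))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lower, upper)
    f = np.array([objective(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

b, h = gbest
print(f"b = {b:.1f} mm, h = {h:.1f} mm, volume = {b*h*L:.0f} mm^3")
```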

  10. RELATIONAL THEORY APPLICATION FOR OPTIMAL DESIGN OF INTEGRATED CIRCUITS

    Directory of Open Access Journals (Sweden)

    D. V. Demidov

    2014-09-01

    Full Text Available This paper deals with a method of adapting relational theory for integrated circuit CAD systems. A new algorithm is worked out for the optimal search of implicit Don't Care values for combinational multiple-level digital circuits. The algorithm is described in terms of the adapted relational theory, which allows a very simple algorithm description suitable for both intuitive understanding and formal analysis. The proposed method makes it possible to apply the accumulated experience of relational databases in the efficient implementation of relational algebra operations (including distributed ones). A comparative analysis of the proposed algorithm and a classic one for the optimal search of implicit Don't Cares is carried out. The analysis has proved the formal correctness of the proposed algorithm and its considerably lower worst-case complexity. The search for implicit Don't Care values in integrated circuit design makes it easier to optimize such characteristics of an IC as chip area, power, verifiability and reliability. However, the classic algorithm for the optimal search of implicit Don't Care values is not used in practice due to its very high computational complexity, and algorithms for sub-optimal search do not realize the full potential of IC optimization. Implementation of the proposed algorithm in IC CAD (a.k.a. EDA) systems is feasible due to its much lower computational complexity and potentially makes it possible to improve the quality-to-development-time ratio of an IC (chip area, power, verifiability and reliability). The developed method also enables the creation of distributed EDA systems with higher computational power and, consequently, the design automation of more complex ICs.

  11. Simulation-based optimal Bayesian experimental design for nonlinear systems

    KAUST Repository

    Huan, Xun

    2013-01-01

    The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics. © 2012 Elsevier Inc.

  12. The Study of Tactical Missile's Airframe Digital Optimization Design

    Institute of Scientific and Technical Information of China (English)

    LUO Zhiqing; QIAN Airong; LI Xuefeng; GAO Lin; LEI Jian

    2006-01-01

    Digital design and optimization are very important in modern design. The traditional design methods and procedures are not suited to modern missile weapons research and development. Digital design methods and optimization ideas were employed to deal with this problem. The disadvantages of the traditional missile airframe design procedure and the advantages of digital design methods were discussed. A new concept of design process reengineering (DPR) was put forward. An integrated missile airframe digital design platform and a digital design procedure, which integrate optimization ideas and methods, were developed. A case study showed that the design platform and the design procedure could improve the efficiency and quality of missile airframe design and yield more reasonable and optimal results.

  13. Design and optimization of a brachytherapy robot

    Science.gov (United States)

    Meltsner, Michael A.

    Trans-rectal ultrasound guided (TRUS) low dose rate (LDR) interstitial brachytherapy has become a popular procedure for the treatment of prostate cancer, the most common type of non-skin cancer among men. The current TRUS technique of LDR implantation may result in less than ideal coverage of the tumor with increased risk of negative response such as rectal toxicity and urinary retention. This technique is limited by the skill of the physician performing the implant, the accuracy of needle localization, and the inherent weaknesses of the procedure itself. The treatment may require 100 or more sources and 25 needles, compounding the inaccuracy of the needle localization procedure. A robot designed for prostate brachytherapy may increase the accuracy of needle placement while minimizing the effect of physician technique in the TRUS procedure. Furthermore, a robot may improve associated toxicities by utilizing angled insertions and freeing implantations from constraints applied by the 0.5 cm-spaced template used in the TRUS method. Within our group, Lin et al. have designed a new type of LDR source. The "directional" source is a seed designed to be partially shielded. Thus, a directional, or anisotropic, source does not emit radiation in all directions. The source can be oriented to irradiate cancerous tissues while sparing normal ones. This type of source necessitates a new, highly accurate method for localization in 6 degrees of freedom. A robot is the best way to accomplish this task accurately. The following presentation of work describes the invention and optimization of a new prostate brachytherapy robot that fulfills these goals. Furthermore, some research has been dedicated to the use of the robot to perform needle insertion tasks (brachytherapy, biopsy, RF ablation, etc.) in nearly any other soft tissue in the body. This can be accomplished with the robot combined with automatic, magnetic tracking.

  14. Multidisciplinary Design Optimization of A Human Occupied Vehicle Based on Bi-Level Integrated System Collaborative Optimization

    Institute of Scientific and Technical Information of China (English)

    赵敏; 崔维成; 李翔

    2015-01-01

    The design of a Human Occupied Vehicle (HOV) is a typical multidisciplinary problem, but in present engineering design it depends heavily on the experience of naval architects. In order to relieve this dependence on experience and improve the design, a new Multidisciplinary Design Optimization (MDO) method, “Bi-Level Integrated System Collaborative Optimization (BLISCO)”, is applied to the conceptual design of an HOV, which consists of a hull module, resistance module, energy module, structure module, weight module, and stability module. This design problem is defined by 21 design variables and 23 constraints, and its objective is to maximize the ratio of payload to weight. The results show that the general performance of the HOV can be greatly improved by BLISCO.

  15. Experimenting with Science Facility Design.

    Science.gov (United States)

    Butterfield, Eric

    1999-01-01

    Discusses the modern school science facility and how computers and teaching methods are changing their design. Issues include power, lighting, and space requirements; funding for planning; architect assessment; materials requirements for work surfaces; and classroom flexibility. (GR)

  16. Design of an integrated phase frequency detector with optimal power consumption and delay by using particle swarm optimization algorithm

    Directory of Open Access Journals (Sweden)

    zeinab pourtaheri

    2014-10-01

    Full Text Available There is a growing interest in the optimal design of phase-locked loops, because these circuits are widely used in communication and electronic circuits. Undoubtedly the most important objectives in designing PLLs (phase-locked loops) are low power consumption and low delay. In this paper, the process of designing and optimizing the PFD (one of the main parts of a PLL) is proposed using the particle swarm optimization (PSO) algorithm. In the proposed method, instead of carrying out frequent experiments and simulations based on trial and error to achieve the desired parameters of the phase frequency detector, the effective variables are sent to the PSO algorithm and the optimization process is done by this algorithm. The results show a remarkable ability of this heuristic method to find transistor sizings for optimal power consumption and delay.

  17. A Framework for Designing Optimal Spacecraft Formations

    Science.gov (United States)

    2002-09-01

    Contents include sections on the reference frame and on solving optimal control problems. ... spacecraft state. Depending on the model, there may be additional variables in the state, but there will be a minimum of these six. ...

  18. Prediction uncertainty and optimal experimental design for learning dynamical systems

    Science.gov (United States)

    Letham, Benjamin; Letham, Portia A.; Rudin, Cynthia; Browne, Edward P.

    2016-06-01

    Dynamical systems are frequently used to model biological systems. When these models are fit to data, it is necessary to ascertain the uncertainty in the model fit. Here, we present prediction deviation, a metric of uncertainty that determines the extent to which observed data have constrained the model's predictions. This is accomplished by solving an optimization problem that searches for a pair of models that each provides a good fit for the observed data, yet has maximally different predictions. We develop a method for estimating a priori the impact that additional experiments would have on the prediction deviation, allowing the experimenter to design a set of experiments that would most reduce uncertainty. We use prediction deviation to assess uncertainty in a model of interferon-alpha inhibition of viral infection, and to select a sequence of experiments that reduces this uncertainty. Finally, we prove a theoretical result which shows that prediction deviation provides bounds on the trajectories of the underlying true model. These results show that prediction deviation is a meaningful metric of uncertainty that can be used for optimal experimental design.
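
    The core computation can be sketched compactly: among parameter sets that fit the observed data to within a tolerance, find the pair whose predictions at an unobserved condition differ the most. The toy exponential model, tolerance, and brute-force feasible-set search below are assumptions for illustration, not the authors' interferon model or their optimization procedure.

```python
# Prediction deviation on a toy model: among parameter sets that fit the
# observed data to within a tolerance, find the two whose predictions at an
# unobserved time differ the most.
import numpy as np

rng = np.random.default_rng(5)

t_obs = np.array([0.0, 0.5, 1.0])                       # observed time points
y_obs = 2.0 * np.exp(-1.0 * t_obs) + 0.05 * rng.standard_normal(t_obs.size)
t_new = 4.0                                             # unobserved prediction time
tol = 0.02                                              # allowed mean squared misfit

# Sample candidate parameters (a, k) of the model y = a * exp(-k * t).
samples = rng.uniform([0.5, 0.1], [4.0, 3.0], size=(50_000, 2))
a, k = samples[:, 0], samples[:, 1]
fit = a[:, None] * np.exp(-k[:, None] * t_obs[None, :])
mse = np.mean((fit - y_obs) ** 2, axis=1)

feasible = samples[mse <= tol]                          # models that fit the data
preds = feasible[:, 0] * np.exp(-feasible[:, 1] * t_new)

i_hi, i_lo = np.argmax(preds), np.argmin(preds)
print("prediction deviation at t = 4:", preds[i_hi] - preds[i_lo])
print("model A (a, k):", feasible[i_hi], " model B (a, k):", feasible[i_lo])
```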

  19. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    Science.gov (United States)

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaptation, and fold-change detection, respectively.
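
    The central object in both the forward and reverse modes is the Pareto set itself: the designs not dominated in any metric of interest. A minimal non-dominated filter over candidate designs scored on two objectives is sketched below; the candidates and objectives are placeholders, and the mixed-integer dynamic optimization formulation of the record is not reproduced.

```python
# Non-dominated (Pareto) filtering of candidate designs scored on two metrics
# that are both to be minimized, e.g. a performance error and a cost proxy.
import numpy as np

def pareto_front(scores):
    """Boolean mask of non-dominated rows; all objectives are minimized."""
    n = scores.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # A row is dominated by row i if it is no better in every objective
        # and strictly worse in at least one.
        dominated = (np.all(scores >= scores[i], axis=1)
                     & np.any(scores > scores[i], axis=1))
        keep &= ~dominated
    return keep

rng = np.random.default_rng(6)
candidates = rng.random((200, 2))          # (error, cost) of 200 hypothetical circuits
front = candidates[pareto_front(candidates)]
print(f"{len(front)} Pareto-optimal designs out of {len(candidates)}")
```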

  20. Backbone cup – a structure design competition based on topology optimization and 3D printing

    Directory of Open Access Journals (Sweden)

    Zhu Ji-Hong

    2016-01-01

    Full Text Available This paper addresses a structure design competition based on topology optimization and 3D printing, and proposes an experimental approach to efficiently and quickly measure the mechanical performance of structures designed using topology optimization. Since topology optimized structure designs are prone to be geometrically complex, it is extremely inconvenient to fabricate these designs with traditional machining. In this study, we not only fabricated the topology optimized structure designs using a 3D printing technology known as stereolithography (SLA), but also tested the mechanical performance of the produced prototype parts. The finite element method is used to analyze the structural responses, and the consistent results of the numerical simulations and structural experiments prove the validity of this new structure testing approach. This new approach will not only provide rapid access to the verification of topology optimized structure designs, but also significantly cut the turnaround time of structural design.

  1. A Robust Optimization Approach Considering the Robustness of Design Objectives and Constraints

    Institute of Scientific and Technical Information of China (English)

    LIUChun-tao; LINZhi-hang; ZHOUChunojing

    2005-01-01

    The problem of robust design is treated as a multi-objective optimization issue in which the performance mean and variation are optimized and minimized respectively, while maintaining the feasibility of design constraints under uncertainty. To effectively address this issue in robust design, this paper presents a novel robust optimization approach which integrates multi-objective optimization concepts with Taguchi's crossed-array techniques. In this approach, Pareto-optimal robust design solution sets are obtained with the aid of design of experiment set-ups, which utilize the results of Analysis of Variance to quantify the relative dominance and significance of design variables. A beam design problem is used to illustrate the effectiveness of the proposed approach.
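
    The crossed-array idea can be sketched quickly: each inner-array (control factor) run is evaluated against every outer-array (noise factor) run, and the resulting per-run mean and variance become the two objectives whose trade-off yields the Pareto-optimal robust designs. The response function, factor levels, and arrays below are toy assumptions, not the beam problem of the paper.

```python
# Crossed-array evaluation in the spirit of Taguchi robust design: every inner
# (control-factor) run is scored against every outer (noise-factor) run, and
# the per-run mean and variance are the two objectives to trade off.
import itertools
import numpy as np

def response(control, noise):
    x1, x2 = control
    n1, n2 = noise
    return (x1 - 2.0) ** 2 + 0.5 * x2 + x1 * n1 + 0.3 * x2 * n2

inner = np.array(list(itertools.product([1.0, 2.0, 3.0], [0.0, 1.0])))   # control runs
outer = np.array(list(itertools.product([-0.2, 0.2], [-0.5, 0.5])))      # noise runs

for control in inner:
    y = np.array([response(control, noise) for noise in outer])
    print(f"control={control}, mean={y.mean():6.3f}, variance={y.var(ddof=1):6.3f}")
```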

  2. Application of surrogate-based global optimization to aerodynamic design

    CERN Document Server

    Pérez, Esther

    2016-01-01

    Aerodynamic design, like many other engineering applications, is increasingly relying on computational power. The growing need for multi-disciplinarity and high fidelity in design optimization for industrial applications requires a huge number of repeated simulations in order to find an optimal design candidate. The main drawback is that each simulation can be computationally expensive – this becomes an even bigger issue when used within parametric studies, automated search or optimization loops, which typically may require thousands of analysis evaluations. The core issue of a design-optimization problem is the search process involved. However, when facing complex problems, the high-dimensionality of the design space and the high-multi-modality of the target functions cannot be tackled with standard techniques. In recent years, global optimization using meta-models has been widely applied to design exploration in order to rapidly investigate the design space and find sub-optimal solutions. Indeed, surrogat...

  3. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley

    2009-01-01

    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.

  4. Design Time Optimization for Hardware Watermarking Protection of HDL Designs

    Directory of Open Access Journals (Sweden)

    E. Castillo

    2015-01-01

    Full Text Available HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time.

  6. Superconducting Fault Current Limiter optimized design

    Energy Technology Data Exchange (ETDEWEB)

    Tixador, Pascal, E-mail: Pascal.Tixador@grenoble-inp.fr [Univ. Grenoble Alpes, G2Elab – Institut Néel, F-38000 Grenoble (France); CNRS, G2Elab – Institut Néel, F-38000 Grenoble (France); Badel, Arnaud [CNRS, G2Elab – Institut Néel, F-38000 Grenoble (France)

    2015-11-15

    Highlights: • A low cost design of YBCO Fault Current Limiter. • A high resistance conductor for reduced length. • An asymmetrical YBCO conductor (injection and AC losses). • A thickness suitable for non-destructive hot spots. - Abstract: The SuperConducting Fault Current Limiter (SCFCL) appears to be one of the most promising SC applications for electrical grids. Despite its advantages and many successful field experiences, the SCFCL market has had difficulty taking off, even though the first orders for permanent operation in grids have been placed. The analytical design of resistive SCFCLs will be discussed with the objective of reducing the quantity of SC conductor (length and section) to be more cost-effective. For that, the SC conductor must have a high resistivity in the normal state. This can be achieved by using a high-resistivity alloy for the shunt, such as Hastelloy®. One of the most severe constraints is that the SCFCL should operate safely for any fault, especially those with low prospective short-circuit currents. This constraint requires the thickness of the SC tape to be properly designed in order to limit the hot-spot temperature. Operation at 65 K appears very attractive since it decreases the SC cost by at least a factor of 2 with simple LN2 cryogenics. Taking into account the cost reduction in the near future, the SC conductor cost could be rather low, half a dollar per kVA.

  7. Design of experiment approach for the optimization of polybrominated diphenyl ethers determination in fine airborne particulate matter by microwave-assisted extraction and gas chromatography coupled to tandem mass spectrometry.

    Science.gov (United States)

    Beser, Maria Isabel; Beltrán, Joaquim; Yusà, Vicent

    2014-01-03

    A sensitive and selective procedure for the determination of 12 polybrominated diphenyl ethers (BDE-28, BDE-49, BDE-47, BDE-66, BDE-100, BDE-119, BDE-99, BDE-155, BDE-154, BDE-153, BDE-139 and BDE-183) in airborne particulate matter (PM2.5) at trace levels has been developed. The proposed method includes extraction of PM2.5-bound PBDEs by microwave-assisted extraction (MAE) followed by gel permeation chromatography (GPC) clean-up and determination by GC-MS/MS, using a programmed temperature vaporizer (PTV) in large volume injection (LVI) mode to introduce the sample into the chromatographic system. A design of experiments (DoE) approach was used for the optimization of the large volume injection and microwave-assisted extraction parameters to improve the efficiency of these techniques. Other conditions of the method were also studied: GC-MS/MS parameters, extraction solvent and matrix effect. The limit of quantification ranged from 0.063 pg m(-3) to 0.210 pg m(-3) when air volumes of 723 m(3) were collected. Recoveries ranged from 80% to 106%. The method was successfully applied to eight real samples collected from a residential area of the monitoring network of the Region of Valencia Government (Spain) during April-August 2012. BDE-47 and BDE-99 were quantified in six and five samples respectively. The concentrations ranged from 0.063 to 0.112 pg m(-3) for BDE-47, and from 0.107 to 0.212 pg m(-3) for BDE-99.

  8. Experiment Design and Analysis Guide - Neutronics & Physics

    Energy Technology Data Exchange (ETDEWEB)

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  9. A Powerful Optimization Tool for Analog Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    M. Kubar

    2013-09-01

    Full Text Available This paper presents a new optimization tool for analog circuit design. The proposed tool is based on a robust version of the differential evolution optimization method. Corners of technology, temperature, and voltage and current supplies are taken into account during the optimization, which ensures robust resulting circuits. Those circuits usually do not need any schematic change and are ready for layout. The newly developed tool is implemented directly in the Cadence design environment to achieve a very short setup time for the optimization task. The design automation procedure was enhanced by an optimization watchdog feature, created to monitor optimization progress and to reduce the search space, producing better designs in a shorter time. The optimization algorithm presented in this paper was successfully tested on several design examples.
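
    A minimal differential evolution loop of the DE/rand/1/bin flavor is sketched below, with each candidate scored by its worst case over a small set of corner conditions to mirror how corners enter the objective. The analytic cost function, the corner list, and the control parameters are stand-ins for a real circuit simulation and are assumptions, not the tool's actual implementation.

```python
# Minimal differential evolution (DE/rand/1/bin) where each candidate is scored
# by its worst case over a small set of "corners". The analytic cost function
# is a stand-in for a circuit simulation.
import numpy as np

rng = np.random.default_rng(7)
corners = [0.9, 1.0, 1.1]                    # e.g. supply-voltage scaling factors (assumed)
lower, upper = np.array([0.1, 0.1]), np.array([10.0, 10.0])

def cost(x, corner):
    w1, w2 = x * corner                       # toy "device sizes" scaled by the corner
    return (w1 - 3.0) ** 2 + (w2 - 5.0) ** 2 + 0.1 * w1 * w2

def worst_case(x):
    return max(cost(x, c) for c in corners)

n_pop, n_gen, F, CR = 20, 150, 0.7, 0.9
pop = rng.uniform(lower, upper, size=(n_pop, 2))
fit = np.array([worst_case(x) for x in pop])

for _ in range(n_gen):
    for i in range(n_pop):
        idx = rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = np.clip(a + F * (b - c), lower, upper)
        cross = rng.random(2) < CR
        cross[rng.integers(2)] = True                    # ensure at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        f_trial = worst_case(trial)
        if f_trial <= fit[i]:
            pop[i], fit[i] = trial, f_trial

print("best sizing:", pop[np.argmin(fit)], "worst-case cost:", fit.min())
```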

  10. Design and Optimization Method of a Two-Disk Rotor System

    Science.gov (United States)

    Huang, Jingjing; Zheng, Longxi; Mei, Qing

    2016-04-01

    An integrated analytical method based on the multidisciplinary optimization software Isight and the general finite element software ANSYS is proposed in this paper. Firstly, a two-disk rotor system was established, and its modes, harmonic response and transient response under acceleration conditions were analyzed with ANSYS, yielding the dynamic characteristics of the two-disk rotor system. On this basis, the two-disk rotor model was integrated into the multidisciplinary design optimization software Isight. According to the design of experiments (DOE) and the dynamic characteristics, the optimization variables, optimization objectives and constraints were confirmed. After that, the multi-objective design optimization of the transient process was carried out with three different global optimization algorithms: the Evolutionary Optimization Algorithm, the Multi-Island Genetic Algorithm and the Pointer Automatic Optimizer. The optimum position of the two-disk rotor system was obtained under the specified constraints. Meanwhile, the accuracy and number of evaluations of the different optimization algorithms were compared. The optimization results indicated that the rotor vibration reached its minimum value and that the design efficiency and quality were improved by the multidisciplinary design optimization while meeting the design requirements, which provides a reference for improving the design efficiency and reliability of aero-engine rotors.

  11. Genetic algorithms for optimal design and control of adaptive structures

    CERN Document Server

    Ribeiro, R; Dias-Rodrigues, J; Vaz, M

    2000-01-01

    Future High Energy Physics experiments require the use of light and stable structures to support their most precise radiation detection elements. These large structures must be light, highly stable, stiff and radiation tolerant in an environment where external vibrations, high radiation levels, material aging, temperature and humidity gradients are not negligible. Unforeseen factors and the unknown result of the coupling of environmental conditions, together with external vibrations, may affect the position stability of the detectors and their support structures, compromising their physics performance. Careful optimization of static and dynamic behavior must be an essential part of the engineering design. Genetic Algorithms (GA) belong to the group of probabilistic algorithms, combining elements of direct and stochastic search. They are more robust than existing directed search methods with the advantage of maintaining a population of potential solutions. There is a class of optimization problems for which Ge...

  12. Design and global optimization of high-efficiency thermophotovoltaic systems.

    Science.gov (United States)

    Bermel, Peter; Ghebrebrhan, Michael; Chan, Walker; Yeng, Yi Xiang; Araghchini, Mohammad; Hamam, Rafif; Marton, Christopher H; Jensen, Klavs F; Soljačić, Marin; Joannopoulos, John D; Johnson, Steven G; Celanovic, Ivan

    2010-09-13

    Despite their great promise, small experimental thermophotovoltaic (TPV) systems at 1000 K generally exhibit extremely low power conversion efficiencies (approximately 1%), due to heat losses such as thermal emission of undesirable mid-wavelength infrared radiation. Photonic crystals (PhC) have the potential to strongly suppress such losses. However, PhC-based designs present a set of non-convex optimization problems requiring efficient objective function evaluation and global optimization algorithms. Both are applied to two example systems: improved micro-TPV generators and solar thermal TPV systems. Micro-TPV reactors experience up to a 27-fold increase in their efficiency and power output; solar thermal TPV systems see an even greater 45-fold increase in their efficiency (exceeding the Shockley-Queisser limit for a single-junction photovoltaic cell).

  13. Designing the optimal robotic milking barn by applying a queuing network approach

    NARCIS (Netherlands)

    Halachmi, I.; Adan, I.J.B.F.; Wald, van der J.; Beek, van P.; Heesterbeek, J.A.P.

    2003-01-01

    The design of various conventional dairy barns is based on centuries of experience, but there is hardly any experience with robotic milking barns (RMB). Furthermore, as each farmer has his own management practices, the optimal layout is 'site dependent'. A new universally applicable design methodolo

  14. Parameter optimization of CRS seatbelt constraint paths based on design of experiments (DOE)

    Institute of Scientific and Technical Information of China (English)

    韩勇; 谢金萍; 卢晓萍; 王方; 黄红武; 水野幸治

    2015-01-01

    A method to optimize the parameters of the seatbelt constraint paths was developed using a design of experiments (DOE) approach to improve the safety performance of a seatbelt-fixed type of child restraint system (CRS). A frontal crash simulation model was set up using an FE model of the CRS and a Hybrid III 3-year-old child dummy, and the simulation results were verified with sled tests. The influences of the seatbelt guide ring positions, the seat skeleton thickness and the friction coefficient between the child and the five-point harness on injury parameters, such as the head injury criterion (HIC15) value, chest resultant acceleration and chest vertical acceleration, were analyzed after improving the CRS framework structure and by using an orthogonal experimental design to select the parameters. The results show that the maximum head displacement is reduced from 1.5% above the regulation threshold (550 mm) to 15% below the threshold. The seatbelt guide ring position is the most sensitive factor; therefore, optimizing the guide ring position and reducing the friction coefficient can reduce the maximum displacement of the head.

  15. Configurable intelligent optimization algorithm design and practice in manufacturing

    CERN Document Server

    Tao, Fei; Laili, Yuanjun

    2014-01-01

    Presenting the concept and design and implementation of configurable intelligent optimization algorithms in manufacturing systems, this book provides a new configuration method to optimize manufacturing processes. It provides a comprehensive elaboration of basic intelligent optimization algorithms, and demonstrates how their improvement, hybridization and parallelization can be applied to manufacturing. Furthermore, various applications of these intelligent optimization algorithms are exemplified in detail, chapter by chapter. The intelligent optimization algorithm is not just a single algorit

  16. Optimizing Adhesive Design by Understanding Compliance.

    Science.gov (United States)

    King, Daniel R; Crosby, Alfred J

    2015-12-23

    Adhesives have long been designed around a trade-off between adhesive strength and releasability. Geckos are of interest because they are the largest organisms which are able to climb utilizing adhesive toepads, yet can controllably release from surfaces and perform this action over and over again. Attempting to replicate the hierarchical, nanoscopic features which cover their toepads has been the primary focus of the adhesives field until recently. A new approach based on a scaling relation which states that reversible adhesive force capacity scales with (A/C)^(1/2), where A is the area of contact and C is the compliance of the adhesive, has enabled the creation of high strength, reversible adhesives without requiring high aspect ratio, fibrillar features. Here we introduce an equation to calculate the compliance of adhesives, and utilize this equation to predict the shear adhesive force capacity of the adhesive based on the material components and geometric properties. Using this equation, we have investigated important geometric parameters which control force capacity and have shown that by controlling adhesive shape, adhesive force capacity can be increased by over 50% without varying pad size. Furthermore, we have demonstrated that compliance of the adhesive far from the interface still influences shear adhesive force capacity. Utilizing this equation will allow for the production of adhesives which are optimized for specific applications in commercial and industrial settings.
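
    The scaling relation can be turned into a small calculator once a compliance model is chosen. The sketch below treats the pad and its loading tab as springs in series and evaluates F_c = (G_c A / C)^(1/2) for pads of equal area but different shape; the compliance expressions, the adhesion energy G_c and all dimensions are illustrative assumptions rather than the equation introduced in the paper.

```python
# Force-capacity scaling F_c = (G_c * A / C)**0.5 for a reversible adhesive pad,
# where A is the contact area and C a rough shear compliance of the pad plus
# its loading tab treated as springs in series. All expressions and numbers
# are illustrative assumptions.
import numpy as np

def shear_compliance(width, length, thickness, modulus, tab_length, tab_thickness):
    pad = thickness / (modulus * width * length)          # shear of the bonded pad
    tab = tab_length / (modulus * width * tab_thickness)  # stretching of the free tab
    return pad + tab                                      # springs in series

def force_capacity(width, length, **kw):
    A = width * length
    C = shear_compliance(width, length, **kw)
    G_c = 50.0                                            # J/m^2, assumed adhesion energy
    return np.sqrt(G_c * A / C)

base = dict(thickness=1e-3, modulus=1e6, tab_length=0.05, tab_thickness=1e-3)
for width, length in [(0.02, 0.02), (0.04, 0.01), (0.01, 0.04)]:   # same area, different shape
    F = force_capacity(width, length, **base)
    print(f"{width*100:.0f} cm x {length*100:.0f} cm pad: F_c ~ {F:.1f} N")
```

    Even in this crude form, the same pad area gives different predicted capacities once the shape (and hence the far-field tab compliance) changes, consistent with the shape effect reported above.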

  17. Using Animal Instincts to Design Efficient Biomedical Studies via Particle Swarm Optimization.

    Science.gov (United States)

    Qiu, Jiaheng; Chen, Ray-Bing; Wang, Weichung; Wong, Weng Kee

    2014-10-01

    Particle swarm optimization (PSO) is an increasingly popular metaheuristic algorithm for solving complex optimization problems. Its popularity is due to its repeated successes in finding an optimum or a near optimal solution for problems in many applied disciplines. The algorithm makes no assumption of the function to be optimized and for biomedical experiments like those presented here, PSO typically finds the optimal solutions in a few seconds of CPU time on a garden-variety laptop. We apply PSO to find various types of optimal designs for several problems in the biological sciences and compare PSO performance relative to the differential evolution algorithm, another popular metaheuristic algorithm in the engineering literature.

  18. Optimization Design and Application of Underground Reinforced Concrete Bifurcation Pipe

    Directory of Open Access Journals (Sweden)

    Chao Su

    2015-01-01

    Full Text Available Underground reinforced concrete bifurcation pipes are an important part of conveyance structures. During construction, the workload of excavation and concrete pouring can be significantly decreased with an optimized pipe structure, and the engineering quality can be improved. This paper presents an optimization mathematical model of the underground reinforced concrete bifurcation pipe structure, based on the real working status of several common pipe structures from real cases. Then, an optimization design system was developed based on the Particle Swarm Optimization algorithm. Furthermore, taking the bifurcation pipe of a hydropower station as an example, optimization analysis was conducted, and the accuracy and stability of the optimization design system were verified successfully.

  19. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2014-01-01

    This paper presents a series of new, advanced topology optimization methods, developed specifically for conceptual architectural design of structures. The proposed computational procedures are implemented as components in the framework of a Grasshopper plugin, providing novel capacities...... in topological optimization: Interactive control and continuous visualization; embedding flexible voids within the design space; consideration of distinct tension / compression properties; and optimization of dual material systems. In extension, optimization procedures for skeletal structures such as trusses...... and frames are implemented. The developed procedures allow for the exploration of new territories in optimization of architectural structures, and offer new methodological strategies for bridging conceptual gaps between optimization and architectural practice....

  20. Involving Motion Graphics in Spatial Experience Design

    DEFF Research Database (Denmark)

    Steijn, Arthur

    2013-01-01

    process, and therefore it should be constructed as such. Since the development of the design model has this double focus, I involve design students in design laboratories related to my practice as a teacher in visual communication design and production design. I also reflect on how an initial design...... conceptualization of various design elements in an analysis of the way in which these elements are integrated and used in the creation of particular experiences of space, atmosphere and artistic expression. On the basis of this analysis I present a preliminary construction of a design model including some design...... laboratory may be set-up and used. Finally I describe how this design lab can be used as a method for creating both practical knowledge of the manner in which the design model may be applied in actual design processes, as well as how these processes can be used for pointing out design elements or dimensions...

  1. An Efficient Method for Reliability-based Multidisciplinary Design Optimization

    Institute of Scientific and Technical Information of China (English)

    Fan Hui; Li Weiji

    2008-01-01

    Design for modern engineering systems is becoming multidisciplinary and incorporates practical uncertainties; therefore, it is necessary to synthesize reliability analysis and multidisciplinary design optimization (MDO) techniques for the design of complex engineering systems. An advanced first-order second-moment method-based concurrent subspace optimization approach is proposed based on the comparison and analysis of the existing multidisciplinary optimization techniques and reliability analysis methods. It is seen through a canard configuration optimization for a three-surface transport that the proposed method is computationally efficient and practical, with the least modification to the current deterministic optimization process.
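
    As a hedged illustration of the first-order second-moment idea mentioned above (not the authors' formulation), the following sketch estimates a reliability index for a simple limit state g = R - S with independent normal resistance and load; the numbers are invented.

      import math

      def fosm_beta(mu_r, sigma_r, mu_s, sigma_s):
          """First-order second-moment reliability index for g = R - S with
          independent, normally distributed resistance R and load S."""
          mu_g = mu_r - mu_s
          sigma_g = math.sqrt(sigma_r**2 + sigma_s**2)
          return mu_g / sigma_g

      beta = fosm_beta(mu_r=300.0, sigma_r=30.0, mu_s=200.0, sigma_s=40.0)
      print(f"reliability index beta = {beta:.2f}")  # beta = 2.00 for these numbers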

  2. Topology optimization problems with design-dependent sets of constraints

    DEFF Research Database (Denmark)

    Schou, Marie-Louise Højlund

    Topology optimization is a design tool which is used in numerous fields. It can be used whenever the design is driven by weight and strength considerations. The basic concept of topology optimization is the interpretation of partial differential equation coefficients as effective material...... structural topology optimization problems. For such problems a stress constraint for an element should only be present in the optimization problem when the structural design variable corresponding to this element has a value greater than zero. We model the stress constrained topology optimization problem...... using both discrete and continuous design variables. Using discrete design variables is the natural modeling frame. However, we cannot solve real-size problems with the technological limits of today. Using continuous design variables makes it possible to also study topology optimization problems...

  3. Optimization design of blade shapes for wind turbines

    DEFF Research Database (Denmark)

    Chen, Jin; Wang, Xudong; Shen, Wen Zhong

    2010-01-01

    For the optimal design of wind turbines, new normal and tangential induction factors are derived that account for the tip loss of the normal and tangential forces, based on blade element momentum theory and a traditional aerodynamic model. The cost model of the wind turbines...... and the optimization design model are developed. In the optimization model, the objective is the minimum cost of energy, and the design variables are the chord length, twist angle and relative thickness. Finally, the optimization is carried out for a 2 MW blade using this optimization design model...... The performance of the blades is validated through comparison and analysis of the results. The reduced cost shows that the optimization model is good enough for the design of wind turbines. The results provide support for the design and research of large-scale wind turbine blades and also establish......
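
    The fixed-point iteration at the heart of blade element momentum theory can be sketched as follows; this simplified version holds the lift and drag coefficients constant and omits the tip-loss correction discussed in the record, and the element parameters are assumed for illustration only.

      import math

      def bem_induction(local_tsr, solidity, cl, cd, iters=200, tol=1e-8):
          """Fixed-point iteration for the axial (a) and tangential (a_t) induction
          factors of one blade element; cl and cd are held constant and tip loss
          is omitted for brevity."""
          a, a_t = 0.0, 0.0
          for _ in range(iters):
              phi = math.atan2(1.0 - a, local_tsr * (1.0 + a_t))   # inflow angle
              cn = cl * math.cos(phi) + cd * math.sin(phi)         # normal force coefficient
              ct = cl * math.sin(phi) - cd * math.cos(phi)         # tangential force coefficient
              a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (solidity * cn) + 1.0)
              a_t_new = 1.0 / (4.0 * math.sin(phi) * math.cos(phi) / (solidity * ct) - 1.0)
              if abs(a_new - a) < tol and abs(a_t_new - a_t) < tol:
                  break
              a, a_t = a_new, a_t_new
          return a_new, a_t_new

      # Hypothetical element: local tip speed ratio 3, solidity 0.05, Cl = 0.8, Cd = 0.01
      print(bem_induction(local_tsr=3.0, solidity=0.05, cl=0.8, cd=0.01))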

  4. Fuel cell cathode air filters: Methodologies for design and optimization

    Science.gov (United States)

    Kennedy, Daniel M.; Cahela, Donald R.; Zhu, Wenhua H.; Westrom, Kenneth C.; Nelms, R. Mark; Tatarchuk, Bruce J.

    Proton exchange membrane (PEM) fuel cells experience performance degradation, such as reduction in efficiency and life, as a result of poisoning of platinum catalysts by airborne contaminants. Research on these contaminant effects suggests that the best possible solution to allowing fuel cells to operate in contaminated environments is by filtration of the harmful contaminants from the cathode air. A cathode air filter design methodology was created that connects properties of the cathode air stream, filter design options, and filter footprint, to a set of adsorptive filter parameters that must be optimized to efficiently operate the fuel cell. Filter optimization requires a study of the trade-off between two causal factors of power loss: first, a reduction in power production due to poisoning of the platinum catalyst by chemical contaminants and second, an increase in power requirements to operate the air compressor with a larger pressure drop from additional contaminant filtration. The design methodology was successfully applied to a 1.2 kW fuel cell using a programmable algorithm and predictions were made about the relationships between inlet concentration, breakthrough time, filter design, pressure drop, and compressor power requirements.
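
    The trade-off described above can be framed as a small cost function: total power loss equals the loss from catalyst poisoning at a given filtration depth plus the compressor power needed to overcome the filter pressure drop. The sketch below is schematic, with invented model forms and numbers, and is not the authors' methodology.

      import numpy as np

      def total_power_loss(filter_depth_cm,
                           poisoning_loss_W=120.0,   # hypothetical loss with no filtration
                           removal_per_cm=0.8,       # hypothetical contaminant removal rate
                           dp_per_cm_Pa=250.0,       # hypothetical pressure drop per cm of media
                           flow_m3_s=1.5e-3,
                           blower_eff=0.5):
          """Schematic trade-off: a deeper filter bed removes more contaminant
          (less catalyst poisoning) but costs more compressor power."""
          poisoning = poisoning_loss_W * np.exp(-removal_per_cm * filter_depth_cm)
          compressor = dp_per_cm_Pa * filter_depth_cm * flow_m3_s / blower_eff
          return poisoning + compressor

      depths = np.linspace(0.0, 10.0, 101)
      losses = total_power_loss(depths)
      print(f"loss-minimizing bed depth ~ {depths[np.argmin(losses)]:.1f} cm")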

  5. Design of experiments in production engineering

    CERN Document Server

    2016-01-01

    This book covers design of experiments (DoE) applied in production engineering as a combination of manufacturing technology with applied management science. It presents recent research advances and applications of design of experiments in production engineering; the chapters cover metal cutting tools, soft computing for modelling and optimization of machining, and waterjet machining of high-performance ceramics, among others.

  6. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Robust design methodology aims at reducing the variability in the product performance in the presence of noise factors. Experiments involving simultaneous optimization of more than one quality characteristic are known as multiresponse experiments which are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters during a particular operation of rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of type Nominal-the-best, Smaller-the-better and Fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using L9 orthogonal array.
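
    For reference, the signal-to-noise ratios conventionally used in Taguchi-style robust design for two of the response types named above (nominal-the-best and smaller-the-better) can be computed as in the sketch below; the replicate data are invented.

      import numpy as np

      def sn_smaller_the_better(y):
          """S/N = -10 * log10( mean(y^2) )"""
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(y**2))

      def sn_nominal_the_best(y):
          """S/N = 10 * log10( ybar^2 / s^2 )"""
          y = np.asarray(y, dtype=float)
          return 10.0 * np.log10(y.mean()**2 / y.var(ddof=1))

      # Hypothetical replicate measurements from one row of an L9 array
      run = [10.2, 10.4, 9.9]
      print(sn_nominal_the_best(run), sn_smaller_the_better(run))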

  7. Advances in metaheuristic algorithms for optimal design of structures

    CERN Document Server

    Kaveh, A

    2014-01-01

    This book presents efficient metaheuristic algorithms for optimal design of structures. Many of these algorithms are developed by the author and his colleagues, consisting of Democratic Particle Swarm Optimization, Charged System Search, Magnetic Charged System Search, Field of Forces Optimization, Dolphin Echolocation Optimization, Colliding Bodies Optimization, Ray Optimization. These are presented together with algorithms which were developed by other authors and have been successfully applied to various optimization problems. These consist of Particle Swarm Optimization, Big Bang-Big Crunch Algorithm, Cuckoo Search Optimization, Imperialist Competitive Algorithm, and Chaos Embedded Metaheuristic Algorithms. Finally a multi-objective optimization method is presented to solve large-scale structural problems based on the Charged System Search algorithm. The concepts and algorithms presented in this book are not only applicable to optimization of skeletal structures and finite element models, but can equally ...

  8. Advances in metaheuristic algorithms for optimal design of structures

    CERN Document Server

    Kaveh, A

    2017-01-01

    This book presents efficient metaheuristic algorithms for optimal design of structures. Many of these algorithms are developed by the author and his colleagues, consisting of Democratic Particle Swarm Optimization, Charged System Search, Magnetic Charged System Search, Field of Forces Optimization, Dolphin Echolocation Optimization, Colliding Bodies Optimization, Ray Optimization. These are presented together with algorithms which were developed by other authors and have been successfully applied to various optimization problems. These consist of Particle Swarm Optimization, Big Bang-Big Crunch Algorithm, Cuckoo Search Optimization, Imperialist Competitive Algorithm, and Chaos Embedded Metaheuristic Algorithms. Finally a multi-objective optimization method is presented to solve large-scale structural problems based on the Charged System Search algorithm. The concepts and algorithms presented in this book are not only applicable to optimization of skeletal structures and finite element models, but can equally ...

  9. Smashing UX design foundations for designing online user experiences

    CERN Document Server

    Allen, Jesmond

    2012-01-01

    The ultimate guide to UX from the world's most popular resource for web designers and developers. Smashing Magazine is the world's most popular resource for web designers and developers, and with this book the authors provide the pinnacle resource to becoming savvy with User Experience Design (UX). The authors first provide an overview of UX and chart its rise to becoming a valuable and necessary practice for narrowing the gap between Web sites, applications, and users in order to make a user's experience a happy, easy, and successful one. Examines the essential aspects of User Experience Design

  10. Superconducting Fault Current Limiter optimized design

    Science.gov (United States)

    Tixador, Pascal; Badel, Arnaud

    2015-11-01

    The SuperConducting Fault Current Limiter (SCFCL) appears to be one of the most promising SC applications for electrical grids. Despite its advantages and many successful field experiences, the SCFCL market has had difficulty taking off, even though the first orders for permanent operation in grids have been placed. The analytical design of resistive SCFCLs will be discussed with the objective of reducing the quantity of SC conductor (length and cross-section) to be more cost-effective. For that, the SC conductor must have a high resistivity in the normal state. This can be achieved by using a high-resistivity alloy, such as Hastelloy®, for the shunt. One of the most severe constraints is that the SCFCL should operate safely for any fault, especially those with low prospective short-circuit currents. This constraint requires the thickness of the SC tape to be properly designed in order to limit the hot-spot temperature. Operation at 65 K appears very attractive since it decreases the SC cost by at least a factor of 2 with simple LN2 cryogenics. Taking into account cost reductions in the near future, the SC conductor cost could be rather low, about half a dollar per kVA.

  11. An optimization method for metamorphic mechanisms based on multidisciplinary design optimization

    Directory of Open Access Journals (Sweden)

    Zhang Wuxiang

    2014-12-01

    The optimization of metamorphic mechanisms is different from that of the conventional mechanisms for its characteristics of multi-configuration. There exist complex coupled design variables and constraints in its multiple different configuration optimization models. To achieve the compatible optimized results of these coupled design variables, an optimization method for metamorphic mechanisms is developed in the paper based on the principle of multidisciplinary design optimization (MDO). Firstly, the optimization characteristics of the metamorphic mechanism are summarized distinctly by proposing the classification of design variables and constraints as well as coupling interactions among its different configuration optimization models. Further, collaborative optimization technique which is used in MDO is adopted for achieving the overall optimization performance. The whole optimization process is then proposed by constructing a two-level hierarchical scheme with global optimizer and configuration optimizer loops. The method is demonstrated by optimizing a planar five-bar metamorphic mechanism which has two configurations, and results show that it can achieve coordinated optimization results for the same parameters in different configuration optimization models.

  12. An optimization method for metamorphic mechanisms based on multidisciplinary design optimization

    Institute of Scientific and Technical Information of China (English)

    Zhang Wuxiang; Wu Teng; Ding Xilun

    2014-01-01

    The optimization of metamorphic mechanisms is different from that of the conventional mechanisms for its characteristics of multi-configuration. There exist complex coupled design variables and constraints in its multiple different configuration optimization models. To achieve the compatible optimized results of these coupled design variables, an optimization method for metamorphic mechanisms is developed in the paper based on the principle of multidisciplinary design optimization (MDO). Firstly, the optimization characteristics of the metamorphic mechanism are summarized distinctly by proposing the classification of design variables and constraints as well as coupling interactions among its different configuration optimization models. Further, collaborative optimization technique which is used in MDO is adopted for achieving the overall optimization performance. The whole optimization process is then proposed by constructing a two-level hierarchical scheme with global optimizer and configuration optimizer loops. The method is demonstrated by optimizing a planar five-bar metamorphic mechanism which has two configurations, and results show that it can achieve coordinated optimization results for the same parameters in different configuration optimization models.

  13. Chemical optimization algorithm for fuzzy controller design

    CERN Document Server

    Astudillo, Leslie; Castillo, Oscar

    2014-01-01

    In this book, a novel optimization method inspired by a paradigm from nature is introduced. The chemical reactions are used as a paradigm to propose an optimization method that simulates these natural processes. The proposed algorithm is described in detail and then a set of typical complex benchmark functions is used to evaluate the performance of the algorithm. Simulation results show that the proposed optimization algorithm can outperform other methods in a set of benchmark functions. This chemical reaction optimization paradigm is also applied to solve the tracking problem for the dynamic model of a unicycle mobile robot by integrating a kinematic and a torque controller based on fuzzy logic theory. Computer simulations are presented confirming that this optimization paradigm is able to outperform other optimization techniques applied to this particular robot application

  14. Fast Bayesian optimal experimental design for seismic source inversion

    KAUST Repository

    Long, Quan

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem. © 2015 Elsevier B.V.
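
    The Laplace-approximation shortcut described above can be illustrated in a linear-Gaussian toy setting, where the expected information gain reduces to a log-determinant of prior and posterior covariances; the design matrices and noise level below are assumed for illustration and are unrelated to the seismic problem.

      import numpy as np

      def gaussian_eig(G, prior_cov, noise_var):
          """Expected information gain (nats) for a linear-Gaussian model
          y = G @ theta + noise: 0.5 * log det(I + prior_cov @ G.T @ G / noise_var).
          Under a Laplace approximation, a weakly nonlinear forward model reduces
          to this form with G replaced by the local Jacobian."""
          d = prior_cov.shape[0]
          M = np.eye(d) + prior_cov @ G.T @ G / noise_var
          _, logdet = np.linalg.slogdet(M)
          return 0.5 * logdet

      rng = np.random.default_rng(1)
      prior = np.eye(3)                                           # unit prior covariance
      design_a = rng.normal(size=(5, 3))                          # sensitivities for 5 receivers
      design_b = np.vstack([design_a, rng.normal(size=(5, 3))])   # the same design plus 5 receivers
      print(gaussian_eig(design_a, prior, 0.1), gaussian_eig(design_b, prior, 0.1))
      # Adding receivers increases the expected information gain, as noted in the record.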

  15. Fast Bayesian Optimal Experimental Design for Seismic Source Inversion

    KAUST Repository

    Long, Quan

    2016-01-06

    We develop a fast method for optimally designing experiments [1] in the context of statistical seismic source inversion [2]. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by the elastic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the true parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.

  16. Optimization of minoxidil microemulsions using fractional factorial design approach.

    Science.gov (United States)

    Jaipakdee, Napaphak; Limpongsa, Ekapol; Pongjanyakul, Thaned

    2016-01-01

    The objective of this study was to apply fractional factorial and multi-response optimization designs using desirability function approach for developing topical microemulsions. Minoxidil (MX) was used as a model drug. Limonene was used as an oil phase. Based on solubility, Tween 20 and caprylocaproyl polyoxyl-8 glycerides were selected as surfactants, propylene glycol and ethanol were selected as co-solvent in aqueous phase. Experiments were performed according to a two-level fractional factorial design to evaluate the effects of independent variables: Tween 20 concentration in surfactant system (X1), surfactant concentration (X2), ethanol concentration in co-solvent system (X3), limonene concentration (X4) on MX solubility (Y1), permeation flux (Y2), lag time (Y3), deposition (Y4) of MX microemulsions. It was found that Y1 increased with increasing X3 and decreasing X2, X4; whereas Y2 increased with decreasing X1, X2 and increasing X3. While Y3 was not affected by these variables, Y4 increased with decreasing X1, X2. Three regression equations were obtained and calculated for predicted values of responses Y1, Y2 and Y4. The predicted values matched experimental values reasonably well with high determination coefficient. By using optimal desirability function, optimized microemulsion demonstrating the highest MX solubility, permeation flux and skin deposition was confirmed as low level of X1, X2 and X4 but high level of X3.
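
    A two-level fractional factorial layout and a simple overall desirability score of the kind used in the study can be sketched as follows; the generator, response values and desirability limits are placeholders, not the published design.

      import itertools
      import numpy as np

      # 2^(4-1) fractional factorial in coded units, placeholder generator X4 = X1*X2*X3
      base = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
      design = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])  # 8 runs x 4 factors

      def desirability_larger(y, lo, hi):
          """Larger-is-better desirability, linear between lo (d = 0) and hi (d = 1)."""
          return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

      def overall_desirability(ds):
          """Geometric mean of the individual desirabilities."""
          return float(np.prod(ds) ** (1.0 / len(ds)))

      # Hypothetical fitted responses (e.g. solubility and flux) at one candidate setting
      y1, y2 = 4.2, 18.0
      D = overall_desirability([desirability_larger(y1, 2.0, 6.0),
                                desirability_larger(y2, 5.0, 25.0)])
      print(design.shape, round(D, 3))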

  17. Optimized Solution of Two Bar Truss Design Using Intuitionistic Fuzzy Optimization Technique

    Directory of Open Access Journals (Sweden)

    Samir Dey

    2014-08-01

    The main goal of structural optimization is to minimize the weight of a structure or the vertical deflection of a loaded joint while satisfying all design requirements imposed by design codes. In general, fuzzy sets are used to analyze fuzzy structural optimization. In this paper, a planar truss structural model in an intuitionistic fuzzy environment has been developed. The paper proposes an intuitionistic fuzzy optimization approach to solve a non-linear programming problem in the context of a structural application. This approximation approach is used to solve a structural optimization model with weight as the objective function. The intuitionistic fuzzy optimization (IFO) approach is illustrated on a two-bar truss structural design problem. The result of the intuitionistic fuzzy optimization is compared with other results of optimization algorithms from the literature. It is shown that the proposed intuitionistic fuzzy optimization approach is more efficient than the analogous fuzzy technique for structural design.

  18. Comparison of optimal design methods in inverse problems

    Science.gov (United States)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77; De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68; Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
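
    For concreteness, the D- and E-optimality criteria mentioned above can be evaluated from a Fisher information matrix assembled over candidate sampling times; the sketch below uses a simple exponential-decay response as a stand-in model and is not taken from the paper.

      import numpy as np

      def fim_exponential(times, theta):
          """Fisher information matrix for y(t) = a * exp(-b t) with unit noise
          variance; theta = (a, b). Rows of the sensitivity matrix S are dy/dtheta."""
          a, b = theta
          t = np.asarray(times, dtype=float)
          S = np.column_stack([np.exp(-b * t), -a * t * np.exp(-b * t)])
          return S.T @ S

      def d_criterion(F):
          return np.linalg.det(F)          # D-optimal designs maximize this

      def e_criterion(F):
          return np.linalg.eigvalsh(F)[0]  # E-optimal designs maximize the smallest eigenvalue

      theta = (2.0, 0.5)
      uniform = np.linspace(0.0, 8.0, 9)   # two candidate sampling schedules
      early = np.linspace(0.0, 4.0, 9)
      for name, t in [("uniform", uniform), ("early", early)]:
          F = fim_exponential(t, theta)
          print(name, d_criterion(F), e_criterion(F))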

  19. PARAMETER COORDINATION AND ROBUST OPTIMIZATION FOR MULTIDISCIPLINARY DESIGN

    Institute of Scientific and Technical Information of China (English)

    HU Jie; PENG Yinghong; XIONG Guangleng

    2006-01-01

    A new parameter coordination and robust optimization approach for multidisciplinary design is presented. Firstly, a constraint network model is established to support engineering change, coordination and optimization. In this model, interval boxes are adopted to describe the uncertainty of design parameters quantitatively and to enhance design robustness. Secondly, a parameter coordination method is presented to solve the constraint network model, monitor potential conflicts due to engineering changes, and obtain the consistent solution space corresponding to the given product specifications. Finally, the robust parameter optimization model is established, and a genetic algorithm is used to obtain the robust optimal parameters. An example of bogie design is analyzed to show that the scheme is effective.

  20. Design and analysis of experiments

    CERN Document Server

    Hinkelmann, Klaus

    This book discusses special modifications and extensions of designs that arise in certain fields of application such as genetics, bioinformatics, agriculture, medicine, manufacturing, marketing, etc. Well-known and highly-regarded contributors have written individual chapters that have been extensively reviewed by the Editor to ensure that each individual contribution relates to material found in Volumes 1 and 2 of this book series. The chapters in Volume 3 have an introductory/historical component and proceed to a more advanced technical level to discuss the latest results and future developm

  1. Application of the optimal Latin hypercube design and radial basis function network to collaborative optimization

    Institute of Scientific and Technical Information of China (English)

    ZHAO Min; CUI Wei-cheng

    2007-01-01

    Improving the efficiency of ship optimization is crucial for modern ship design. Compared with traditional methods, multidisciplinary design optimization (MDO) is a more promising approach. For this reason, Collaborative Optimization (CO) is discussed and analyzed in this paper. As one of the most frequently applied MDO methods, CO promotes autonomy of disciplines while providing a coordinating mechanism guaranteeing progress toward an optimum and maintaining interdisciplinary compatibility. However, there are some difficulties in applying the conventional CO method, such as difficulties in choosing an initial point and tremendous computational requirements. For the purpose of overcoming these problems, optimal Latin hypercube design and Radial basis function network were applied to CO. Optimal Latin hypercube design is a modified Latin Hypercube design. Radial basis function network approximates the optimization model, and is updated during the optimization process to improve accuracy. It is shown by examples that the computing efficiency and robustness of this CO method are higher than with the conventional CO method.
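
    A minimal version of the surrogate-building step described above, pairing (plain, non-optimized) Latin hypercube sampling with a radial basis function interpolator, might look like the following SciPy-based sketch; the test function is a placeholder for a costly discipline analysis, not a ship-design model.

      import numpy as np
      from scipy.stats import qmc
      from scipy.interpolate import RBFInterpolator

      def expensive_discipline(x):
          """Placeholder for a costly discipline analysis (e.g. a CFD or FE run)."""
          return np.sin(3.0 * x[:, 0]) + (x[:, 1] - 0.5) ** 2

      # 1) Space-filling sample of the design space with a Latin hypercube
      sampler = qmc.LatinHypercube(d=2, seed=0)
      X = sampler.random(n=30)                 # 30 points in [0, 1]^2
      y = expensive_discipline(X)

      # 2) Radial basis function surrogate, cheap to evaluate inside the optimizer
      surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

      # 3) The surrogate stands in for the discipline model during collaborative optimization
      X_check = sampler.random(n=5)
      print(np.c_[expensive_discipline(X_check), surrogate(X_check)])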

  2. Democratic design experiments: between parliament and laboratory

    DEFF Research Database (Denmark)

    Binder, Thomas; Brandt, Eva; Ehn, Pelle;

    2015-01-01

    been performed and accomplished in participatory practices. In this article we discuss how participatory design may be reinvigorated as a design research programme for democratic design experiments in the light of the de-centring of human-centredness and the foregrounding of collaborative......For more than four decades participatory design has provided exemplars and concepts for understanding the democratic potential of design participation. Despite important impacts on design methodology participatory design has however been stuck in a marginal position as it has wrestled with what has...

  3. Optimization of transmission system design based on genetic algorithm

    Directory of Open Access Journals (Sweden)

    Xianbing Chen

    2016-05-01

    The transmission system is a crucial precision mechanism in twin-screw chemi-mechanical pulping equipment. The structure designed by traditional methods is not optimal, because such methods easily fall into local optima. To achieve the global optimum, this article applies the genetic algorithm, which has seen growing use in structural optimization in recent years. The volume of the transmission system is used as the objective function to optimize the structure obtained by the traditional method. Compared with the simulation results, the original structure is not optimal, and the optimized structure is more compact and more reasonable. Based on the optimized results, the transmission shafts in the transmission system are designed and checked, and the parameters of the twin screw are selected and calculated. The article provides an effective method for designing the structure of a transmission system.

  4. Optimization Methods From Theory to Design Scientific and Technological Aspects in Mechanics

    CERN Document Server

    Cavazzuti, Marco

    2013-01-01

    This book is about optimization techniques and is subdivided into two parts. In the first part a wide overview on optimization theory is presented. Optimization is presented as being composed of five topics, namely: design of experiment, response surface modeling, deterministic optimization, stochastic optimization, and robust engineering design. Each chapter, after presenting the main techniques for each part, draws application oriented conclusions including didactic examples. In the second part some applications are presented to guide the reader through the process of setting up a few optimization exercises, analyzing critically the choices which are made step by step, and showing how the different topics that constitute the optimization theory can be used jointly in an optimization process. The applications which are presented are mainly in the field of thermodynamics and fluid dynamics due to the author's background.

  5. Trajectory Optimization Design for Morphing Wing Missile

    Institute of Scientific and Technical Information of China (English)

    Ruisheng Sun; Chao Ming; Chuanjie Sun

    2015-01-01

    This paper presents a new particle swarm optimization (PSO) algorithm to optimize the trajectory of a morphing-wing missile so as to enlarge its maximum range. Equations of motion for the two-dimensional dynamics are derived by treating the missile as an ideal controllable mass point. An investigation of the aerodynamic characteristics of the morphing-wing missile with varying geometries is performed. After deducing the trajectory optimization model for maximizing range, a discretization method is put forward to transform the optimal control problem into a nonlinear programming problem. The optimal trajectory is solved by using the PSO algorithm and a penalty function method. The simulation results suggest that the morphing-wing missile has a larger range than the fixed-shape missile when launched at supersonic speed, while it shows no obvious range increase over the fixed-shape missile at subsonic speed.

  6. Integrated Reliability-Based Optimal Design of Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1987-01-01

    the reliability decreases with time it is often necessary to design an inspection and repair programme. For example the reliability of offshore steel structures decreases with time due to corrosion and development of fatigue cracks. Until now most inspection and repair strategies are based on experience rather......In conventional optimal design of structural systems the weight or the initial cost of the structure is usually used as objective function. Further, the constraints require that the stresses and/or strains at some critical points have to be less than some given values. Finally, all variables...... and parameters are assumed to be deterministic quantities. In this paper a probabilistic formulation is used. Some of the quantities specifying the load and the strength of the structure are modelled as random variables, and the constraints specify that the reliability of the structure has to exceed some given...

  7. Eye tracking in user experience design

    CERN Document Server

    Romano Bergstorm, Jennifer

    2014-01-01

    Eye Tracking for User Experience Design explores the many applications of eye tracking to better understand how users view and interact with technology. Ten leading experts in eye tracking discuss how they have taken advantage of this new technology to understand, design, and evaluate user experience. Real-world stories are included from these experts who have used eye tracking during the design and development of products ranging from information websites to immersive games. They also explore recent advances in the technology which tracks how users interact with mobile devices, large-screen displays and video game consoles. Methods for combining eye tracking with other research techniques for a more holistic understanding of the user experience are discussed. This is an invaluable resource to those who want to learn how eye tracking can be used to better understand and design for their users. * Includes highly relevant examples and information for those who perform user research and design interactive experi...

  8. The Ethics of User Experience Design

    DEFF Research Database (Denmark)

    Vistisen, Peter; Jensen, Thessa

    actions to experience the system, and thus deal with the problem. The way these actions are related to the way the user is viewed by the designer, will in this article be discussed with the term empathy as its fulcrum. Empathy has been heralded as the primary skill for the user-centered designer to ensure......-centered design process. Exemplifying the differences and ethical implications for the designer in the interaction with the user through the design of interactive digital systems. Finally the article discusses the need to understand design as a development of empathy for a given user or group of users by giving...... a cased-based overview of how empathy can be achieved during the design process, and become the catalyst of a more ethical approach to designing the user experience of ICT....

  9. The Ethics of User Experience Design

    DEFF Research Database (Denmark)

    Vistisen, Peter; Jensen, Thessa

    2013-01-01

    actions to experience the system, and thus deal with the problem. The way these actions are related to the way the user is viewed by the designer, will in this article be discussed with the term empathy as its fulcrum. Empathy has been heralded as the primary skill for the user-centered designer to ensure......-centered design process. Exemplifying the differences and ethical implications for the designer in the interaction with the user through the design of interactive digital systems. Finally the article discusses the need to understand design as a development of empathy for a given user or group of users by giving...... a cased-based overview of how empathy can be achieved during the design process, and become the catalyst of a more ethical approach to designing the user experience of ICT....

  10. Optimal Design of a Center Support Quadruple Mass Gyroscope (CSQMG)

    Directory of Open Access Journals (Sweden)

    Tian Zhang

    2016-04-01

    This paper reports a more complete description of the design process of the Center Support Quadruple Mass Gyroscope (CSQMG), a gyro expected to provide breakthrough performance for flat structures. The operation of the CSQMG is based on four lumped masses in a circumferential symmetric distribution, oscillating in anti-phase motion, and providing differential signal extraction. With its 4-fold symmetrical axes pattern, the CSQMG achieves a similar operation mode to Hemispherical Resonant Gyroscopes (HRGs). Compared to the conventional flat design, four Y-shaped coupling beams are used in this new pattern in order to adjust mode distribution and enhance the synchronization mechanism of operation modes. For the purpose of obtaining the optimal design of the CSQMG, a kind of applicative optimization flow is developed with a comprehensive derivation of the operation mode coordination, the pseudo mode inhibition, and the lumped mass twisting motion elimination. The experimental characterization of the CSQMG was performed at room temperature, and the center operation frequency is 6.8 kHz after tuning. Experiments show an Allan variance stability 0.12°/h (@100 s) and a white noise level about 0.72°/h/√Hz, which means that the CSQMG possesses great potential to achieve navigation grade performance.
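
    The Allan variance stability and white-noise figures quoted above are conventionally read off an Allan deviation curve computed from the rate output; a generic sketch of that computation on simulated white noise (not the authors' processing chain) follows.

      import numpy as np

      def allan_deviation(rate, fs, taus):
          """Overlapping Allan deviation of a rate signal sampled at fs (Hz)."""
          theta = np.cumsum(rate) / fs                  # integrate rate to angle
          out = []
          for tau in taus:
              m = int(round(tau * fs))                  # cluster size in samples
              if m < 1 or 2 * m >= len(theta):
                  out.append(np.nan)
                  continue
              d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
              out.append(np.sqrt(0.5 * np.mean(d**2)) / tau)
          return np.array(out)

      fs = 100.0                                        # Hz
      rng = np.random.default_rng(0)
      rate = 0.02 * rng.standard_normal(int(3600 * fs)) # 1 h of white rate noise, deg/s
      taus = np.logspace(-1, 2, 20)                     # averaging times, s
      adev = allan_deviation(rate, fs, taus)            # slope -1/2 region = angle random walk
      print(np.c_[taus, adev][:5])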

  11. Bionic optimization research of soil cultivating component design

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The basic biomechanical laws that apply to the clawed toes of animals with powerful digging abilities, and the optimal bionic design of curved soil cultivating components with an analogous contour, were researched in a novel way. First, the curvature and profile of the inside contour line of a field mouse's clawed toe were analyzed. The finite element method (FEM) was then used to simulate the working process in order to study the changing characteristics of the working resistance of bionic soil-engaging surfaces and the stress field of the processed soil. A straight-line cultivating component was used for comparative analysis. In accordance with the simulation results, a series of soil cultivating components of varying design were manufactured. An indoor soil bin experiment was carried out to measure their working resistance and validate the results of the FEM analysis. The results of this research would have important value in the optimization design of cultivating components for energy and cost savings.

  12. OPTIMAL WAVELET FILTER DESIGN FOR REMOTE SENSING IMAGE COMPRESSION

    Institute of Scientific and Technical Information of China (English)

    Yang Guoan; Zheng Nanning; Guo Shugang

    2007-01-01

    A new approach for designing the Biorthogonal Wavelet Filter Bank (BWFB) for the purpose of image compression is presented in this letter. The approach is decomposed into two steps. First, an optimal filter bank is designed in a theoretical sense, based on Vaidyanathan's coding gain criterion for SubBand Coding (SBC) systems. Then the above filter bank is optimized based on the criterion of Peak Signal-to-Noise Ratio (PSNR) in the JPEG2000 image compression system, resulting in a BWFB in a practical application sense. With this approach, a series of BWFBs for a specific class of applications related to image compression, such as remote sensing images, can be designed quickly. Here, new 5/3 and 9/7 BWFBs are presented based on the above approach for remote sensing image compression applications. Experiments show that the two filter banks perform on par with the CDF 9/7 and LT 5/3 filters in the JPEG2000 standard; at the same time, the coefficients and the lifting parameters of the lifting scheme are all rational, which brings computational advantages and ease of VLSI implementation.
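
    The peak signal-to-noise ratio criterion used to tune the filter bank is computed in the usual way; a generic sketch, independent of any wavelet code, is given below.

      import numpy as np

      def psnr(original, reconstructed, peak=255.0):
          """Peak signal-to-noise ratio in dB between an original image and its
          reconstruction after compression."""
          err = np.asarray(original, dtype=float) - np.asarray(reconstructed, dtype=float)
          mse = np.mean(err**2)
          if mse == 0:
              return float("inf")
          return 10.0 * np.log10(peak**2 / mse)

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=(64, 64))                        # stand-in image
      recon = np.clip(img + rng.normal(0, 2, size=img.shape), 0, 255)  # stand-in reconstruction
      print(f"PSNR = {psnr(img, recon):.1f} dB")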

  13. Optimal Design of a Center Support Quadruple Mass Gyroscope (CSQMG).

    Science.gov (United States)

    Zhang, Tian; Zhou, Bin; Yin, Peng; Chen, Zhiyong; Zhang, Rong

    2016-01-01

    This paper reports a more complete description of the design process of the Center Support Quadruple Mass Gyroscope (CSQMG), a gyro expected to provide breakthrough performance for flat structures. The operation of the CSQMG is based on four lumped masses in a circumferential symmetric distribution, oscillating in anti-phase motion, and providing differential signal extraction. With its 4-fold symmetrical axes pattern, the CSQMG achieves a similar operation mode to Hemispherical Resonant Gyroscopes (HRGs). Compared to the conventional flat design, four Y-shaped coupling beams are used in this new pattern in order to adjust mode distribution and enhance the synchronization mechanism of operation modes. For the purpose of obtaining the optimal design of the CSQMG, a kind of applicative optimization flow is developed with a comprehensive derivation of the operation mode coordination, the pseudo mode inhibition, and the lumped mass twisting motion elimination. The experimental characterization of the CSQMG was performed at room temperature, and the center operation frequency is 6.8 kHz after tuning. Experiments show an Allan variance stability 0.12°/h (@100 s) and a white noise level about 0.72°/h/√Hz, which means that the CSQMG possesses great potential to achieve navigation grade performance.

  14. On CAD-integrated Structural Topology and Design Optimization

    DEFF Research Database (Denmark)

    Olhoff, Niels; Bendsøe, M.P.; Rasmussen, John

    1991-01-01

    Concepts underlying an interactive CAD-based engineering design optimization system are developed, and methods of optimizing the topology, shape and sizing of mechanical components are presented. These methods are integrated in the system, and the method for determining the optimal topology is used...

  16. Spaceflight payload design flight experience G-408

    Science.gov (United States)

    Durgin, William W.; Looft, Fred J.; Sacco, Albert, Jr.; Thompson, Robert; Dixon, Anthony G.; Roberti, Dino; Labonte, Robert; Moschini, Larry

    1992-01-01

    Worcester Polytechnic Institute's first payload of spaceflight experiments flew aboard Columbia, STS-40, during June of 1991 and culminated eight years of work by students and faculty. The Get Away Special (GAS) payload was installed on the GAS bridge assembly at the aft end of the cargo bay behind the Spacelab Life Sciences (SLS-1) laboratory. The Experiments were turned on by astronaut signal after reaching orbit and then functioned for 72 hours. Environmental and experimental measurements were recorded on three cassette tapes which, together with zeolite crystals grown on orbit, formed the basis of subsequent analyses. The experiments were developed over a number of years by undergraduate students meeting their project requirements for graduation. The experiments included zeolite crystal growth, fluid behavior, and microgravity acceleration measurement in addition to environmental data acquisition. Preparation also included structural design, thermal design, payload integration, and experiment control. All of the experiments functioned on orbit and the payload system performed within design estimates.

  17. An Optimal Design Model for New Water Distribution Networks in ...

    African Journals Online (AJOL)

    An Optimal Design Model for New Water Distribution Networks in Kigali City. ... a Linear Programming Problem (LPP) which involves the design of a new network of water distribution considering the cost in the form of unit price ...

  18. AERODYNAMIC OPTIMIZATION DESIGN OF LOW ASPECT RATIO TRANSONIC TURBINE STAGE

    Institute of Scientific and Technical Information of China (English)

    SONG Liming; LI Jun; FENG Zhenping

    2006-01-01

    An advanced optimization method named adaptive range differential evolution (ARDE) is developed. The optimization performance of ARDE is demonstrated on a typical mathematical test and compared with the standard genetic algorithm and differential evolution. Combining parallel ARDE, a surface modeling method and a Navier-Stokes solution, a new automatic aerodynamic optimization method is presented. A low-aspect-ratio transonic turbine stage is optimized for maximum isentropic efficiency with forty-one design variables in total. A coarse-grained parallel strategy is applied to accelerate the design process using 15 CPUs. The isentropic efficiency of the optimum design is 1.6% higher than that of the reference design. The aerodynamic performance of the optimal design is much better than that of the reference design.
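
    Since the record benchmarks ARDE against standard differential evolution, a bare-bones DE/rand/1/bin minimizer is sketched below for orientation; it is generic Python with common default F and CR values, not the parallel ARDE implementation, and the Rosenbrock function stands in for an expensive aerodynamic objective.

      import numpy as np

      def differential_evolution(objective, bounds, pop_size=40, gens=300,
                                 F=0.8, CR=0.9, seed=0):
          """Bare-bones DE/rand/1/bin minimizer."""
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(bounds, dtype=float).T
          dim = len(lo)
          pop = rng.uniform(lo, hi, (pop_size, dim))
          fit = np.array([objective(p) for p in pop])
          for _ in range(gens):
              for i in range(pop_size):
                  idx = rng.choice([j for j in range(pop_size) if j != i],
                                   size=3, replace=False)
                  a, b, c = pop[idx]
                  mutant = np.clip(a + F * (b - c), lo, hi)      # DE/rand/1 mutation
                  cross = rng.random(dim) < CR                   # binomial crossover
                  cross[rng.integers(dim)] = True                # keep at least one mutant gene
                  trial = np.where(cross, mutant, pop[i])
                  f_trial = objective(trial)
                  if f_trial <= fit[i]:                          # greedy selection
                      pop[i], fit[i] = trial, f_trial
          best = fit.argmin()
          return pop[best], fit[best]

      # Rosenbrock in 2-D as a stand-in for an expensive aerodynamic objective
      rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
      print(differential_evolution(rosen, bounds=[(-2, 2), (-2, 2)]))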

  19. Design and Optimization of a Turbine Intake Structure

    Directory of Open Access Journals (Sweden)

    P. Fošumpaur

    2005-01-01

    The appropriate design of the turbine intake structure of a hydropower plant is based on assumptions about its suitable function, and a good design increases the total efficiency of operation. This paper deals with the optimal design of the turbine intake structure of run-of-river hydropower plants. The study focuses mainly on optimization of the hydropower plant location with respect to the original river banks, and on the optimal design of a separating pier between the weir and the power plant. The optimal design of the turbine intake was determined with the use of 2-D mathematical modelling. A case study is performed for the optimal design of a turbine intake structure on the Nemen river in Belarus.

  20. Aeroelastic multidisciplinary design optimization of a swept wind turbine blade

    DEFF Research Database (Denmark)

    Pavese, Christian; Tibaldi, Carlo; Zahle, Frederik

    2017-01-01

    Mitigating loads on a wind turbine rotor can reduce the cost of energy. Sweeping blades produces a structural coupling between flapwise bending and torsion, which can be used for load alleviation purposes. A multidisciplinary design optimization (MDO) problem is formulated including the blade sweep...... against time-domain full design load basis aeroelastic simulations to ensure that they comply with the constraints. A 10-MW wind turbine blade is optimized by minimizing a cost function that includes mass and blade root flapwise fatigue loading. The design space is subjected to constraints that represent...... this achievement, a set of optimized straight blade designs is compared to a set of optimized swept blade designs. Relative to the respective optimized straight designs, the blade mass of the swept blades is reduced of an extra 2% to 3% and the blade root flapwise fatigue damage equivalent load by a further 8%....