WorldWideScience

Sample records for statistically-designed mixture experiment

  1. Statistically designed experiments to screen chemical mixtures for possible interactions

    NARCIS (Netherlands)

    Groten, J.P.; Tajima, O.; Feron, V.J.; Schoen, E.D.

    1998-01-01

    For the accurate analysis of possible interactive effects of chemicals in a defined mixture, statistical designs are necessary to develop clear and manageable experiments. For instance, factorial designs have been successfully used to detect two-factor interactions. Particularly useful for this

  2. A Statistical Approach to Optimizing Concrete Mixture Design

    OpenAIRE

    Ahmad, Shamsad; Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicate...

  3. A statistical approach to optimizing concrete mixture design.

    Science.gov (United States)

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.
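
    As a worked illustration of the 3³ layout described in this record, the sketch below builds the 27-run factorial and the design matrix for a full quadratic polynomial regression. This is a minimal Python sketch, not code from the paper; the factor levels are the ones quoted in the abstract, and the response vector is left as a hypothetical placeholder.

      import itertools
      import numpy as np

      # Factor levels quoted in the abstract.
      wcm = [0.38, 0.43, 0.48]    # water/cementitious materials ratio
      cmc = [350, 375, 400]       # cementitious materials content, kg/m^3
      fta = [0.35, 0.40, 0.45]    # fine/total aggregate ratio

      runs = np.array(list(itertools.product(wcm, cmc, fta)))  # 27 mixtures

      def quad_terms(x1, x2, x3):
          # Full quadratic polynomial: intercept, linear, interaction, square terms.
          return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

      X = np.array([quad_terms(*r) for r in runs])
      print(X.shape)  # (27, 10): 27 runs, 10 model coefficients

      # With measured mean strengths y (hypothetical placeholder), the fit would be:
      # beta, *_ = np.linalg.lstsq(X, y, rcond=None)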

  4. A Statistical Approach to Optimizing Concrete Mixture Design

    Directory of Open Access Journals (Sweden)

    Shamsad Ahmad

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.

  5. Experiments with Mixtures: Designs, Models, and the Analysis of Mixture Data

    CERN Document Server

    Cornell, John A

    2011-01-01

    The most comprehensive, single-volume guide to conducting experiments with mixtures. "If one is involved, or heavily interested, in experiments on mixtures of ingredients, one must obtain this book. It is, as was the first edition, the definitive work." -Short Book Reviews (Publication of the International Statistical Institute). "The text contains many examples with worked solutions and with its extensive coverage of the subject matter will prove invaluable to those in the industrial and educational sectors whose work involves the design and analysis of mixture experiments." -Journal of the Royal Statistical Society

  6. Construction of a 21-Component Layered Mixture Experiment Design

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Jones, Bradley

    2004-01-01

    This paper describes the solution to a unique and challenging mixture experiment design problem involving: (1) 19 and 21 components for two different parts of the design, (2) many single-component and multi-component constraints, (3) augmentation of existing data, (4) a layered design developed in stages, and (5) a no-candidate-point optimal design approach. The problem involved studying the liquidus temperature of spinel crystals as a function of nuclear waste glass composition. The statistical objective was to develop an experimental design by augmenting existing glasses with new nonradioactive and radioactive glasses chosen to cover the designated nonradioactive and radioactive experimental regions. The existing 144 glasses were expressed as 19-component nonradioactive compositions and then augmented with 40 new nonradioactive glasses. These included 8 glasses on the outer layer of the region, 27 glasses on an inner layer, 2 replicate glasses at the centroid, and one replicate each of three existing glasses. Then, the 144 + 40 = 184 glasses were expressed as 21-component radioactive compositions and augmented with 5 radioactive glasses. A D-optimal design algorithm was used to select the new outer layer, inner layer, and radioactive glasses. Several statistical software packages can generate D-optimal experimental designs, but nearly all require a set of candidate points (e.g., vertices) from which to select design points. The large number of components (19 or 21) and many constraints made it impossible to generate the huge number of vertices and other typical candidate points. JMP® was used to select design points without candidate points. JMP uses a coordinate-exchange algorithm modified for mixture experiments, which is discussed in the paper.

  7. Statistical experimental design for saltstone mixtures

    International Nuclear Information System (INIS)

    Harris, S.P.; Postles, R.L.

    1992-01-01

    The authors used a mixture experimental design for determining a window of operability for a process at the U.S. Department of Energy, Savannah River Site, Defense Waste Processing Facility (DWPF). The high-level radioactive waste at the Savannah River Site is stored in large underground carbon steel tanks. The waste consists of a supernate layer and a sludge layer. Cesium-137 will be removed from the supernate by precipitation and filtration. After further processing, the supernate layer will be fixed as a grout for disposal in concrete vaults. The remaining precipitate will be processed at the DWPF with treated waste tank sludge and glass-making chemicals into borosilicate glass. The leach-rate properties of the supernate grout, formed from various mixes of solidified salt waste, needed to be determined. The effective diffusion coefficients for NO₃ and chromium were used as a measure of leach rate. Various mixes of cement, Ca(OH)₂, salt, slag, and fly ash were used. These constituents comprise the whole mix. Thus, a mixture experimental design was used. The regression procedure (PROC REG) in SAS was used to produce analysis of variance (ANOVA) statistics. In addition, detailed model diagnostics are readily available for identifying suspicious observations. For convenience, trilinear contour (TLC) plots, a standard graphics tool for examining mixture response surfaces, of the fitted model were produced using ECHIP.

  8. Construction of a 21-Component Layered Mixture Experiment Design Using a New Mixture Coordinate-Exchange Algorithm

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Jones, Bradley

    2005-01-01

    This paper describes the solution to a unique and challenging mixture experiment design problem involving: (1) 19 and 21 components for two different parts of the design, (2) many single-component and multi-component constraints, (3) augmentation of existing data, (4) a layered design developed in stages, and (5) a no-candidate-point optimal design approach. The problem involved studying the liquidus temperature of spinel crystals as a function of nuclear waste glass composition. The statistical objective was to develop an experimental design by augmenting existing glasses with new nonradioactive and radioactive glasses chosen to cover the designated nonradioactive and radioactive experimental regions. The existing 144 glasses were expressed as 19-component nonradioactive compositions and then augmented with 40 new nonradioactive glasses. These included 8 glasses on the outer layer of the region, 27 glasses on an inner layer, 2 replicate glasses at the centroid, and one replicate each of three existing glasses. Then, the 144 + 40 = 184 glasses were expressed as 21-component radioactive compositions, and augmented with 5 radioactive glasses. A D-optimal design algorithm was used to select the new outer layer, inner layer, and radioactive glasses. Several statistical software packages can generate D-optimal experimental designs, but nearly all of them require a set of candidate points (e.g., vertices) from which to select design points. The large number of components (19 or 21) and many constraints made it impossible to generate the huge number of vertices and other typical candidate points. JMP was used to select design points without candidate points. JMP uses a coordinate-exchange algorithm modified for mixture experiments, which is discussed and illustrated in the paper.
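
    The coordinate-exchange idea can be sketched compactly. The toy Python below is an assumption-laden illustration, not JMP's actual algorithm: it improves the D-criterion for a first-order Scheffé mixture model by exchanging one proportion at a time and rescaling the rest of the row so it still sums to one, which is what lets the method avoid enumerating candidate vertices.

      import numpy as np

      rng = np.random.default_rng(1)

      def d_crit(X):
          # D-criterion for a first-order Scheffe model (model matrix = proportions).
          return np.linalg.det(X.T @ X)

      def coordinate_exchange(n_runs, q, grid=np.linspace(0, 1, 21), sweeps=20):
          X = rng.dirichlet(np.ones(q), size=n_runs)  # random start on the simplex
          best = d_crit(X)
          for _ in range(sweeps):
              improved = False
              for i in range(n_runs):          # each run
                  for j in range(q):           # each coordinate
                      for v in grid:           # each trial value
                          trial = X.copy()
                          trial[i, j] = v
                          # Rescale the other components so the row sums to one.
                          rest = np.arange(q) != j
                          s = X[i, rest].sum()
                          trial[i, rest] = (X[i, rest] / s * (1 - v) if s > 0
                                            else (1 - v) / (q - 1))
                          if d_crit(trial) > best:
                              X, best = trial, d_crit(trial)
                              improved = True
              if not improved:
                  break
          return X, best

      design, crit = coordinate_exchange(n_runs=8, q=3)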

  9. Introduction to Statistically Designed Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced and finally a case study will be presented to demonstrate this methodology.
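
    To make the factorial ideas mentioned here concrete, the following minimal Python sketch (an illustration, not taken from the presentation) builds a 2^3 full factorial in coded units and derives a 2^(4-1) half fraction with the generator D = ABC, the standard way a fractional factorial halves the number of trials.

      import itertools
      import numpy as np

      # 2^3 full factorial for factors A, B, C in coded -1/+1 units.
      base = np.array(list(itertools.product([-1, 1], repeat=3)))

      # Generator D = ABC yields a 2^(4-1) half fraction: 4 factors in 8 runs
      # instead of 16, at the cost of aliasing some interactions.
      D = base[:, 0] * base[:, 1] * base[:, 2]
      design = np.column_stack([base, D])
      print(design)   # each row is one run: levels of A, B, C, D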

  10. Mixture design: A review of recent applications in the food industry

    OpenAIRE

    Yeliz Buruk Şahin; Ezgi Aktar Demirtaş; Nimetullah Burnak

    2016-01-01

    Design of experiments (DOE) is a systematic approach to applying statistical methods to the experimental process. The main purpose of this study is to provide useful insights into mixture design as a special type of DOE and to present a review of current mixture design applications in the food industry. The theoretical principles of mixture design and its application in the food industry, based on an extensive review of the literature, are described. Mixture design types, such as simplex-latt...

  11. Statistical experimental design for saltstone mixtures

    International Nuclear Information System (INIS)

    Harris, S.P.; Postles, R.L.

    1991-01-01

    We used a mixture experimental design for determining a window of operability for a process at the Savannah River Site Defense Waste Processing Facility (DWPF). The high-level radioactive waste at the Savannah River Site is stored in large underground carbon steel tanks. The waste consists of a supernate layer and a sludge layer. ¹³⁷Cs will be removed from the supernate by precipitation and filtration. After further processing, the supernate layer will be fixed as a grout for disposal in concrete vaults. The remaining precipitate will be processed at the DWPF with treated waste tank sludge and glass-making chemicals into borosilicate glass. The leach rate properties of the supernate grout, formed from various mixes of solidified salt waste, needed to be determined. The effective diffusion coefficients for NO₃ and Cr were used as a measure of leach rate. Various mixes of cement, Ca(OH)₂, salt, slag and fly ash were used. These constituents comprise the whole mix. Thus, a mixture experimental design was used.

  12. I-optimal mixture designs

    OpenAIRE

    GOOS, Peter; JONES, Bradley; SYAFITRI, Utami

    2013-01-01

    In mixture experiments, the factors under study are proportions of the ingredients of a mixture. The special nature of the factors in a mixture experiment necessitates specific types of regression models, and specific types of experimental designs. Although mixture experiments usually are intended to predict the response(s) for all possible formulations of the mixture and to identify optimal proportions for each of the ingredients, little research has been done concerning their I-optimal desi...

  13. Statistical mixture design selective extraction of compounds with antioxidant activity and total polyphenol content from Trichilia catigua.

    Science.gov (United States)

    Lonni, Audrey Alesandra Stinghen Garcia; Longhini, Renata; Lopes, Gisely Cristiny; de Mello, João Carlos Palazzo; Scarminio, Ieda Spacino

    2012-03-16

    Statistical design mixtures of water, methanol, acetone and ethanol were used to extract material from Trichilia catigua (Meliaceae) barks to study the effects of different solvents and their mixtures on its yield, total polyphenol content and antioxidant activity. The experimental results and their response surface models showed that quaternary mixtures with approximately equal proportions of all four solvents provided the highest yields, total polyphenol contents and antioxidant activities of the crude extracts, followed by ternary design mixtures. Principal component and hierarchical clustering analysis of the HPLC-DAD spectra of the chromatographic peaks of 1:1:1:1 water-methanol-acetone-ethanol mixture extracts indicate the presence of cinchonains, gallic acid derivatives, natural polyphenols, flavonoids, catechins, and epicatechins. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the

  15. D- and I-optimal design of mixture experiments in the presence of ingredient availability constraints

    OpenAIRE

    SYAFITRI, Utami; SARTONO, Bagus; GOOS, Peter

    2015-01-01

    Mixture experiments usually involve various constraints on the proportions of the ingredients of the mixture under study. In this paper, inspired by the fact that the available stock of certain ingredients is often limited, we focus on a new type of constraint, which we refer to as an ingredient availability constraint. This type of constraint substantially complicates the search for optimal designs for mixture experiments. One difficulty, for instance, is that the optimal number of experimen...

  16. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    Science.gov (United States)

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution processed electrochromic (EC) films is outlined as a combination of an experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiment (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated in order to test and trial this optically active material system together with a solid-state electrolyte for the prospective application in EC displays. Coupling of DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.

  17. Optimal mixture experiments

    CERN Document Server

    Sinha, B K; Pal, Manisha; Das, P

    2014-01-01

    The book dwells mainly on the optimality aspects of mixture designs. As mixture models are a special case of regression models, a general discussion on regression designs has been presented, which includes topics like continuous designs, de la Garza phenomenon, Loewner order domination, Equivalence theorems for different optimality criteria and standard optimality results for single variable polynomial regression and multivariate linear and quadratic regression models. This is followed by a review of the available literature on estimation of parameters in mixture models. Based on recent research findings, the volume also introduces optimal mixture designs for estimation of optimum mixing proportions in different mixture models, which include Scheffé’s quadratic model, Darroch-Waller model, log-contrast model, mixture-amount models, random coefficient models and multi-response model.  Robust mixture designs and mixture designs in blocks have been also reviewed. Moreover, some applications of mixture desig...

  18. Organic biowastes blend selection for composting industrial eggshell by-product: experimental and statistical mixture design.

    Science.gov (United States)

    Soares, Micaela A R; Andrade, Sandra R; Martins, Rui C; Quina, Margarida J; Quinta-Ferreira, Rosa M

    2012-01-01

    Composting is one of the technologies recommended for pre-treating industrial eggshells (ES) before its application in soils, for calcium recycling. However, due to the high inorganic content of ES, a mixture of biodegradable materials is required to assure a successful procedure. In this study, an adequate organic blend composition containing potato peel (PP), grass clippings (GC) and wheat straw (WS) was determined by applying the simplex-centroid mixture design method to achieve a desired moisture content, carbon:nitrogen ratio and free air space for effective composting of ES. A blend of 56% PP, 37% GC and 7% WS was selected and tested in a self-heating reactor, where 10% (w/w) of ES was incorporated. After 29 days of reactor operation, a dry matter reduction of 46% was achieved and thermophilic temperatures were maintained for 15 days, indicating that the blend selected by the statistical approach was adequate for composting of ES.

  19. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  20. Mixture-mixture design for the fingerprint optimization of chromatographic mobile phases and extraction solutions for Camellia sinensis.

    Science.gov (United States)

    Borges, Cleber N; Bruns, Roy E; Almeida, Aline A; Scarminio, Ieda S

    2007-07-09

    A composite simplex centroid-simplex centroid mixture design is proposed for simultaneously optimizing two mixture systems. The complementary model is formed by multiplying special cubic models for the two systems. The design was applied to the simultaneous optimization of both mobile phase chromatographic mixtures and extraction mixtures for the Camellia sinensis Chinese tea plant. The extraction mixtures investigated contained varying proportions of ethyl acetate, ethanol and dichloromethane while the mobile phase was made up of varying proportions of methanol, acetonitrile and a methanol-acetonitrile-water (MAW) 15%:15%:70% mixture. The experiments were block randomized corresponding to a split-plot error structure to minimize laboratory work and reduce environmental impact. Coefficients of an initial saturated model were obtained using Scheffe-type equations. A cumulative probability graph was used to determine an approximate reduced model. The split-plot error structure was then introduced into the reduced model by applying generalized least square equations with variance components calculated using the restricted maximum likelihood approach. A model was developed to calculate the number of peaks observed with the chromatographic detector at 210 nm. A 20-term model contained essentially all the statistical information of the initial model and had a root mean square calibration error of 1.38. The model was used to predict the number of peaks eluted in chromatograms obtained from extraction solutions that correspond to axial points of the simplex centroid design. The significant model coefficients are interpreted in terms of interacting linear, quadratic and cubic effects of the mobile phase and extraction solution components.
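
    For readers unfamiliar with the simplex-centroid building block used twice in this composite design, the short Python sketch below (illustrative only, not the authors' code) enumerates its points: the centroid of every non-empty subset of the q components.

      from itertools import combinations

      def simplex_centroid(q):
          # All 2^q - 1 design points: pure blends, binary 50:50 blends, ...,
          # up to the overall centroid (1/q, ..., 1/q).
          points = []
          for k in range(1, q + 1):
              for subset in combinations(range(q), k):
                  x = [0.0] * q
                  for i in subset:
                      x[i] = 1.0 / k
                  points.append(tuple(x))
          return points

      for p in simplex_centroid(3):
          print(p)   # 7 blends for a 3-component system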

  1. Statistical analysis of joint toxicity in biological growth experiments

    DEFF Research Database (Denmark)

    Spliid, Henrik; Tørslev, J.

    1994-01-01

    The authors formulate a model for the analysis of designed biological growth experiments where a mixture of toxicants is applied to biological target organisms. The purpose of such experiments is to assess the toxicity of the mixture in comparison with the toxicity observed when the toxicants are applied separately. The model is applied on data from an experiment where inhibition of the growth of the bacteria Pseudomonas fluorescens caused by different mixtures of pentachlorophenol and aniline was studied.

  2. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...

  3. Quantum statistics and liquid helium 3 - helium 4 mixtures

    International Nuclear Information System (INIS)

    Cohen, E.G.D.

    1979-01-01

    The behaviour of liquid helium 3-helium 4 mixtures is considered from the point of view of the manifestation of quantum statistics effects in macrophysics. Bose-Einstein statistics is shown to be of great importance for understanding superfluid helium-4 properties, whereas Fermi-Dirac statistics is of importance for understanding helium-3 properties. Without taking into consideration the interaction between the helium atoms, it is impossible to understand the basic properties of liquid helium 3-helium 4 mixtures at constant pressure. A simple model of the liquid helium 3-helium 4 mixture is proposed, namely a binary mixture consisting of hard spheres of two types obeying Fermi-Dirac and Bose-Einstein statistics, respectively. This model correctly predicts the most surprising peculiarities of the concentration-temperature phase diagrams of helium solutions. In particular, the Bose-Einstein statistics of helium-4 is responsible for the phase separation of helium solutions at low temperatures, which begins at a peculiar critical point. The Fermi-Dirac statistics of helium-3 results in incomplete phase separation close to absolute zero, which permits the operation of a powerful cooling facility, namely the dilution refrigerator operating on helium solutions.

  4. Portfolio optimization using Mixture Design of Experiments. Scheduling trades within electricity markets

    International Nuclear Information System (INIS)

    Oliveira, Francisco Alexandre de; Paiva, Anderson Paulo de; Lima, Jose Wanderley Marangon; Balestrassi, Pedro Paulo; Mendes, Rona Rinston Amaury

    2011-01-01

    Deregulation of the electricity sector has given rise to several approaches to defining optimal portfolios of energy contracts. Financial tools - requiring substantial adjustments - are usually used to determine risk and return. This article presents a novel approach to adjusting the conditional value at risk (CVaR) metric to the mix of contracts on the energy markets; the approach uses Mixture Design of Experiments (MDE). In this kind of experimental strategy, the design factors are treated as proportions in a mixture system considered quite adequate for treating portfolios in general. Instead of using traditional linear programming, the concept of desirability function is here used to combine the multi-response, nonlinear objective functions for mean with the variance of a specific portfolio obtained through MDE. The maximization of the desirability function is implied in the portfolio optimization, generating an efficient frontier. This approach offers three main contributions: it includes risk aversion in the optimization routine, it assesses interaction between contracts, and it lessens the computational effort required to solve the constrained nonlinear optimization problem. A case study based on the Brazilian energy market is used to illustrate the proposal. The numerical results verify the proposal's adequacy. (author)

  5. Portfolio optimization using Mixture Design of Experiments. Scheduling trades within electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Francisco Alexandre de; Paiva, Anderson Paulo de; Lima, Jose Wanderley Marangon; Balestrassi, Pedro Paulo; Mendes, Rona Rinston Amaury [Federal Univ. of Itajuba, Minas Gerais (Brazil)

    2011-01-15

    Deregulation of the electricity sector has given rise to several approaches to defining optimal portfolios of energy contracts. Financial tools - requiring substantial adjustments - are usually used to determine risk and return. This article presents a novel approach to adjusting the conditional value at risk (CVaR) metric to the mix of contracts on the energy markets; the approach uses Mixture Design of Experiments (MDE). In this kind of experimental strategy, the design factors are treated as proportions in a mixture system considered quite adequate for treating portfolios in general. Instead of using traditional linear programming, the concept of desirability function is here used to combine the multi-response, nonlinear objective functions for mean with the variance of a specific portfolio obtained through MDE. The maximization of the desirability function is implied in the portfolio optimization, generating an efficient frontier. This approach offers three main contributions: it includes risk aversion in the optimization routine, it assesses interaction between contracts, and it lessens the computational effort required to solve the constrained nonlinear optimization problem. A case study based on the Brazilian energy market is used to illustrate the proposal. The numerical results verify the proposal's adequacy. (author)

  6. Design Method for Proportion of Cement-Foamed Asphalt Cold Recycled Mixture

    Directory of Open Access Journals (Sweden)

    Li Junxiao

    2018-01-01

    Through foaming experiments on Zhongtai AH-70 asphalt, the best foaming temperature, the water consumption, and the factors influencing the foaming features of foamed asphalt are determined. By designing the proportion of the foamed asphalt cold in-place recycled mixture in combination with a water stability experiment, the best foamed asphalt addition for this mixture is found to be 3%, with a mixture proportion of RAP : fine aggregate : cement = 75:23:2. Using SEM technology, the mechanism by which the addition of cement increases the strength of the foamed asphalt mixture is analysed. This research provides a reference for formulating the cement admixture in the design of foamed asphalt cold in-place recycled mixtures.

  7. Statistical aspects of quantitative real-time PCR experiment design.

    Science.gov (United States)

    Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales

    2010-04-01

    Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect; the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
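
    The calculation this record describes can be sketched generically. The Python below is not powerNest itself, just the usual normal-approximation power formula with the per-group variance assembled from pilot-estimated biological and technical components of a nested qPCR design; the variance numbers in the example are hypothetical.

      import math
      from scipy import stats

      def prospective_power(effect, var_bio, var_tech, n_bio, n_tech, alpha=0.05):
          # Variance of a group mean in a nested design: biological variance is
          # averaged over n_bio subjects, technical variance over all replicates.
          var_mean = var_bio / n_bio + var_tech / (n_bio * n_tech)
          se_diff = math.sqrt(2 * var_mean)      # two independent groups
          z = stats.norm.ppf(1 - alpha / 2)
          return stats.norm.cdf(effect / se_diff - z)

      # Hypothetical pilot estimates: effect of 1.0 Cq, var_bio 0.4, var_tech 0.1.
      print(prospective_power(1.0, 0.4, 0.1, n_bio=6, n_tech=2))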

  8. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information.

    Science.gov (United States)

    Perlin, Mark William

    2015-01-01

    DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative
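
    The law-of-large-numbers behaviour described here is easy to reproduce. The Python sketch below is a toy with hypothetical allele frequencies, not the study's data: it computes the textbook inclusion statistic, PI = (sum of included allele frequencies)² per locus multiplied across loci, and shows CPI⁻¹ growing geometrically with the number of loci regardless of the mixture itself.

      def locus_pi(freqs):
          # Chance that both alleles of a random person fall inside the set of
          # alleles observed in the mixture at this locus.
          return sum(freqs) ** 2

      def combined_pi(loci):
          cpi = 1.0
          for freqs in loci:
              cpi *= locus_pi(freqs)
          return cpi

      # Hypothetical case: the observed alleles cover 60% of the frequency mass
      # at each of 13 STR loci, so PI = 0.36 per locus.
      loci = [[0.35, 0.25]] * 13
      print(1 / combined_pi(loci))   # ~6e5, i.e. "around a million"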

  9. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    Science.gov (United States)

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.

  10. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    Science.gov (United States)

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    design. We highlight the substantial risk of planning the sample size for confirmatory trials when information is very uninformative and stipulate the advantages of adaptive statistical information designs for planning exploratory trials. Practical experiences and strategies as lessons learned from more recent adaptive design proposals will be discussed to pinpoint the improved utilities of adaptive design clinical trials and their potential to increase the chance of a successful drug development. Published 2012. This article is a US Government work and is in the public domain in the USA.

  11. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN

  12. Effect of Fibers on Mixture Design of Stone Matrix Asphalt

    Directory of Open Access Journals (Sweden)

    Yanping Sheng

    2017-03-01

    Lignin fibers typically influence the mixture performance of stone matrix asphalt (SMA), such as strength, stability, durability, noise level, rutting resistance, fatigue life, and water sensitivity. However, limited studies were conducted to analyze the influence of fibers on the percent voids in mineral aggregate (VMA) in bituminous mixtures during the mixture design. This study analyzed the effect of different fibers and fiber contents on the VMA in SMA mixture design. A surface-dry condition method test and Marshall Stability test were applied to the SMA mixture with four different fibers (i.e., flocculent lignin fiber, mineral fiber, polyester fiber, and blended fiber). The test results indicated that the bulk specific gravity of SMA mixtures and asphalt saturation decreased with increasing fiber content, whilst the percent air voids in bituminous mixtures (VV), Marshall Stability, and VMA increased. Mineral fiber had the most obvious impact on the bulk specific gravity of bituminous mixtures, while flocculent lignin fiber had a minimal impact. The mixtures with mineral fiber and polyester fiber had significant effects on the volumetric properties and, consequently, exhibited better VMA than the conventional SMA mixture with lignin fiber. A modified fiber content range was also provided, which will widen the utilization of mineral fiber and polyester fiber in applications of SMA mixtures. The mixture evaluation suggested no statistically significant difference between lignin fiber and polyester fiber on stability. The mineral fiber required a much larger fiber content than other fibers to improve the mixture performance. Overall, the results can be a reference to guide SMA mixture design.

  13. A Two-Stage Layered Mixture Experiment Design for a Nuclear Waste Glass Application-Part 2

    International Nuclear Information System (INIS)

    Cooley, Scott K.; Piepel, Gregory F.; Gan, Hao; Kot, Wing; Pegg, Ian L.

    2003-01-01

    Part 1 (Cooley and Piepel, 2003a) describes the first stage of a two-stage experimental design to support property-composition modeling for high-level waste (HLW) glass to be produced at the Hanford Site in Washington state. Each stage used a layered design having an outer layer, an inner layer, a center point, and some replicates. However, the design variables and constraints defining the layers of the experimental glass composition region (EGCR) were defined differently for the second stage than for the first. The first-stage initial design involved 15 components, all treated as mixture variables. The second-stage augmentation design involved 19 components, with 14 treated as mixture variables and 5 treated as non-mixture variables. For each second-stage layer, vertices were generated and optimal design software was used to select alternative subsets of vertices for the design and calculate design optimality measures. A model containing 29 partial quadratic mixture terms plus 5 linear terms for the non-mixture variables was the basis for the optimal design calculations. Predicted property values were plotted for the alternative subsets of second-stage vertices and the first-stage design points. Based on the optimality measures and the predicted property distributions, a "best" subset of vertices was selected for each layer of the second stage to augment the first-stage design.

  14. Statistical Description of Segregation in a Powder Mixture

    DEFF Research Database (Denmark)

    Chapiro, Alexander; Stenby, Erling Halfdan

    1996-01-01

    In this paper we apply the statistical mechanics of powders to describe a segregated state in a mixture of grains of different sizes. The variation of packing density with depth, arising from changes in particle configurations, is studied. The statistical mechanics of powders is generalized...

  15. Design Method for Proportion of Cement-Foamed Asphalt Cold Recycled Mixture

    OpenAIRE

    Li Junxiao; Fu Wei; Zang Hechao

    2018-01-01

    Through foaming experiments on Zhongtai AH-70 asphalt, the best foaming temperature, the water consumption, and the factors influencing the foaming features of foamed asphalt are determined. By designing the proportion of the foamed asphalt cold in-place recycled mixture in combination with a water stability experiment, the best foamed asphalt addition for this mixture is found to be 3%, with a mixture proportion of RAP : fine aggregate : cement = 75:23:2. Using SEM technology, the mechanism by which the addition of cement increases the strength of f...

  16. A new efficient mixture screening design for optimization of media.

    Science.gov (United States)

    Rispoli, Fred; Shah, Vishal

    2009-01-01

    Screening ingredients for the optimization of media is an important first step to reduce the many potential ingredients down to the vital few components. In this study, we propose a new method of screening for mixture experiments called the centroid screening design. Comparison of the proposed design with Plackett-Burman, fractional factorial, simplex lattice design, and modified mixture design shows that the centroid screening design is the most efficient of all the designs in terms of the small number of experimental runs needed and for detecting high-order interaction among ingredients. (c) 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2009.

  17. A new decomposition-based computer-aided molecular/mixture design methodology for the design of optimal solvents and solvent mixtures

    DEFF Research Database (Denmark)

    Karunanithi, A.T.; Achenie, L.E.K.; Gani, Rafiqul

    2005-01-01

    This paper presents a novel computer-aided molecular/mixture design (CAMD) methodology for the design of optimal solvents and solvent mixtures. The molecular/mixture design problem is formulated as a mixed integer nonlinear programming (MINLP) model in which a performance objective is to be optimized subject to structural, property, and process constraints. The general molecular/mixture design problem is divided into two parts. For optimal single-compound design, the first part is solved. For mixture design, the single-compound design is first carried out to identify candidates and then the second part is solved to determine the optimal mixture. The decomposition of the CAMD MINLP model into relatively easy to solve subproblems is essentially a partitioning of the constraints from the original set. This approach is illustrated through two case studies. The first case study involves...

  18. Component effects in mixture experiments

    International Nuclear Information System (INIS)

    Piepel, G.F.

    1980-01-01

    In a mixture experiment, the response to a mixture of q components is a function of the proportions x_1, x_2, ..., x_q of the components in the mixture. Experimental regions for mixture experiments are often defined by constraints on the proportions of the components forming the mixture. The usual (orthogonal direction) definition of a factor effect does not apply because of the dependence imposed by the mixture restriction, x_1 + x_2 + ... + x_q = 1. A direction within the experimental region in which to compute a mixture component effect is presented and compared to previously suggested directions. This new direction has none of the inadequacies or errors of previous suggestions while having a more meaningful interpretation. The distinction between partial and total effects is made. The uses of partial and total effects (computed using the new direction) in modification and interpretation of mixture response prediction equations are considered. The suggestions of the paper are illustrated in an example from a glass development study in a waste vitrification program. 5 figures, 3 tables
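
    A minimal sketch of the underlying issue, assuming a Cox-style proportional rescaling rather than the improved direction this paper itself proposes: because the proportions must sum to one, increasing one component forces the others to change, so a component "effect" is computed along a whole direction through the simplex rather than along a coordinate axis.

      import numpy as np

      def move_component(x, i, delta):
          # Increase component i by delta and rescale the remaining components
          # proportionally so the mixture still satisfies sum(x) == 1.
          x = np.asarray(x, dtype=float)
          new = x.copy()
          new[i] = x[i] + delta
          rest = np.arange(len(x)) != i
          new[rest] = x[rest] * (1 - new[i]) / x[rest].sum()
          return new

      x = np.array([0.5, 0.3, 0.2])
      print(move_component(x, 0, 0.1))   # [0.6, 0.24, 0.16], still sums to 1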

  19. Mixture Design and Its Application in Cement Solidification for Spent Resin

    International Nuclear Information System (INIS)

    Gan, Xueying; Lin, Meiqing; Chen, Hui

    1994-01-01

    This study aims to assess the usefulness of mixture design for spent resin immobilization in cement. Although a considerable amount of research has been carried out to determine the limits for the composition of an acceptable resin-cement mixture, no efficient experimental strategy exists that explores the full property-composition relationship of the waste form. In order to gain an overall view, this report introduces the methods of mixture design and mixture analysis, and describes the design of an experiment on the 5-component mixture under constraint conditions. Mathematical models of 28-day compressive strength as a function of the ingredients are fitted, and the main effects and two-ingredient interaction effects are identified quantitatively, along with a graphical interpretation using response trace plots and contour plots.

  20. Mixture experiment methods in the development and optimization of microemulsion formulations.

    Science.gov (United States)

    Furlanetto, S; Cirri, M; Piepel, G; Mennini, N; Mura, P

    2011-06-25

    Microemulsion formulations represent an interesting delivery vehicle for lipophilic drugs, allowing for improving their solubility and dissolution properties. This work developed effective microemulsion formulations using glyburide (a very poorly-water-soluble hypoglycaemic agent) as a model drug. First, the area of stable microemulsion (ME) formation was identified using a new approach based on mixture experiment methods. A 13-run mixture design was carried out in an experimental region defined by constraints on three components: aqueous, oil and surfactant/cosurfactant. The transmittance percentage (at 550 nm) of ME formulations (indicative of their transparency and thus of their stability) was chosen as the response variable. The results obtained using the mixture experiment approach corresponded well with those obtained using the traditional approach based on pseudo-ternary phase diagrams. However, the mixture experiment approach required far less experimental effort than the traditional approach. A subsequent 13-run mixture experiment, in the region of stable MEs, was then performed to identify the optimal formulation (i.e., having the best glyburide dissolution properties). Percent drug dissolved and dissolution efficiency were selected as the responses to be maximized. The ME formulation optimized via the mixture experiment approach consisted of 78% surfactant/cosurfactant (a mixture of Tween 20 and Transcutol, 1:1, v/v), 5% oil (Labrafac Hydro) and 17% aqueous phase (water). The stable region of MEs was identified using mixture experiment methods for the first time. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  2. Improvement on sugar cane bagasse hydrolysis using enzymatic mixture designed cocktail.

    Science.gov (United States)

    Bussamra, Bianca Consorti; Freitas, Sindelia; Costa, Aline Carvalho da

    2015-01-01

    The aim of this work was to study cocktail supplementation for sugar cane bagasse hydrolysis, where the enzymes came from both a commercial source and microorganism cultivation (Trichoderma reesei and genetically modified Escherichia coli), followed by purification. A simplex-lattice mixture experiment design was performed to optimize the enzyme proportions. The response was evaluated through hydrolysis microassays validated here. The optimized enzyme mixture, comprising a T. reesei fraction (80%), endoglucanase (10%) and β-glucosidase (10%), converted, theoretically, 72% of the cellulose present in hydrothermally pretreated bagasse, whereas commercial Celluclast 1.5L converts 49.11%±0.49. Thus, a rational enzyme mixture designed using the synergism concept and statistical analysis was capable of improving biomass saccharification. Copyright © 2015 Elsevier Ltd. All rights reserved.
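
    The simplex-lattice layout named in this record can be enumerated mechanically; the Python sketch below (illustrative, with arbitrary q and m rather than the study's actual cocktail) lists every blend whose proportions are multiples of 1/m and sum to one.

      from itertools import product

      def simplex_lattice(q, m):
          # {q, m} simplex-lattice: all q-component blends with proportions
          # in {0, 1/m, 2/m, ..., 1} that sum to 1.
          return [tuple(c / m for c in combo)
                  for combo in product(range(m + 1), repeat=q)
                  if sum(combo) == m]

      for blend in simplex_lattice(3, 2):
          print(blend)   # 6 blends: 3 pure components and 3 binary 50:50 mixes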

  3. Using mixture experiments to develop cementitious waste forms

    International Nuclear Information System (INIS)

    Spence, R.D.; Anderson, C.M.; Piepel, G.F.

    1993-01-01

    Mixture experiments are presented as a means to develop cementitious waste forms. The steps of a mixture experiment are (1) identifying the waste form ingredients; (2) determining the compositional constraints of these ingredients; (3) determining the extreme vertices, edge midpoints, and face centroids of the constrained multidimensional volume (these points along with some interior points represent the set of possible compositions for testing); (4) picking a subset of these points for the experimental design; (5) measuring the properties of the selected subset; and (6) generating the response surface models. The models provide a means for predicting the properties within the constrained region. This article presents an example of this process for one property: unconfined compressive strength

  4. Optimization and characterization of liposome formulation by mixture design.

    Science.gov (United States)

    Maherani, Behnoush; Arab-tehrany, Elmira; Kheirolomoom, Azadeh; Reshetov, Vadzim; Stebe, Marie José; Linder, Michel

    2012-02-07

    This study presents the application of the mixture design technique to develop an optimal liposome formulation by varying the type and percentage of lipids (DOPC, POPC and DPPC) in the liposome composition. Ten lipid mixtures were generated by the simplex-centroid design technique and liposomes were prepared by the extrusion method. Liposomes were characterized with respect to size, phase transition temperature, ζ-potential, lamellarity, fluidity and efficiency in loading calcein. The results were then applied to estimate the coefficients of the mixture design model and to find the optimal lipid composition with improved entrapment efficiency, size, transition temperature, fluidity and ζ-potential of liposomes. The optimized formulation was DOPC: 46%, POPC: 12% and DPPC: 42%. The optimal liposome formulation had an average diameter of 127.5 nm, a phase-transition temperature of 11.43 °C, a ζ-potential of -7.24 mV, a fluidity value (1/P, measured with the TMA-DPH probe) of 2.87 and an encapsulation efficiency of 20.24%. The experimental characterization results for the optimal liposome formulation were in good agreement with those predicted by the mixture design technique.

  5. New Flexible Models and Design Construction Algorithms for Mixtures and Binary Dependent Variables

    NARCIS (Netherlands)

    A. Ruseckaite (Aiste)

    2017-01-01

    This thesis discusses new mixture(-amount) models, choice models and the optimal design of experiments. Two chapters of the thesis relate to the so-called mixture, which is a product or service whose ingredients’ proportions sum to one. The thesis begins by introducing mixture

  6. Use of Mixture Designs to Investigate Contribution of Minor Sex Pheromone Components to Trap Catch of the Carpenterworm Moth, Chilecomadia valdiviana.

    Science.gov (United States)

    Lapointe, Stephen L; Barros-Parada, Wilson; Fuentes-Contreras, Eduardo; Herrera, Heidy; Kinsho, Takeshi; Miyake, Yuki; Niedz, Randall P; Bergmann, Jan

    2017-12-01

    Field experiments were carried out to study responses of male moths of the carpenterworm, Chilecomadia valdiviana (Lepidoptera: Cossidae), a pest of tree and fruit crops in Chile, to five compounds previously identified from the pheromone glands of females. Previously, attraction of males to the major component, (7Z,10Z)-7,10-hexadecadienal, was clearly demonstrated while the role of the minor components was uncertain due to the use of an experimental design that left large portions of the design space unexplored. We used mixture designs to study the potential contributions to trap catch of the four minor pheromone components produced by C. valdiviana. After systematically exploring the design space described by the five pheromone components, we concluded that the major pheromone component alone is responsible for attraction of male moths in this species. The need for appropriate experimental designs to address the problem of assessing responses to mixtures of semiochemicals in chemical ecology is described. We present an analysis of mixture designs and response surface modeling and an explanation of why this approach is superior to commonly used, but statistically inappropriate, designs.

  7. A Two-Stage Layered Mixture Experiment Design for a Nuclear Waste Glass Application-Part 1

    International Nuclear Information System (INIS)

    Cooley, Scott K.; Piepel, Gregory F.; Gan, Hao; Kot, Wing; Pegg, Ian L.

    2003-01-01

    A layered experimental design involving mixture variables was generated to support developing property-composition models for high-level waste (HLW) glasses. The design was generated in two stages, each having unique characteristics. Each stage used a layered design having an outer layer, an inner layer, a center point, and some replicates. The layers were defined by single- and multi-variable constraints. The first stage involved 15 glass components treated as mixture variables. For each layer, vertices were generated and optimal design software was used to select alternative subsets of vertices and calculate design optimality measures. Two partial quadratic mixture models, containing 25 terms for the outer layer and 30 terms for the inner layer, were the basis for the optimal design calculations. Distributions of predicted glass property values were plotted and evaluated for the alternative subsets of vertices. Based on the optimality measures and the predicted property distributions, a "best" subset of vertices was selected for each layer to form a layered design for the first stage. The design for the second stage was selected to augment the first-stage design. The discussion of the second-stage design begins in this Part 1 and is continued in Part 2 (Cooley and Piepel, 2003b).

  8. Stiffness modulus of Polyethylene Terephthalate modified asphalt mixture: A statistical analysis of the laboratory testing results

    International Nuclear Information System (INIS)

    Baghaee Moghaddam, Taher; Soltani, Mehrtash; Karim, Mohamed Rehan

    2015-01-01

    Highlights: • Effect of PET modification on stiffness property of asphalt mixture was examined. • Different temperatures and loading amounts were designated. • Statistical analysis was used to find interactions between selected variables. • A good agreement between experimental results and predicted values was obtained. • Optimal amount of PET was calculated to achieve the highest mixture performance. - Abstract: Stiffness of asphalt mixture is a fundamental design parameter of flexible pavement. According to the literature, the stiffness value is very susceptible to environmental and loading conditions. In this paper, effects of applied stress and temperature on the stiffness modulus of unmodified and Polyethylene Terephthalate (PET) modified asphalt mixtures were evaluated using Response Surface Methodology (RSM). A quadratic model was successfully fitted to the experimental data. Based on the results achieved in this study, the temperature variation had the highest impact on the mixture’s stiffness. Besides, PET content and amount of stress were shown to have almost the same effect on the stiffness of mixtures. The optimal amount of PET was found to be 0.41% by weight of aggregate particles to reach the highest stiffness value
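
    The quadratic response-surface fit reported above can be reproduced in form (though not in substance) with ordinary least squares. In the sketch below the factor ranges and response data are synthetic placeholders; the true coefficients are merely chosen so the fitted stationary point lands near the reported 0.41% PET.

    ```python
    # Sketch of fitting a full quadratic RSM model in three factors
    # (temperature, applied stress, PET content). All data are synthetic;
    # only the model form follows the abstract.
    import numpy as np

    def quad_terms(T, s, p):
        """Quadratic RSM basis: intercept, linear, two-way, and square terms."""
        return np.column_stack([np.ones_like(T), T, s, p,
                                T * s, T * p, s * p, T**2, s**2, p**2])

    rng = np.random.default_rng(0)
    T = rng.uniform(5, 40, 30)      # temperature, deg C (assumed range)
    s = rng.uniform(100, 400, 30)   # applied stress, kPa (assumed range)
    p = rng.uniform(0.0, 1.0, 30)   # PET content, % by weight of aggregate
    y = 3000 - 60*T - 2*s + 2000*p - 2400*p**2 + rng.normal(0, 50, 30)

    beta, *_ = np.linalg.lstsq(quad_terms(T, s, p), y, rcond=None)
    p_opt = -beta[3] / (2 * beta[9])   # stationary point in the PET direction
    print(f"fitted optimum PET content: {p_opt:.2f}%")
    ```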

  9. Statistical experimental design for refractory coatings

    International Nuclear Information System (INIS)

    McKinnon, J.A.; Standard, O.C.

    2000-01-01

    The production of refractory coatings on metal casting moulds is critically dependent on the development of suitable rheological characteristics, such as viscosity and thixotropy, in the initial coating slurry. In this paper, the basic concepts of mixture design and analysis are applied to the formulation of a refractory coating, with illustration by a worked example. Experimental data of coating viscosity versus composition are fitted to a statistical model to obtain a reliable method of predicting the optimal formulation of the coating. Copyright (2000) The Australian Ceramic Society

  10. Design of modern experiments

    International Nuclear Information System (INIS)

    Park, Sung Hweon

    1984-03-01

    This book, written for researchers and engineers, focuses on the practical design of experiments. It describes the concepts of design of experiments, basic statistical theory, one-way designs, two-way layouts without repetition, two-way layouts with repetition, partition, correlation and regression analysis, Latin squares, factorial designs, design of experiments using tables of orthogonal arrays, response surface designs, mixture (compound) experiment designs, EVOP, and Taguchi methods of experimental design.

  11. Laser induced breakdown in gas mixtures. Experimental and statistical investigation on n-decane ignition: Pressure, mixture composition and equivalence ratio effects.

    Science.gov (United States)

    Mokrani, Nabil; Gillard, Philippe

    2018-03-26

    This paper presents a physical and statistical approach to laser-induced breakdown in n-decane/N2 + O2 mixtures as a function of incident or absorbed energy. A parametric study, with pressure, fuel purity and equivalence ratio, was conducted to determine the incident and absorbed energies involved in producing breakdown, followed or not by ignition. The experiments were performed using a Q-switched Nd:YAG laser (1064 nm) inside a cylindrical 1-L combustion chamber in the range of 1-100 mJ of incident energy. A stochastic study of breakdown and ignition probabilities showed that the mixture composition had a significant effect on ignition, with large variation of the incident or absorbed energy required to obtain 50% breakdown. It was observed that the combustion products absorb more of the energy coming from the laser. The effect of pressure on the ignition probabilities of lean and near-stoichiometric mixtures was also investigated. It was found that a high ignition energy E50% is required for lean mixtures at high pressures (3 bar). The present study provides new data obtained on an original experimental setup, and the results, close to laboratory-produced laser ignition phenomena, will enhance the understanding of the effect of initial conditions on the breakdown or ignition probabilities for different mixtures. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Design of Experiments: Optimizing the Polycarboxylation/Functionalization of Tungsten Disulfide Nanotubes

    Directory of Open Access Journals (Sweden)

    Daniel Raichman

    2014-08-01

    Design of experiments (DOE) methodology was used to identify and optimize factors that influence the degree of functionalization (polycarboxylation) of WS2 INTs via a modified acidic Vilsmeier–Haack reagent. The six factors investigated were reaction time, temperature and the concentrations of 2-bromoacetic acid, WS2 INTs, silver acetate and DMF. The significance of each factor and the associated interactive effects were evaluated using a two-level factorial statistical design in conjunction with statistical software (MiniTab® 16) based on quadratic programming. Although statistical analysis indicated that no factors were statistically significant, time, temperature and concentration of silver acetate were found to be the most important contributors to obtaining maximum functionalization/carboxylation. By examining contour plots and interaction plots, it was determined that optimal functionalization is obtained in a temperature range of 115–120 °C with a reaction time of 54 h using a mixture of 6 mL DMF, 200 mg INTs, 800 mg 2-bromoacetic acid and 60 mg silver acetate.
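
    In a two-level factorial screen like the one described, each main effect is estimated as the difference between the mean response at the factor's high and low settings. A self-contained sketch with a coded full 2^6 design and a fabricated response follows; the factor names echo the abstract, the numbers do not.

    ```python
    # Illustrative two-level factorial screening of six factors; the coded
    # design is a full 2^6 and the response values are invented.
    import numpy as np
    from itertools import product

    factors = ["time", "temp", "BrAcOH", "INTs", "AgOAc", "DMF"]
    design = np.array(list(product([-1, 1], repeat=6)))   # 64 coded runs

    rng = np.random.default_rng(1)
    # Fake degree of functionalization with mild time/temp/AgOAc effects
    y = (50 + 3*design[:, 0] + 2*design[:, 1] + 1.5*design[:, 4]
         + rng.normal(0, 4, len(design)))

    # Main effect = mean(y at +1) - mean(y at -1), per factor column
    for j, name in enumerate(factors):
        effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
        print(f"{name:>6s} main effect: {effect:+.2f}")
    ```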

  13. Experimental toxicology: Issues of statistics, experimental design, and replication.

    Science.gov (United States)

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. SSSFD manipulator engineering using statistical experiment design techniques

    Science.gov (United States)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6 degree of freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost efficient tools were combined to perform a study of robot manipulator design parameters. These tools are graphical computer simulations and Taguchi Design of Experiment methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.
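
    Taguchi's method assigns factors to columns of an orthogonal array and summarizes each run with a signal-to-noise ratio. The sketch below shows the standard L8(2^7) array together with a larger-the-better S/N calculation; the two replicate dexterity scores per run are made up, and no specific factor-to-column assignment from the study is implied.

    ```python
    # Standard L8(2^7) orthogonal array and a larger-the-better Taguchi
    # signal-to-noise ratio. Scores are placeholders standing in for
    # simulated manipulator performance; only the method is illustrated.
    import numpy as np

    L8 = np.array([
        [1, 1, 1, 1, 1, 1, 1],
        [1, 1, 1, 2, 2, 2, 2],
        [1, 2, 2, 1, 1, 2, 2],
        [1, 2, 2, 2, 2, 1, 1],
        [2, 1, 2, 1, 2, 1, 2],
        [2, 1, 2, 2, 1, 2, 1],
        [2, 2, 1, 1, 2, 2, 1],
        [2, 2, 1, 2, 1, 1, 2],
    ])

    def sn_larger_is_better(y):
        """Taguchi S/N ratio (dB) when larger responses are better."""
        y = np.asarray(y, dtype=float)
        return -10 * np.log10(np.mean(1.0 / y**2))

    scores = [(7.1, 6.9), (6.4, 6.0), (8.0, 7.8), (5.9, 6.1),
              (7.7, 7.4), (6.2, 6.5), (8.3, 8.1), (6.8, 6.6)]
    for run, y in zip(L8, scores):
        print(run, f"S/N = {sn_larger_is_better(y):6.2f} dB")
    ```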

  15. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is...

  16. Mixture design procedure for flexible base.

    Science.gov (United States)

    2013-04-01

    This document provides information on mixture design requirements for a flexible base course. Sections: design requirements, job mix formula, contractor's responsibility, and engineer's responsibility. Tables: material requirements; requirements fo...

  17. Using mixture design of experiments to assess the environmental impact of clay-based structural ceramics containing foundry wastes

    Energy Technology Data Exchange (ETDEWEB)

    Coronado, M. [Department of Chemistry and Process and Resources Engineering, University of Cantabria, 39005 Santander (Spain); Department of Materials and Ceramics Engineering (CICECO), University of Aveiro, 3810-193 Aveiro (Portugal); Segadães, A.M. [Department of Materials and Ceramics Engineering (CICECO), University of Aveiro, 3810-193 Aveiro (Portugal); Andrés, A., E-mail: andresa@unican.es [Department of Chemistry and Process and Resources Engineering, University of Cantabria, 39005 Santander (Spain)

    2015-12-15

    Highlights: • Modelling of the environmental risk in terms of clay and by-products contents. • M-DoE and response surface plots enable quick comparison of three ceramic processes. • Basicity of the mixture increases the leaching, especially at low firing temperatures. • Liquid phase content plays a major role decreasing the leaching of Cr and Mo. • Together, M-DoE and phase diagrams enable better prediction of pollutants leaching. - Abstract: This work describes the leaching behavior of potentially hazardous metals from three different clay-based industrial ceramic products (wall bricks, roof tiles, and face bricks) containing foundry sand dust and Waelz slag as alternative raw materials. For each product, ten mixtures were defined by mixture design of experiments and the leaching of As, Ba, Cd, Cr, Cu, Mo, Ni, Pb, and Zn was evaluated in pressed specimens fired simulating the three industrial ceramic processes. The results showed that, despite the chemical, mineralogical and processing differences, only chromium and molybdenum were not fully immobilized during ceramic processing. Their leaching was modeled as polynomial equations, functions of the raw materials contents, and plotted as response surfaces. This made evident that Cr and Mo leaching from the fired products is not only dependent on the corresponding contents and the basicity of the initial mixtures, but is also clearly related to the mineralogical composition of the fired products, namely the amount of the glassy phase, which depends on both the major oxides contents and the firing temperature.

  18. Design and analysis of experiments with SAS

    CERN Document Server

    Lawson, John

    2010-01-01

    Introduction; Statistics and Data Collection; Beginnings of Statistically Planned Experiments; Definitions and Preliminaries; Purposes of Experimental Design; Types of Experimental Designs; Planning Experiments; Performing the Experiments; Use of SAS Software; Completely Randomized Designs with One Factor; Introduction; Replication and Randomization; A Historical Example; Linear Model for Completely Randomized Design (CRD); Verifying Assumptions of the Linear Model; Analysis Strategies When Assumptions Are Violated; Determining the Number of Replicates; Comparison of Treatments after the F-Test; Factorial Designs

  19. Design of modern experiments (revised version)

    International Nuclear Information System (INIS)

    Park, Sung Hweon

    1984-03-01

    This book covers the design of modern experiments. It includes the conception of design of experiments, key statistical theory, one-way designs, two-way layouts without and with repetition, multi-way layouts and the analysis of enumerated data, partition, correlation and regression analysis, Latin squares, factorial designs, design of experiments using tables of orthogonal arrays I and II, incomplete block designs, response surface designs, compound experiment designs, EVOP and the method of steepest ascent or descent, and Taguchi design of experiments.

  20. Statistical mechanics of binary mixture adsorption in metal-organic frameworks in the osmotic ensemble

    Science.gov (United States)

    Dunne, Lawrence J.; Manos, George

    2018-03-01

    Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over a wide range of gas-phase compositions, mechanical pressures and temperatures is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs) treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of adsorption isotherms of CO2 and CH4 adsorption in MIL-53(Al). Binary mixture isotherms and co-adsorption-phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; the wide model parameter range which reproduces this behaviour suggests that it is generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes to isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue 'Modern theoretical chemistry'.
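
    To give a flavour of the transfer-matrix machinery, the sketch below solves a far simpler cousin of the paper's model: a grand-canonical one-dimensional lattice gas with nearest-neighbour interactions, whose coverage follows from the largest eigenvalue of a 2x2 transfer matrix. It does not implement the osmotic-ensemble MOF model itself.

    ```python
    # Coverage of a 1D nearest-neighbour lattice gas from the largest
    # eigenvalue of its 2x2 transfer matrix (grand canonical ensemble).
    # A minimal illustration of the method, not the paper's MOF model.
    import numpy as np

    def coverage(mu, eps, beta=1.0, dmu=1e-5):
        """theta = d(ln lambda_max)/d(beta*mu), by central difference."""
        def log_lam(m):
            n = np.array([0.0, 1.0])
            T = np.exp(beta * (m * (n[:, None] + n[None, :]) / 2
                               + eps * n[:, None] * n[None, :]))
            return np.log(np.max(np.linalg.eigvalsh(T)))
        return (log_lam(mu + dmu) - log_lam(mu - dmu)) / (2 * beta * dmu)

    # With eps = 0 this reduces to the Langmuir isotherm (theta = 0.5 at mu = 0)
    for mu in [-3.0, -1.0, 0.0, 1.0, 3.0]:
        print(f"mu = {mu:+.1f}  theta = {coverage(mu, eps=0.5):.3f}")
    ```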

  1. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  2. Quantum-statistical mechanics of an atom-dimer mixture: Lee-Yang cluster expansion approach

    International Nuclear Information System (INIS)

    Ohkuma, Takahiro; Ueda, Masahito

    2006-01-01

    We use the Lee-Yang cluster expansion method to study quantum-statistical properties of a mixture of interconvertible atoms and dimers, where the dimers form in a two-body bound state of the atoms. We point out an infinite series of cluster diagrams whose summation leads to the Bose-Einstein condensation of the dimers below a critical temperature. Our theory captures some important features of a cold atom-dimer mixture such as interconversion of atoms and dimers and properties of the mixture at the unitarity limit

  3. Optimization of fruit punch using mixture design.

    Science.gov (United States)

    Kumar, S Bharath; Ravi, R; Saraswathi, G

    2010-01-01

    A highly acceptable dehydrated fruit punch was developed with selected fruits, namely lemon, orange, and mango, using a mixture design and optimization technique. The fruit juices were freeze dried, powdered, and used in the reconstitution studies. Fruit punches were prepared according to the experimental design combinations (total 10) based on a mixture design and then subjected to sensory evaluation for acceptability. Response surfaces of sensory attributes were also generated as a function of fruit juices. Analysis of data revealed that the fruit punch prepared using 66% of mango, 33% of orange, and 1% of lemon had highly desirable sensory scores for color (6.00), body (5.92), sweetness (5.68), and pleasantness (5.94). The aroma pattern of individual as well as combinations of fruit juices were also analyzed by electronic nose. The electronic nose could discriminate the aroma patterns of individual as well as fruit juice combinations by mixture design. The results provide information on the sensory quality of best fruit punch formulations liked by the consumer panel based on lemon, orange, and mango.

  4. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    Science.gov (United States)

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with focus on reproducibility. A statistical design of 32 experiments was performed and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams'E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 degrees C was better than 22 degrees C. After thawing, 1 h was better than 3 h of preculture. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient for quickly determining optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  5. Optimising mechanical strength and bulk density of dry ceramic bodies through mixture design

    OpenAIRE

    Correia, S. L.; Hotza, D.; Segadães, A. M.

    2005-01-01

    In industrial practice, it is desirable to be able to predict, in an expeditious way, what the effects of a change in raw materials or the proportions thereof might be in the various processing steps towards the final product. When the property of interest is basically determined by the combination (or mixture) of raw materials, an optimisation methodology specific to the design of mixture experiments can be successfully used. In the present study, dry bending strength and bulk density were s...

  6. Optimal designs for linear mixture models

    NARCIS (Netherlands)

    Mendieta, E.J.; Linssen, H.N.; Doornbos, R.

    1975-01-01

    In a recent paper Snee and Marquardt (1974) considered designs for linear mixture models, where the components are subject to individual lower and/or upper bounds. When the number of components is large their algorithm XVERT yields designs far too extensive for practical purposes. The purpose of

  7. Optimization of β-casein stabilized nanoemulsions using experimental mixture design.

    Science.gov (United States)

    Maher, Patrick G; Fenelon, Mark A; Zhou, Yankun; Kamrul Haque, Md; Roos, Yrjö H

    2011-10-01

    The objective of this study was to determine the effect of changing viscosity and glass transition temperature in the continuous phase of nanoemulsion systems on subsequent stability. Formulations comprising β-casein (2.5%, 5%, 7.5%, and 10% w/w), lactose (0% to 20% w/w), and trehalose (0% to 20% w/w) were generated from Design of Experiments (DOE) software and tested for glass transition temperature and onset of ice-melting temperature in the maximally freeze-concentrated state (T(g)' & T(m)'), and viscosity (μ). Increasing β-casein content resulted in significant (P < 0.05) effects on these properties. A mixture design was used to predict the optimum levels of lactose and trehalose required to attain the minimum and maximum T(g)' and viscosity in solution at fixed protein contents. These mixtures were used to form the continuous phase of β-casein stabilized nanoemulsions (10% w/w sunflower oil) prepared by microfluidization at 70 MPa. Nanoemulsions were analyzed for T(g)' & T(m)', as well as viscosity, mean particle size, and stability. Increasing levels of β-casein (2.5% to 10% w/w) had a significant (P < 0.05) effect, and the mixture DOE was successfully used to predict glass transition and rheological properties for development of a continuous phase for use in nanoemulsions. © 2011 Institute of Food Technologists®

  8. Optimal designs for linear mixture models

    NARCIS (Netherlands)

    Mendieta, E.J.; Linssen, H.N.; Doornbos, R.

    1975-01-01

    In a recent paper Snee and Marquardt [8] considered designs for linear mixture models, where the components are subject to individual lower and/or upper bounds. When the number of components is large their algorithm XVERT yields designs far too extensive for practical purposes. The purpose of this

  9. Development of grout formulations for 106-AN waste: Mixture-experiment results and analysis

    International Nuclear Information System (INIS)

    Spence, R.D.; McDaniel, E.W.; Anderson, C.M.; Lokken, R.O.; Piepel, G.F.

    1993-09-01

    Twenty potential ingredients were identified for use in developing a 106-AN grout formulation, and 18 were subsequently obtained and tested. Four ingredients -- Type II-LA (moderate heat of hydration) Portland cement, Class F fly ash, attapulgite 150 drilling clay, and ground air-cooled blast-furnace slag (GABFS) -- were selected for developing the 106-AN grout formulations. A mixture experiment was designed and conducted around the following formulation: 2.5 lb of cement per gallon, 1.2 lb of fly ash per gallon, 0.8 lb of attapulgite per gallon, and 3.5 lb of GABFS per gallon. Reduced empirical models were generated from the results of the mixture experiment. These models were used to recommend several grout formulations for 106-AN. Westinghouse Hanford Company selected one of these formulations to be verified for use with 106-AN and a backup formulation in case problems arise with the first choice. This report presents the mixture-experimental results and leach data

  10. Diameter optimization of VLS-synthesized ZnO nanowires, using statistical design of experiment

    International Nuclear Information System (INIS)

    Shafiei, Sepideh; Nourbakhsh, Amirhasan; Ganjipour, Bahram; Zahedifar, Mostafa; Vakili-Nezhaad, Gholamreza

    2007-01-01

    The possibility of diameter optimization of ZnO nanowires by using statistical design of experiment (DoE) is investigated. In this study, nanowires were synthesized using a vapor-liquid-solid (VLS) growth method in a horizontal reactor. The effects of six synthesis parameters (synthesis time, synthesis temperature, thickness of gold layer, distance between ZnO holder and substrate, mass of ZnO and Ar flow rate) on the average diameter of a ZnO nanowire were examined using the fractional factorial design (FFD) coupled with response surface methodology (RSM). Using a 2^(6-3) resolution III FFD, the main effects of the thickness of the gold layer, synthesis temperature and synthesis time were concluded to be the key factors influencing the diameter. Then Box-Behnken design (BBD) was exploited to create a response surface from the main factors. The total number of required runs for the DoE process is 25, 8 runs for FFD parameter screening and 17 runs for the response surface obtained by BBD. Three extra runs are done to confirm the predicted results
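
    An eight-run 2^(6-3) resolution III fractional factorial studies six two-level factors by aliasing three of them onto interaction columns of a base 2^3 design. The sketch below uses the common generators D = AB, E = AC, F = BC; the abstract does not state which generators the study used, so this is only one valid construction.

    ```python
    # One construction of a 2^(6-3) resolution III fractional factorial
    # for six synthesis factors, using generators D = AB, E = AC, F = BC.
    # The study's actual generators are not given in the abstract.
    from itertools import product

    names = ["time", "temp", "Au", "dist", "ZnO", "Ar"]
    print(" ".join(f"{n:>5s}" for n in names))
    for a, b, c in product([-1, 1], repeat=3):
        run = (a, b, c, a * b, a * c, b * c)   # eight coded runs in total
        print(" ".join(f"{v:>5d}" for v in run))
    ```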

  11. Morphology optimization of CCVD-synthesized multiwall carbon nanotubes, using statistical design of experiments

    International Nuclear Information System (INIS)

    Nourbakhsh, Amirhasan; Ganjipour, Bahram; Zahedifar, Mostafa; Arzi, Ezatollah

    2007-01-01

    The possibility of optimization of morphological features of multiwall carbon nanotubes (MWCNTs) using the statistical design of experiments (DoE) is investigated. In this study, MWCNTs were synthesized using a catalytic chemical vapour deposition (CCVD) method in a horizontal reactor using acetylene as the carbon source. The effects of six synthesis parameters (synthesis time, synthesis temperature, catalyst mass, reduction time, acetylene flow rate and hydrogen flow rate) on the average diameter and mean rectilinear length (MRL) of carbon nanotubes were examined using fractional-factorial design (FFD) coupled with response surface methodology (RSM). Using a 2^(6-3) resolution III FFD, the main effects of reaction temperature, hydrogen flow rate and chemical reduction time were concluded to be the key factors influencing the diameter and MRL of MWCNTs; then Box-Behnken design (BBD) was exploited to create a response surface from the main factors. The total number of required runs is 26: 8 runs are for FFD parameter screening, 17 runs are for the response surface obtained by the BBD, and the final run is used to confirm the predicted results

  12. Optimization and evaluation of clarithromycin floating tablets using experimental mixture design.

    Science.gov (United States)

    Uğurlu, Timucin; Karaçiçek, Uğur; Rayaman, Erkan

    2014-01-01

    The purpose of the study was to prepare and evaluate clarithromycin (CLA) floating tablets using experimental mixture design for treatment of Helicobacter pylori, providing prolonged gastric residence time and a controlled plasma level. Ten different formulations were generated based on different molecular weights of hypromellose (HPMC K100, K4M, K15M) by using a simplex lattice design (a sub-class of mixture design) with Minitab 16 software. Sodium bicarbonate and anhydrous citric acid were used as gas generating agents. Tablets were prepared by wet granulation technique. All of the process variables were fixed. Results of cumulative drug release at the 8th h (CDR 8th) were statistically analyzed to get the optimized formulation (OF). The optimized formulation, which gave a floating lag time lower than 15 s and a total floating time of more than 10 h, was analyzed and compared with the target for CDR 8th (80%). A good agreement was shown between predicted and actual values of CDR 8th with a variation lower than 1%. The activity of the clarithromycin contained in the optimized formula against H. pylori was quantified using a well diffusion agar assay. Diameters of inhibition zones vs. log10 clarithromycin concentrations were plotted in order to obtain a standard curve and determine clarithromycin activity.

  13. Graphical evaluation of the ridge-type robust regression estimators in mixture experiments.

    Science.gov (United States)

    Erkoc, Ali; Emiroglu, Esra; Akay, Kadri Ulas

    2014-01-01

    In mixture experiments, estimation of the parameters is generally based on ordinary least squares (OLS). However, in the presence of multicollinearity and outliers, OLS can result in very poor estimates. In this case, effects due to the combined outlier-multicollinearity problem can be reduced to certain extent by using alternative approaches. One of these approaches is to use biased-robust regression techniques for the estimation of parameters. In this paper, we evaluate various ridge-type robust estimators in the cases where there are multicollinearity and outliers during the analysis of mixture experiments. Also, for selection of biasing parameter, we use fraction of design space plots for evaluating the effect of the ridge-type robust estimators with respect to the scaled mean squared error of prediction. The suggested graphical approach is illustrated on Hald cement data set.
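
    At the heart of any ridge-type estimator is the shrinkage formula beta(k) = (X'X + kI)^(-1) X'y, with k the biasing parameter whose choice the paper evaluates graphically. The numpy sketch below shows this core on a deliberately collinear toy problem; the robust (outlier-resistant) weighting and the fraction-of-design-space plots are omitted.

    ```python
    # Core ridge estimator on a collinear toy problem; the robust weighting
    # and fraction-of-design-space evaluation from the paper are omitted.
    import numpy as np

    def ridge(X, y, k):
        """beta(k) = (X'X + kI)^(-1) X'y."""
        return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

    rng = np.random.default_rng(2)
    X = rng.normal(size=(20, 4))
    X[:, 3] = X[:, 2] + rng.normal(0, 0.01, 20)    # near-collinear columns
    y = X @ np.array([1.0, 0.5, 2.0, 0.0]) + rng.normal(0, 0.1, 20)

    for k in [0.0, 0.1, 1.0]:                      # biasing parameter values
        print(f"k = {k:<4}", np.round(ridge(X, y, k), 3))
    ```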

  14. West Valley high-level nuclear waste glass development: a statistically designed mixture study

    Energy Technology Data Exchange (ETDEWEB)

    Chick, L.A.; Bowen, W.M.; Lokken, R.O.; Wald, J.W.; Bunnell, L.R.; Strachan, D.M.

    1984-10-01

    The first full-scale conversion of high-level commercial nuclear wastes to glass in the United States will be conducted at West Valley, New York, by West Valley Nuclear Services Company, Inc. (WVNS), for the US Department of Energy. Pacific Northwest Laboratory (PNL) is supporting WVNS in the design of the glass-making process and the chemical formulation of the glass. This report describes the statistically designed study performed by PNL to develop the glass composition recommended for use at West Valley. The recommended glass contains 28 wt% waste, as limited by process requirements. The waste loading and the silica content (45 wt%) are similar to those in previously developed waste glasses; however, the new formulation contains more calcium and less boron. A series of tests verified that the increased calcium results in improved chemical durability and does not adversely affect the other modeled properties. The optimization study assessed the effects of seven oxide components on glass properties. Over 100 melts combining the seven components into a wide variety of statistically chosen compositions were tested. Viscosity, electrical conductivity, thermal expansion, crystallinity, and chemical durability were measured and empirically modeled as a function of the glass composition. The mathematical models were then used to predict the optimum formulation. This glass was tested and adjusted to arrive at the final composition recommended for use at West Valley. 56 references, 49 figures, 18 tables.

  15. Leads Detection Using Mixture Statistical Distribution Based CRF Algorithm from Sentinel-1 Dual Polarization SAR Imagery

    Science.gov (United States)

    Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting

    2017-04-01

    Synthetic Aperture Radar (SAR) is significantly important for polar remote sensing since it can provide continuous observations in all days and all weather. SAR can be used for extracting the surface roughness information characterized by the variance of dielectric properties and different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the Chinese National Antarctic Research Expedition (CHINARE) 33rd cruise set sail into the sea ice zone in Antarctica. An accurate spatial distribution of leads in the sea ice zone is essential for routine planning of ship navigation. In this study, the semantic relationship between leads and sea ice categories has been described by the Conditional Random Fields (CRF) model, and leads characteristics have been modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture statistical distribution based CRF is developed by considering the contextual information and the statistical characteristics of sea ice for improving leads detection in Sentinel-1A dual polarization SAR imagery. The unary potential and pairwise potential in the CRF model are constructed by integrating the posterior probability estimated from statistical distributions. For mixture statistical distribution parameter estimation, the Method of Logarithmic Cumulants (MoLC) is exploited for single statistical distribution parameter estimation. The iteration-based Expectation Maximization (EM) algorithm is investigated to calculate the parameters in the mixture statistical distribution based CRF model. In the posterior probability inference, a graph-cut energy minimization method is adopted in the initial leads detection. Post-processing procedures including aspect ratio constraints and spatial smoothing approaches are utilized to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a

  16. Design and verification of bituminous mixtures with the increased content of reclaimed asphalt pavement

    Science.gov (United States)

    Bańkowski, Wojciech; Król, Jan; Gałązka, Karol; Liphardt, Adam; Horodecka, Renata

    2018-05-01

    Recycling of bituminous pavements is an issue increasingly being discussed in Poland. Analysis of domestic and foreign experience indicates a need to develop this technology in our country, in particular hot feeding and production technologies. Various steps are being taken in this direction, including research projects. One of them is the InnGA project entitled: “Reclaimed asphalt pavement: Innovative technology of bituminous mixtures using material from reclaimed asphalt pavement”. The paper presents the results of research on the design of bituminous mixtures that meet the required properties while exceeding the reclaimed asphalt content permitted by the technical guidelines. It presents selected bituminous mixtures with a RAP content of up to 50% and the results of tests verifying the industrial production of those mixtures. The article details the design process of mixtures with a high content of reclaimed asphalt and the production trials carried out, and discusses the test results from the verification of industrial production. Testing included basic tests according to the Polish technical requirements WT-2 as well as extended functional testing. The tests and analyses carried out helped to determine the usefulness of the developed bituminous mixtures for use in experimental sections and confirmed the possibility of using an increased amount of reclaimed asphalt, up to 50%, in mixtures intended for the construction of national roads.

  17. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB

  18. Experimental design of mixture applied to study PVP hydrogels properties crosslinked by ionizing radiation

    Energy Technology Data Exchange (ETDEWEB)

    Alcantara, Mara Tania S.; Lugao, Ademar B., E-mail: maratalcantara@uol.com.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Taqueda, Maria Elena S. [Universidade de Sao Paulo (USP), SP (Brazil). Escola Politecnica. Dept. de Engenharia Quimica

    2009-07-01

    Hydrogels are three-dimensional hydrophilic crosslinked polymeric networks that have the capacity to swell by absorbing water or biological fluids without dissolving. Hydrogels have been widely used in application fields ranging from agriculture and industry to biomedicine. The properties of a hydrogel are extremely important in selecting which materials are suitable for a specific application, so mixtures can offer hydrogels with different properties for different applications. The PVP hydrogels were prepared from an aqueous polymer solution crosslinked by gamma irradiation, an effective and simple method for hydrogel formation that offers some advantages over other techniques. In this work, a mixture experimental design was used to study the relationship between polymer cross-linking and the swelling properties of PVP hydrogels with PEG as plasticizer and agar as gelling agent. The gel fraction was measured for every mixture specified by the D-optimal experimental design. (author)

  19. Experimental design of mixture applied to study PVP hydrogels properties crosslinked by ionizing radiation

    International Nuclear Information System (INIS)

    Alcantara, Mara Tania S.; Lugao, Ademar B.; Taqueda, Maria Elena S.

    2009-01-01

    Hydrogels are three-dimensional hydrophilic crosslinked polymeric networks that have the capacity to swell by absorbing water or biological fluids without dissolving. Hydrogels have been widely used in application fields ranging from agriculture and industry to biomedicine. The properties of a hydrogel are extremely important in selecting which materials are suitable for a specific application, so mixtures can offer hydrogels with different properties for different applications. The PVP hydrogels were prepared from an aqueous polymer solution crosslinked by gamma irradiation, an effective and simple method for hydrogel formation that offers some advantages over other techniques. In this work, a mixture experimental design was used to study the relationship between polymer cross-linking and the swelling properties of PVP hydrogels with PEG as plasticizer and agar as gelling agent. The gel fraction was measured for every mixture specified by the D-optimal experimental design. (author)

  20. Fundamentals of statistical experimental design and analysis

    CERN Document Server

    Easterling, Robert G

    2015-01-01

    Professionals in all areas - business; government; the physical, life, and social sciences; engineering; medicine, etc. - benefit from using statistical experimental design to better understand their worlds and then use that understanding to improve the products, processes, and programs they are responsible for. This book aims to provide the practitioners of tomorrow with a memorable, easy to read, engaging guide to statistics and experimental design. This book uses examples, drawn from a variety of established texts, and embeds them in a business or scientific context, seasoned with a dash of humor, to emphasize the issues and ideas that led to the experiment and the what-do-we-do-next? steps after the experiment. Graphical data displays are emphasized as means of discovery and communication and formulas are minimized, with a focus on interpreting the results that software produce. The role of subject-matter knowledge, and passion, is also illustrated. The examples do not require specialized knowledge, and t...

  1. A turbulence model in mixtures. First part: Statistical description of mixture

    International Nuclear Information System (INIS)

    Besnard, D.

    1987-03-01

    The classical theory of mixtures gives a model for molecular mixtures. This kind of model is based on a small-gradient approximation for concentration, temperature, and pressure. We present here a mixture model allowing for large gradients in the flow. We also show that, with a local balance assumption between material diffusion and the evolution of flow gradients, we obtain a model similar to those mentioned above [fr]

  2. Optimization of phase feeding of starter, grower, and finisher diets for male broilers by mixture experimental design: forty-eight-day production period.

    Science.gov (United States)

    Roush, W B; Boykin, D; Branton, S L

    2004-08-01

    A mixture experiment, a variant of response surface methodology, was designed to determine the proportion of time to feed broiler starter (23% protein), grower (20% protein), and finisher (18% protein) diets to optimize production and processing variables based on a total production time of 48 d. Mixture designs are useful for proportion problems where the components of the experiment (i.e., the lengths of time the diets were fed) sum to unity (48 d). The experiment was conducted with day-old male Ross x Ross broiler chicks. The birds were placed 50 birds per pen in each of 60 pens. The experimental design was a 10-point augmented simplex-centroid (ASC) design with 6 replicates of each point. Each design point represented the portion(s) of the 48 d that each of the diets was fed. Formulation of the diets was based on NRC standards. At 49 d, each pen of birds was evaluated for production data including BW, feed conversion, and cost of feed consumed. Then, 6 birds were randomly selected from each pen for processing data. Processing variables included live weight, hot carcass weight, dressing percentage, fat pad percentage, and breast yield (pectoralis major and pectoralis minor weights). Production and processing data were fit to simplex regression models. Model terms determined not to be significant (P > 0.05) were removed. The models were found to be statistically adequate for analysis of the response surfaces. A compromise solution was calculated based on optimal constraints designated for the production and processing data. The results indicated that broilers fed a starter and finisher diet for 30 and 18 d, respectively, would meet the production and processing constraints. Trace plots showed that the production and processing variables were not very sensitive to the grower diet.
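
    For three mixture components, a 10-point augmented simplex-centroid design consists of the three vertices, the three 50:50 binary blends, the overall centroid, and three interior axial points. The sketch below enumerates these points and scales them to days on each diet out of 48; it reproduces the design type named in the abstract, not the study's responses.

    ```python
    # 10-point augmented simplex-centroid design for three components
    # (fractions of a 48-d period on starter/grower/finisher diets).
    from fractions import Fraction as F

    pure     = [(F(1), F(0), F(0)), (F(0), F(1), F(0)), (F(0), F(0), F(1))]
    binary   = [(F(1, 2), F(1, 2), F(0)), (F(1, 2), F(0), F(1, 2)),
                (F(0), F(1, 2), F(1, 2))]
    centroid = [(F(1, 3), F(1, 3), F(1, 3))]
    axial    = [(F(2, 3), F(1, 6), F(1, 6)), (F(1, 6), F(2, 3), F(1, 6)),
                (F(1, 6), F(1, 6), F(2, 3))]

    for x in pure + binary + centroid + axial:      # 10 design points
        days = tuple(round(float(xi) * 48, 1) for xi in x)
        print(tuple(str(xi) for xi in x), "->", days, "d")
    ```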

  3. DESIGNS FOR MIXTURE AND PROCESS VARIABLES APPLIED IN TABLET FORMULATIONS

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    Although there are several methods for the construction of a design for process variables and mixture variables, there are not very many methods which are suitable to combine mixture and process variables in one design. Some of the methods which are feasible will be shown. These methods will be

  4. Robust optimization of the output voltage of nanogenerators by statistical design of experiments

    KAUST Repository

    Song, Jinhui

    2010-09-01

    Nanogenerators were first demonstrated by deflecting aligned ZnO nanowires using a conductive atomic force microscopy (AFM) tip. The output of a nanogenerator is affected by three parameters: tip normal force, tip scanning speed, and tip abrasion. In this work, systematic experimental studies have been carried out to examine the combined effects of these three parameters on the output, using statistical design of experiments. A statistical model has been built to analyze the data and predict the optimal parameter settings. For an AFM tip of cone angle 70° coated with Pt, and ZnO nanowires with a diameter of 50 nm and lengths of 600 nm to 1 μm, the optimized parameters for the nanogenerator were found to be a normal force of 137 nN and scanning speed of 40 μm/s, rather than the conventional settings of 120 nN for the normal force and 30 μm/s for the scanning speed. A nanogenerator with the optimized settings has three times the average output voltage of one with the conventional settings. © 2010 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.

  5. Robust optimization of the output voltage of nanogenerators by statistical design of experiments

    KAUST Repository

    Song, Jinhui; Xie, Huizhi; Wu, Wenzhuo; Roshan Joseph, V.; Jeff Wu, C. F.; Wang, Zhong Lin

    2010-01-01

    Nanogenerators were first demonstrated by deflecting aligned ZnO nanowires using a conductive atomic force microscopy (AFM) tip. The output of a nanogenerator is affected by three parameters: tip normal force, tip scanning speed, and tip abrasion. In this work, systematic experimental studies have been carried out to examine the combined effects of these three parameters on the output, using statistical design of experiments. A statistical model has been built to analyze the data and predict the optimal parameter settings. For an AFM tip of cone angle 70° coated with Pt, and ZnO nanowires with a diameter of 50 nm and lengths of 600 nm to 1 μm, the optimized parameters for the nanogenerator were found to be a normal force of 137 nN and scanning speed of 40 μm/s, rather than the conventional settings of 120 nN for the normal force and 30 μm/s for the scanning speed. A nanogenerator with the optimized settings has three times the average output voltage of one with the conventional settings. © 2010 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.

  6. Humidifying system design of PEMFC test platform based on the mixture of dry and wet air

    Directory of Open Access Journals (Sweden)

    Tiancai Ma

    2015-01-01

    Based on the present humidifying system of the PEMFC test platform, a novel design based on the mixture of dry and wet air is proposed. Key parameters are calculated, and a test platform is built. Three experiments are implemented to test the performance of the proposed design. Results show that the new design meets the requirements and achieves quick response and accurate control.

  7. Multivariate spatial Gaussian mixture modeling for statistical clustering of hemodynamic parameters in functional MRI

    International Nuclear Information System (INIS)

    Fouque, A.L.; Ciuciu, Ph.; Risser, L.; Fouque, A.L.; Ciuciu, Ph.; Risser, L.

    2009-01-01

    In this paper, a novel statistical parcellation of intra-subject functional MRI (fMRI) data is proposed. The key idea is to identify functionally homogeneous regions of interest from their hemodynamic parameters. To this end, a non-parametric voxel-based estimation of the hemodynamic response function is performed as a prerequisite. Then, the extracted hemodynamic features are entered as the input data of a Multivariate Spatial Gaussian Mixture Model (MSGMM) to be fitted. The goal of the spatial aspect is to favor the recovery of connected components in the mixture. Our statistical clustering approach is original in the sense that it extends existing work done on univariate spatially regularized Gaussian mixtures. A specific Gibbs sampler is derived to account for different covariance structures in the feature space. On realistic artificial fMRI datasets, it is shown that our algorithm is helpful for identifying a parsimonious functional parcellation required in the context of joint detection-estimation of brain activity. This allows us to overcome the classical assumption of spatial stationarity of the BOLD signal model. (authors)
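
    At the core of the approach is a multivariate Gaussian mixture fitted to voxel-wise hemodynamic features. The sketch below is a bare-bones EM fit of a plain (non-spatial) Gaussian mixture on synthetic data; the spatial regularization and the Gibbs sampler that distinguish the paper's MSGMM are deliberately omitted.

    ```python
    # Bare-bones EM for a multivariate Gaussian mixture; illustrates only
    # the mixture-model core, not the paper's spatially regularized MSGMM.
    import numpy as np
    from scipy.stats import multivariate_normal

    def em_gmm(X, K, n_iter=50, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        pi = np.full(K, 1.0 / K)
        mu = X[rng.choice(n, K, replace=False)]
        cov = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
        for _ in range(n_iter):
            # E-step: responsibilities of each component for each point
            r = np.column_stack([pi[k] * multivariate_normal.pdf(X, mu[k], cov[k])
                                 for k in range(K)])
            r /= r.sum(axis=1, keepdims=True)
            # M-step: update weights, means, covariances
            nk = r.sum(axis=0)
            pi = nk / n
            mu = (r.T @ X) / nk[:, None]
            for k in range(K):
                xc = X - mu[k]
                cov[k] = (r[:, k, None] * xc).T @ xc / nk[k] + 1e-6 * np.eye(d)
        return pi, mu, cov

    # Two synthetic 2-D clusters standing in for hemodynamic features
    X = np.vstack([np.random.default_rng(1).normal(0, 1, (200, 2)),
                   np.random.default_rng(2).normal(4, 1, (200, 2))])
    pi, mu, cov = em_gmm(X, K=2)
    print(np.round(pi, 2), np.round(mu, 2))
    ```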

  8. Supercritical Water Mixture (SCWM) Experiment

    Science.gov (United States)

    Hicks, Michael C.; Hegde, Uday G.

    2012-01-01

    The subject presentation, entitled Supercritical Water Mixture (SCWM) Experiment, was presented at the International Space Station (ISS) Increment 33/34 Science Symposium. This presentation provides an overview of an international collaboration between NASA and CNES to study the behavior of a dilute aqueous solution of Na2SO4 (5% w) at near-critical conditions. The Supercritical Water Mixture (SCWM) investigation serves as important precursor work for subsequent Supercritical Water Oxidation (SCWO) experiments. The SCWM investigation will be performed in DECLIC's High Temperature Insert (HTI) for the purpose of studying critical fluid phenomena at high temperatures and pressures. The HTI includes a completely sealed and integrated test cell (i.e., the Sample Cell Unit, SCU) that will contain approximately 0.3 ml of the aqueous test solution. During the sequence of tests, scheduled to be performed in FY13, temperatures and pressures will be elevated to critical conditions (i.e., Tc = 374 °C and Pc = 22 MPa) in order to observe salt precipitation, precipitate agglomeration and precipitate transport in the presence of a temperature gradient without the influence of gravitational forces. This presentation provides an overview of the motivation for this work, a description of the DECLIC HTI hardware, the proposed test sequences, and a brief discussion of the scientific research objectives.

  9. JET experiments with tritium and deuterium–tritium mixtures

    NARCIS (Netherlands)

    Horton, L.; Batistoni, P.; Boyer, H.; Challis, C.; Ciric, D.; Donne, A. J. H.; Eriksson, L. G.; Garcia, J.; Garzotti, L.; Gee, S.; Hobirk, J.; Joffrin, E.; Jones, T.; King, D. B.; Knipe, S.; Litaudon, X.; Matthews, G. F.; Monakhov, I.; Murari, A.; Nunes, I.; Riccardo, V.; Sips, A. C. C.; Warren, R.; Weisen, H.; Zastrow, K. D.

    2016-01-01

    Extensive preparations are now underway for an experiment in the Joint European Torus (JET) using tritium and deuterium–tritium mixtures. The goals of this experiment are described as well as the progress that has been made in developing plasma operational scenarios and physics reference pulses for

  10. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    Science.gov (United States)

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase-change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design of experiment methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.

  11. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), the exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
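
    The CPI arithmetic itself is simple: at each locus, the probability of inclusion is the squared sum of the frequencies of all alleles observed in the mixture, loci are combined by multiplication, and CPE = 1 - CPI. The sketch below implements this textbook formula with invented allele frequencies; it is not the article's full interpretation protocol.

    ```python
    # Textbook CPI/CPE arithmetic: PI per locus = (sum of observed-allele
    # frequencies)^2; loci combine by multiplication; CPE = 1 - CPI.
    # Allele frequencies below are made up for illustration.
    from math import prod

    def cpi_cpe(locus_allele_freqs):
        """locus_allele_freqs: list of lists of observed-allele frequencies."""
        pi_per_locus = [sum(freqs) ** 2 for freqs in locus_allele_freqs]
        cpi = prod(pi_per_locus)
        return cpi, 1 - cpi   # (combined inclusion, combined exclusion)

    mixture = [
        [0.12, 0.08, 0.20],   # locus 1: three alleles observed
        [0.10, 0.25],         # locus 2: two alleles observed
        [0.05, 0.15, 0.30],   # locus 3: three alleles observed
    ]
    cpi, cpe = cpi_cpe(mixture)
    print(f"CPI = {cpi:.6f}, CPE = {cpe:.6f}")
    ```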

  12. Development of grout formulations for 106-AN waste: Mixture-experiment results and analysis

    International Nuclear Information System (INIS)

    Spence, R.D.; McDaniel, E.W.; Anderson, C.M.; Lokken, R.O.; Piepel, G.F.

    1993-09-01

    Twenty potential ingredients were identified for use in developing a 106-AN grout formulation, and 18 were subsequently obtained and tested. Four ingredients -- Type II-LA (moderate heat of hydration) Portland cement, Class F fly ash, attapulgite 150 drilling clay, and ground air-cooled blast-furnace slag (GABFS) -- were selected for developing the 106-AN grout formulations. A mixture experiment was designed and conducted around the following formulation: 2.5 lb of cement per gallon, 1.2 lb of fly ash per gallon, 0.8 lb of attapulgite per gallon, and 3.5 lb of GABFS per gallon. Reduced empirical models were generated from the results of the mixture experiment. These models were used to recommend several grout formulations for 106-AN. Westinghouse Hanford Company selected one of these formulations to be verified for use with 106-AN and a backup formulation in case problems arise with the first choice

  13. Optimization of Asphalt Mixture Design for the Louisiana ALF Test Sections

    Science.gov (United States)

    2018-05-01

    This research presents an extensive study on the design and characterization of asphalt mixtures used in road pavements. Both mixture volumetrics and physical properties obtained from several laboratory tests were considered in optimizing the mixture...

  14. Qualitative criteria and thresholds for low noise asphalt mixture design

    Science.gov (United States)

    Vaitkus, A.; Andriejauskas, T.; Gražulytė, J.; Šernas, O.; Vorobjovas, V.; Kleizienė, R.

    2018-05-01

    Low noise asphalt pavements are a cost-efficient and cost-effective alternative for road traffic noise mitigation compared with noise barriers, façade insulation and other known noise mitigation measures. However, the design of low noise asphalt mixtures strongly depends on the climate and traffic peculiarities of different regions. Severe climate regions face problems related to the short durability of low noise asphalt mixtures, owing to the considerable negative impact of harsh climate conditions (freeze-thaw, large temperature fluctuations, hydrological behaviour, etc.) and traffic (traffic loads, traffic volumes, studded tyres, etc.). Thus there is a need to find a balance between mechanical and acoustical durability as well as to ensure adequate pavement skid resistance for road safety purposes. The paper presents an analysis of the qualitative criteria and design parameter thresholds of low noise asphalt mixtures. Different asphalt mixture composition materials (grading, aggregate, binder, additives, etc.) and relevant asphalt layer properties (air void content, texture, evenness, degree of compaction, etc.) were investigated and assessed according to their suitability for durable and effective low noise pavements. The paper concludes with an overview of the requirements, qualitative criteria and thresholds for low noise asphalt mixture design for severe climate regions.

  15. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S (browser/server) software architecture was used, and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform the statistics and analysis for maize seedlings very well. The development of this software system explored a...

  16. Statistical aspects of quantitative real-time PCR experiment design

    Czech Academy of Sciences Publication Activity Database

    Kitchen, R.R.; Kubista, Mikael; Tichopád, Aleš

    2010-01-01

    Roč. 50, č. 4 (2010), s. 231-236 ISSN 1046-2023 R&D Projects: GA AV ČR IAA500520809 Institutional research plan: CEZ:AV0Z50520701 Keywords : Real-time PCR * Experiment design * Nested analysis of variance Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 4.527, year: 2010

  17. Applied statistical designs for the researcher

    CERN Document Server

    Paulson, Daryl S

    2003-01-01

    Research and Statistics Basic Review of Parametric Statistics Exploratory Data Analysis Two Sample Tests Completely Randomized One-Factor Analysis of Variance One and Two Restrictions on Randomization Completely Randomized Two-Factor Factorial Designs Two-Factor Factorial Completely Randomized Blocked Designs Useful Small Scale Pilot Designs Nested Statistical Designs Linear Regression Nonparametric Statistics Introduction to Research Synthesis and "Meta-Analysis" and Conclusory Remarks References Index.

  18. The Challenge of Peat Substitution in Organic Seedling Production: Optimization of Growing Media Formulation through Mixture Design and Response Surface Analysis.

    Directory of Open Access Journals (Sweden)

    Francesco Giovanni Ceglie

    Full Text Available Peat replacement is an increasing demand in containerized and transplant production, due to the environmental constraints associated with peat use. However, despite the extensive information concerning the use of alternative materials as substrates, it is very complex to establish the best materials and mixtures. This work evaluates the use of mixture design and response surface methodology in a peat substitution experiment using two alternative materials (green compost and palm fibre trunk waste) for transplant production of tomato (Lycopersicon esculentum Mill.), melon (Cucumis melo L.) and lettuce (Lactuca sativa L.) in organic farming conditions. In general, the substrates showed suitable properties for their use in seedling production, with the best plant response shown by the mixture of 20% green compost, 39% palm fibre and 31% peat. The mixture design and applied response surface methodology have been shown to be a useful approach for optimizing substrate formulations in peat substitution experiments to standardize plant responses.
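
    To illustrate the mixture-design and response-surface workflow the abstract describes, here is a minimal sketch that fits a quadratic Scheffé model to invented quality scores for compost/fibre/peat blends and grid-searches the simplex for the best predicted mixture. The design points and scores are hypothetical, not the paper's data.

```python
import numpy as np

# Fit a quadratic Scheffe model to hypothetical seedling-quality scores for
# ternary compost/fibre/peat blends, then search the simplex for the optimum.
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5],
              [1/3, 1/3, 1/3]])
y = np.array([55.0, 60.0, 70.0, 72.0, 68.0, 74.0, 80.0])  # assumed scores

def scheffe_quadratic(X):
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

beta, *_ = np.linalg.lstsq(scheffe_quadratic(X), y, rcond=None)

# Evaluate the fitted surface on a grid over the simplex x1 + x2 + x3 = 1.
pts = [(a, b, 1 - a - b)
       for a in np.linspace(0, 1, 101)
       for b in np.linspace(0, 1, 101) if a + b <= 1]
pts = np.array(pts)
pred = scheffe_quadratic(pts) @ beta
print("predicted best blend (compost, fibre, peat):", pts[pred.argmax()].round(2))
```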

  19. Gregor Mendel's Genetic Experiments: A Statistical Analysis after 150 Years

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2016-01-01

    Roč. 12, č. 2 (2016), s. 20-26 ISSN 1801-5603 Institutional support: RVO:67985807 Keywords : genetics * history of science * biostatistics * design of experiments Subject RIV: BB - Applied Statistics, Operational Research

  20. Frozen orientation disorder and rotation excitation in solid mixtures of methane and krypton (neutron diffraction experiments)

    International Nuclear Information System (INIS)

    Grondey, S.

    1986-09-01

    The effect of a statistical replacement of CH₄ molecules by Kr atoms on the rotational states in solid methane has been examined. Obviously the anisotropic molecular interaction (octopole-octopole interaction) is disturbed in a way analogous to magnetic systems with random internal fields. Inelastic neutron scattering experiments on solid mixtures (CH₄)₁₋ₓKrₓ with 0 ≤ x ≤ 0.35 have been carried out, and simple models have been developed to interpret the spectra. (orig./BHO)

  1. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  2. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  3. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  4. Modeling plant interspecific interactions from experiments with perennial crop mixtures to predict optimal combinations.

    Science.gov (United States)

    Halty, Virginia; Valdés, Matías; Tejera, Mauricio; Picasso, Valentín; Fort, Hugo

    2017-12-01

    The contribution of plant species richness to productivity and ecosystem functioning is a longstanding issue in ecology, with relevant implications for both conservation and agriculture. Both experiments and quantitative modeling are fundamental to the design of sustainable agroecosystems and the optimization of crop production. We modeled communities of perennial crop mixtures by using a generalized Lotka-Volterra model, i.e., a model in which the interspecific interactions are more general than purely competitive. We estimated the model parameters (carrying capacities and interaction coefficients) from, respectively, the observed biomass of monocultures and bicultures measured in a large diversity experiment of seven perennial forage species in Iowa, United States. The sign and absolute value of the interaction coefficients showed that the biological interactions between species pairs included amensalism, competition, and parasitism (asymmetric positive-negative interaction), with various degrees of intensity. We tested the model fit by simulating combinations of more than two species and comparing them with the polyculture experimental data. Overall, the theoretical predictions are in good agreement with the experiments. Using this model, we also simulated species combinations that were not sown. From all possible mixtures (sown and not sown) we identified the most productive species combinations. Our results demonstrate that a combination of experiments and modeling can contribute to the design of sustainable agricultural systems in general and to the optimization of crop production in particular. © 2017 by the Ecological Society of America.
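
    A minimal sketch of the estimation idea, under an assumed competition parameterization of the generalized Lotka-Volterra model (the paper's exact formulation may differ): monoculture biomass supplies the carrying capacities, and biculture equilibria supply the pairwise interaction coefficients. All numbers are hypothetical.

```python
import numpy as np

# Assumed parameterization: at a biculture equilibrium,
#   N_i* = K_i - a_ij * N_j*   =>   a_ij = (K_i - N_i*) / N_j*.
K = np.array([8.0, 6.0])        # hypothetical monoculture biomasses (Mg/ha)
N_star = np.array([5.5, 4.0])   # hypothetical biculture equilibrium biomasses

a12 = (K[0] - N_star[0]) / N_star[1]
a21 = (K[1] - N_star[1]) / N_star[0]
print(f"a12 = {a12:.2f}, a21 = {a21:.2f}")
# a_ij > 0 means species j suppresses species i (competition);
# a_ij < 0 means facilitation; opposite signs indicate parasitism-like pairs.
```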

  5. Evaluation of 1H NMR metabolic profiling using biofluid mixture design.

    Science.gov (United States)

    Athersuch, Toby J; Malik, Shahid; Weljie, Aalim; Newton, Jack; Keun, Hector C

    2013-07-16

    A strategy for evaluating the performance of quantitative spectral analysis tools in conditions that better approximate background variation in a metabonomics experiment is presented. Three different urine samples were mixed in known proportions according to a {3, 3} simplex lattice experimental design and analyzed in triplicate by 1D ¹H NMR spectroscopy. Fifty-four urinary metabolites were subsequently quantified from the sample spectra using two methods common in metabolic profiling studies: (1) targeted spectral fitting and (2) targeted spectral integration. Multivariate analysis using partial least-squares (PLS) regression showed that the latent structure of the spectral set recapitulated the experimental mixture design. The goodness-of-prediction statistic (Q²) of each metabolite variable in a PLS model was calculated as a metric for the reliability of measurement across the sample compositional space. Several metabolites were observed to have low Q² values, largely as a consequence of their spectral resonances having low signal-to-noise ratios or strong overlap with other sample components. This strategy has the potential to allow evaluation of spectral features obtained from metabolic profiling platforms in the context of the compositional background found in real biological sample sets, which may be subject to considerable variation. We suggest that it be incorporated into metabolic profiling studies to improve the estimation of matrix effects that confound accurate metabolite measurement. This novel method provides a rational basis for exploiting information from several samples in an efficient manner and avoids the use of multiple spike-in authentic standards, which may be difficult to obtain.
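
    For concreteness, the {3, 3} simplex-lattice design mentioned above can be enumerated directly; the sketch below lists every triple of proportions from {0, 1/3, 2/3, 1} that sums to one (10 design points).

```python
from fractions import Fraction
from itertools import product

# Enumerate the {3, 3} simplex-lattice design for blending three samples.
levels = [Fraction(i, 3) for i in range(4)]
design = [p for p in product(levels, repeat=3) if sum(p) == 1]
for point in design:
    print(tuple(str(v) for v in point))
print(f"{len(design)} mixtures")
```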

  6. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.

  7. Multiresponse optimisation on biodiesel obtained through a ternary mixture of vegetable oil and animal fat: Simplex-centroid mixture design application

    International Nuclear Information System (INIS)

    Orives, Juliane Resges; Galvan, Diego; Coppo, Rodolfo Lopes; Rodrigues, Cezar Henrique Furtoso; Angilelli, Karina Gomes; Borsato, Dionísio

    2014-01-01

    Highlights: • A mixture experimental design was used, which allowed evaluating various responses. • A predictive equation is presented that allows the behavior of the mixtures to be verified. • The results showed that the biodiesel obtained dispensed with the use of any additives. - Abstract: The quality of biodiesel is a determining factor in its commercialisation, and parameters such as the Cold Filter Plugging Point (CFPP) and Induction Period (IP) determine its operability in engines on cold days and its storage time, respectively. These factors are important in the characterisation of the final product. A B100 biodiesel formulation was developed using multiresponse optimisation, in which the CFPP and cost were minimised and the IP and yield were maximised. The experiments were carried out according to a simplex-centroid mixture design using soybean oil, beef tallow, and poultry fat. The optimum formulation consisted of 50% soybean oil, 20% beef tallow, and 30% poultry fat and had a CFPP value of 1.92 °C, raw material costs of US$ 903.87 ton⁻¹, an IP of 8.28 h, and a yield of 95.68%. Validation was performed in triplicate, and the t-test indicated that there was no difference between the estimated and experimental values for any of the dependent variables, thus indicating the efficiency of the joint optimisation in a biodiesel production process that met the criteria for CFPP and IP as well as high yield and low cost.
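
    A hedged sketch of the joint (multiresponse) optimisation idea using Derringer-Suich desirabilities, evaluated at the reported optimum blend; the acceptable limits for each response are assumptions for illustration, not values from the paper.

```python
import numpy as np

def d_min(y, lo, hi):          # desirability for a minimised response
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

def d_max(y, lo, hi):          # desirability for a maximised response
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

# Responses reported for the optimum 50/20/30 soy/tallow/poultry blend.
cfpp, cost, ip, yield_pct = 1.92, 903.87, 8.28, 95.68

D = (d_min(cfpp, -5.0, 5.0) *            # limits below are assumed
     d_min(cost, 850.0, 1000.0) *
     d_max(ip, 6.0, 10.0) *
     d_max(yield_pct, 90.0, 100.0)) ** 0.25   # geometric mean of 4 terms
print(f"overall desirability D = {D:.2f}")
```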

  8. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a

  9. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Directory of Open Access Journals (Sweden)

    Patrick Wessa

    Full Text Available BACKGROUND: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. OBJECTIVES: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. METHODS: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. RESULTS: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student

  10. Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under

  11. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

    Introductory Statistical Inference and Regression Analysis Elementary Statistical Inference Regression Analysis Experiments, the Completely Randomized Design (CRD)-Classical and Regression Approaches Experiments Experiments to Compare Treatments Some Basic Ideas Requirements of a Good Experiment One-Way Experimental Layout or the CRD: Design and Analysis Analysis of Experimental Data (Fixed Effects Model) Expected Values for the Sums of Squares The Analysis of Variance (ANOVA) Table Follow-Up Analysis to Check fo

  12. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins, which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, the focus is on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels, and methods for data preprocessing are covered.

  13. Designing experiments for maximum information from cyclic oxidation tests and their statistical analysis using half Normal plots

    International Nuclear Information System (INIS)

    Coleman, S.Y.; Nicholls, J.R.

    2006-01-01

    Cyclic oxidation testing at elevated temperatures requires careful experimental design and the adoption of standard procedures to ensure reliable data. This is a major aim of the 'COTEST' research programme. Further, as such tests are both time consuming and costly in terms of human effort when measurements are taken over a large number of cycles, it is important to gain maximum information from a minimum number of tests (trials). This search for standardisation of cyclic oxidation conditions led to a series of tests to determine the relative effects of cyclic parameters on the oxidation process. Following a review of the available literature, databases and the experience of partners to the COTEST project, the most influential parameters, upper dwell temperature (oxidation temperature) and time (hot time), lower dwell time (cold time) and environment, were investigated in partners' laboratories. It was decided to test upper dwell temperature at 3 levels, at and equidistant from a reference temperature; to test upper dwell time at a reference, a higher and a lower time; to test lower dwell time at a reference and a higher time; and to test wet and dry environments. Thus an experiment consisting of nine trials was designed according to statistical criteria. The results of the trials were analysed statistically to test the main linear and quadratic effects of upper dwell temperature and hot time and the main effects of lower dwell time (cold time) and environment. The nine trials are a quarter fraction of the 36 possible combinations of parameter levels that could have been studied. The results have been analysed by half Normal plots, as there are only 2 degrees of freedom for the experimental error variance, which is rather low for a standard analysis of variance. Half Normal plots give a visual indication of which factors are statistically significant. In this experiment each trial has 3 replications, and the data are analysed in terms of mean mass change, oxidation kinetics
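
    A small sketch of the half-normal plot technique the abstract describes, using invented effect values rather than COTEST results: absolute effects are plotted against half-normal quantiles, and points falling well above the line through the bulk flag statistically important factors.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Hypothetical effect estimates for the factors named in the abstract.
effects = {"T_upper (lin)": 4.1, "t_hot (lin)": 2.6, "t_cold": 0.4,
           "environment": 0.9, "T_upper (quad)": 0.3, "t_hot (quad)": 0.2}
names = sorted(effects, key=lambda k: abs(effects[k]))
abs_eff = np.array([abs(effects[k]) for k in names])

# Half-normal quantiles for the ranked |effects|.
n = len(abs_eff)
q = stats.halfnorm.ppf((np.arange(1, n + 1) - 0.5) / n)

plt.scatter(q, abs_eff)
for qi, ei, name in zip(q, abs_eff, names):
    plt.annotate(name, (qi, ei), textcoords="offset points", xytext=(4, 2))
plt.xlabel("half-normal quantile")
plt.ylabel("|effect|")
plt.title("Half-normal plot (illustrative)")
plt.show()
```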

  14. Environmental analytical chemistry: Design of experiments

    International Nuclear Information System (INIS)

    Sanchez Alonso, F.

    1990-01-01

    The design of experiments is needed whenever work on analytical research or development is performed, whether to explain a physical phenomenon through a mathematical model or to optimize any kind of process. It is therefore an unavoidable technique, since multidimensional approaches are more economical and reliable; an empirical one-variable-at-a-time approach is never as efficient and generally yields results of lower quality. 'Design of experiments' denotes a group of mathematical-statistical techniques that extract the maximum information about the problem at hand, so that the results obtained have the maximum quality. The modelling of a physical phenomenon, the basic concepts needed to design the experiments, and the analysis of results are studied in detail.

  15. Optimization of prebiotics in soybean milk using mixture experiments

    Directory of Open Access Journals (Sweden)

    Kongkarn Kijroongrojana

    2009-11-01

    Full Text Available A mixture experiment was used to optimize prebiotic mixtures in a soybean milk formulation. Inulin (I), galactooligosaccharides (GOS), and isomalto-oligosaccharides (IMO) were the prebiotic ingredients added (4% w/v) to soybean milk. Thirteen formulations of soybean milk were compared using general descriptive analysis and the growth of probiotics (Bifidobacterium bifidum DSM 20456, Lactobacillus plantarum TISTR 875, and Lactobacillus acidophilus TISTR 1034). There were no significant differences (p>0.05) in any sensory attribute (color, thickness, beany flavor, sweetness, viscosity, sweetness aftertaste) among the samples. Various mixtures of the prebiotics had only a slight effect on the soybean milk color and viscosity (p<0.05). The soybean milk supplemented with the optimized prebiotic mixture had higher (p<0.05) carbohydrate, total soluble solid and total solid contents, and viscosity than the control (without prebiotic). However, it had a lower L* value (lightness) and a higher a* value (redness) than the control (p<0.05).

  16. Two polynomial representations of experimental design

    OpenAIRE

    Notari, Roberto; Riccomagno, Eva; Rogantin, Maria-Piera

    2007-01-01

    In the context of algebraic statistics an experimental design is described by a set of polynomials called the design ideal. This, in turn, is generated by finite sets of polynomials. Two types of generating sets are mostly used in the literature: Groebner bases and indicator functions. We briefly describe them both, how they are used in the analysis and planning of a design and how to switch between them. Examples include fractions of full factorial designs and designs for mixture experiments.
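
    The two representations can be demonstrated on a toy fraction; the sketch below computes a Groebner basis of the design ideal of the half-fraction {(1, 1), (-1, -1)} of a 2² factorial with levels ±1, and checks the corresponding indicator function. This is an illustrative example, not taken from the paper.

```python
from sympy import groebner, symbols

x, y = symbols("x y")

# Design ideal: polynomials vanishing on the two design points.
G = groebner([x**2 - 1, y**2 - 1, x - y], x, y, order="lex")
print(G.exprs)   # [x - y, y**2 - 1] -- a Groebner basis of the design ideal

# Indicator function of the fraction within the full 2^2 design:
# F(x, y) = (1 + x*y) / 2 equals 1 on the fraction and 0 on the other points.
F = (1 + x * y) / 2
print([F.subs({x: a, y: b}) for a, b in [(1, 1), (-1, -1), (1, -1), (-1, 1)]])
```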

  17. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  18. Design and analysis of experiments

    CERN Document Server

    Dean, Angela; Draguljić, Danel

    2017-01-01

    This textbook takes a strategic approach to the broad-reaching subject of experimental design by identifying the objectives behind an experiment and teaching practical considerations that govern design and implementation, concepts that serve as the basis for the analytical techniques covered. Rather than a collection of miscellaneous approaches, chapters build on the planning, running, and analyzing of simple experiments in an approach that results from decades of teaching the subject. In most experiments, the procedures can be reproduced by readers, thus giving them a broad exposure to experiments that are simple enough to be followed through their entire course. Outlines of student and published experiments appear throughout the text and as exercises at the end of the chapters. The authors develop the theory of estimable functions and analysis of variance with detail, but at a mathematical level that is simultaneously approachable. Throughout the book, statistical aspects of analysis complement practical as...

  19. Liquids and liquid mixtures

    CERN Document Server

    Rowlinson, J S; Baldwin, J E; Buckingham, A D; Danishefsky, S

    2013-01-01

    Liquids and Liquid Mixtures, Third Edition explores the equilibrium properties of liquids and liquid mixtures and relates them to the properties of the constituent molecules using the methods of statistical thermodynamics. Topics covered include the critical state, fluid mixtures at high pressures, and the statistical thermodynamics of fluids and mixtures. This book consists of eight chapters and begins with an overview of the liquid state and the thermodynamic properties of liquids and liquid mixtures, including vapor pressure and heat capacities. The discussion then turns to the thermodynami

  20. Thermodynamic properties of fluid mixtures at high pressures and high temperatures. Application to high explosives and to phase diagrams of binary mixtures

    International Nuclear Information System (INIS)

    Pittion-Rossillon, Gerard

    1982-01-01

    The free energy of mixtures of about ten chemically reacting species is calculated. In order to obtain accurate results near the freezing line, excess properties are deduced from a modern statistical mechanics theory. Intermolecular potentials for like molecules are fitted to give good agreement with shock experiments in pure liquid samples, and the mixture properties then follow naturally from the theory. The stationary Chapman-Jouguet detonation wave is calculated with a chemical equilibrium computer code, and the results are in good agreement with experiment for a variety of explosives. Gas-gas equilibria in a binary mixture are then studied, showing the extreme sensitivity of theoretical phase diagrams to the hypotheses of the model. (author) [fr]

  1. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .1. DESIGN CONSTRUCTION AND THEORETICAL EVALUATION

    NARCIS (Netherlands)

    DUINEVELD, CAA; SMILDE, AK; DOORNBOS, DA

    The combination of process variables and mixture variables in experimental design is a problem which has not yet been solved. It is examined here whether a set of designs can be found which can be used for a series of models of reasonable complexity. The proposed designs are compared with known

  2. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .1. DESIGN CONSTRUCTION AND THEORETICAL EVALUATION

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The combination of process variables and mixture variables in experimental design is a problem which has not yet been solved. It is examined here whether a set of designs can be found which can be used for a series of models of reasonable complexity. The proposed designs are compared with known

  3. A semi-nonparametric mixture model for selecting functionally consistent proteins.

    Science.gov (United States)

    Yu, Lianbo; Doerge, Rw

    2010-09-28

    High-throughput technologies have led to a new era of proteomics. Although protein microarray experiments are becoming more commonplace, there are a variety of experimental and statistical issues that have yet to be addressed and that will carry over to new high-throughput technologies unless they are investigated. One of the largest of these challenges is the selection of functionally consistent proteins. We present a novel semi-nonparametric mixture model for classifying proteins as consistent or inconsistent while controlling the false discovery rate and the false non-discovery rate. The performance of the proposed approach is compared to current methods via simulation under a variety of experimental conditions. We provide a statistical method for selecting functionally consistent proteins in the context of protein microarray experiments, but the proposed semi-nonparametric mixture model method can certainly be generalized to solve other mixture data problems. The main advantage of this approach is that it provides the posterior probability of consistency for each protein.
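
    As a rough sketch of the classification idea, the following uses a two-component Gaussian mixture as a parametric stand-in for the paper's semi-nonparametric model: posterior probabilities of the "consistent" component drive an FDR-controlled selection. All data are simulated.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
scores = np.concatenate([rng.normal(0.0, 1.0, 800),    # inconsistent proteins
                         rng.normal(3.0, 1.0, 200)])   # consistent proteins

gm = GaussianMixture(n_components=2, random_state=0).fit(scores.reshape(-1, 1))
k = int(np.argmax(gm.means_.ravel()))                  # "consistent" component
post = gm.predict_proba(scores.reshape(-1, 1))[:, k]

# Select proteins, largest posterior first, while the estimated FDR stays <= 5%.
order = np.argsort(-post)
est_fdr = np.cumsum(1.0 - post[order]) / np.arange(1, scores.size + 1)
n_sel = int((est_fdr <= 0.05).sum())
print(f"{n_sel} proteins declared consistent at estimated FDR <= 5%")
```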

  4. Some challenges with statistical inference in adaptive designs.

    Science.gov (United States)

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have generated a great deal of attention in the clinical trial community. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of the sample size or related statistical information, and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  5. Undergraduate experiments on statistical optics

    International Nuclear Information System (INIS)

    Scholz, Ruediger; Friege, Gunnar; Weber, Kim-Alessandro

    2016-01-01

    Since the pioneering experiments of Forrester et al (1955 Phys. Rev. 99 1691) and Hanbury Brown and Twiss (1956 Nature 177 27; Nature 178 1046), along with the introduction of the laser in the 1960s, the systematic analysis of random fluctuations of optical fields has developed to become an indispensable part of physical optics for gaining insight into features of the fields. In 1985 Joseph W Goodman prefaced his textbook on statistical optics with a strong commitment to the ‘tools of probability and statistics’ (Goodman 2000 Statistical Optics (New York: John Wiley and Sons Inc.)) in the education of advanced optics. Since then a wide range of novel undergraduate optical counting experiments and corresponding pedagogical approaches have been introduced to underpin the rapid growth of the interest in coherence and photon statistics. We propose low cost experimental steps that are a fair way off ‘real’ quantum optics, but that give deep insight into random optical fluctuation phenomena: (1) the introduction of statistical methods into undergraduate university optical lab work, and (2) the connection between the photoelectrical signal and the characteristics of the light source. We describe three experiments and theoretical approaches which may be used to pave the way for a well balanced growth of knowledge, providing students with an opportunity to enhance their abilities to adapt the ‘tools of probability and statistics’. (paper)
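
    A low-cost version of the counting-statistics comparison such experiments aim at can be simulated directly; the sketch below contrasts Poisson (coherent light) with Bose-Einstein (single-mode thermal light) photocount statistics via their Fano factors.

```python
import numpy as np

# Coherent light gives Poisson counts (variance = mean); single-mode thermal
# light gives Bose-Einstein counts (variance = mean + mean^2).
rng = np.random.default_rng(42)
mean_n = 4.0

coherent = rng.poisson(mean_n, 100_000)
# Bose-Einstein counts follow a geometric law on {0, 1, 2, ...}; numpy's
# geometric distribution starts at 1, so subtract 1.
thermal = rng.geometric(1.0 / (1.0 + mean_n), 100_000) - 1

for name, counts in (("coherent", coherent), ("thermal", thermal)):
    m, v = counts.mean(), counts.var()
    print(f"{name}: mean = {m:.2f}, variance = {v:.2f}, Fano factor = {v/m:.2f}")
```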

  6. Evaluation of partially premixed turbulent flame stability from mixture fraction statistics in a slot burner

    KAUST Repository

    Kruse, Stephan

    2018-04-11

    Partially premixed combustion is characterized by mixture fraction inhomogeneity upstream of the reaction zone and occurs in many applied combustion systems. The temporal and spatial fluctuations of the mixture fraction have tremendous impact on the combustion characteristics, emission formation, and flame stability. In this study, turbulent partially premixed flames are experimentally studied in a slot burner configuration. The local temperature and gas composition is determined by means of one-dimensional, simultaneous detection of Rayleigh and Raman scattering. The statistics of the mixture fraction are utilized to characterize the impact of the Reynolds number, the global equivalence ratio, the progress of mixing within the flame, as well as the mixing length on the mixing field. Furthermore, these effects are evaluated by means of a regime diagram for partially premixed flames. In this study, it is shown that the increase of the mixing length results in a significantly more stable flame. The impact of the Reynolds number on flame stability is found to be minor.

  7. Evaluation of partially premixed turbulent flame stability from mixture fraction statistics in a slot burner

    KAUST Repository

    Kruse, Stephan; Mansour, Mohy S.; Elbaz, Ayman M.; Varea, Emilien; Grünefeld, Gerd; Beeckmann, Joachim; Pitsch, Heinz

    2018-01-01

    Partially premixed combustion is characterized by mixture fraction inhomogeneity upstream of the reaction zone and occurs in many applied combustion systems. The temporal and spatial fluctuations of the mixture fraction have tremendous impact on the combustion characteristics, emission formation, and flame stability. In this study, turbulent partially premixed flames are experimentally studied in a slot burner configuration. The local temperature and gas composition is determined by means of one-dimensional, simultaneous detection of Rayleigh and Raman scattering. The statistics of the mixture fraction are utilized to characterize the impact of the Reynolds number, the global equivalence ratio, the progress of mixing within the flame, as well as the mixing length on the mixing field. Furthermore, these effects are evaluated by means of a regime diagram for partially premixed flames. In this study, it is shown that the increase of the mixing length results in a significantly more stable flame. The impact of the Reynolds number on flame stability is found to be minor.

  8. Augmenting Scheffe Linear Mixture Models With Squared and/or Crossproduct Terms

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Szychowski, Jeffrey M.; Loeppky, Jason L.

    2001-01-01

    A glass composition variation study (CVS) for high-level waste (HLW) stored at the Idaho National Engineering and Environmental Laboratory (INEEL) is being statistically designed and performed in phases over several years. The purpose of the CVS is to investigate and model how HLW-glass properties depend on glass composition within a glass composition region compatible with the expected range of INEEL HLW. The resulting glass property-composition models will be used to develop desirable glass formulations and for other purposes. Phases 1 and 2 of the CVS have been completed so far, and are briefly described. The main focus of this paper is the CVS Phase 3 experimental design (test matrix). The Phase 3 experimental design was chosen to augment the Phase 1 and 2 data with additional data points, as well as to account for additional glass components of interest not studied in Phases 1 and/or 2. The paper describes how these Phase 3 design augmentation challenges were addressed using the previous data, preliminary property-composition models, and statistical mixture experiment and optimal experimental design methods and software. The resulting Phase 3 experimental design of 30 simulated HLW glasses is presented and discussed.
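
    A minimal sketch of what the augmented model matrix looks like, with hypothetical glass compositions and an arbitrary choice of added terms (the paper's component set and selected terms differ): Scheffé linear terms for every component, no intercept, plus selected squared and crossproduct columns.

```python
import numpy as np

X = np.array([[0.50, 0.30, 0.20],
              [0.40, 0.40, 0.20],
              [0.45, 0.25, 0.30],
              [0.55, 0.20, 0.25],
              [0.35, 0.35, 0.30]])   # rows sum to 1 (component fractions)

squared = [0]           # add x1^2 only (assumed to be supported by the data)
crossprods = [(1, 2)]   # add the x2*x3 crossproduct

cols = [X]                                        # Scheffe linear terms
cols += [X[:, [i]] ** 2 for i in squared]
cols += [X[:, [i]] * X[:, [j]] for i, j in crossprods]
M = np.hstack(cols)
print(M.shape)          # (5 glasses, 3 linear + 1 squared + 1 crossproduct)
```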

  9. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    Science.gov (United States)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  10. Statistical method for resolving the photon-photoelectron-counting inversion problem

    International Nuclear Information System (INIS)

    Wu Jinlong; Li Tiejun; Peng, Xiang; Guo Hong

    2011-01-01

    A statistical inversion method is proposed for the photon-photoelectron-counting statistics in a quantum key distribution experiment. From the statistical viewpoint, this problem is equivalent to parameter estimation for an infinite binomial mixture model. The coarse-graining idea and Bayesian methods are applied to deal with this ill-posed problem, which is a good, simple example of the successful application of statistical methods to an inverse problem. Numerical results show the applicability of the proposed strategy. The coarse-graining idea for infinite mixture models should be general enough to be used in the future.

  11. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    Science.gov (United States)

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using the quality risk management (QRM) tool failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described in the design space using design of experiments (one-factor response surface method). The results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP on MDDC particle size and particle size distribution, and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Supercritical Water Mixture (SCWM) Experiment in the High Temperature Insert-Reflight (HTI-R)

    Science.gov (United States)

    Hicks, Michael C.; Hegde, Uday G.; Garrabos, Yves; Lecoutre, Carole; Zappoli, Bernard

    2013-01-01

    Current research on supercritical water processes on board the International Space Station (ISS) focuses on salt precipitation and transport in a test cell designed for supercritical water. This study, known as the Supercritical Water Mixture Experiment (SCWM), serves as a precursor experiment for developing a better understanding of inorganic salt precipitation and transport during supercritical water oxidation (SCWO) processes, for the eventual application of this technology to waste management and resource reclamation in microgravity conditions. During typical SCWO reactions, any inorganic salts present in the reactant stream will precipitate and begin to coat reactor surfaces and control mechanisms (e.g., valves), often severely impacting the system's performance. The SCWM experiment employs a Sample Cell Unit (SCU) filled with an aqueous solution of Na₂SO₄ 0.5-w at the critical density and uses a refurbished High Temperature Insert, which was used in an earlier ISS experiment designed to study pure water at near-critical conditions. The insert, designated as the HTI-Reflight (HTI-R), will be deployed in the DECLIC (Device for the Study of Critical Liquids and Crystallization) Facility on the International Space Station (ISS). Objectives of the study include measurement of the shift in critical temperature due to the presence of the inorganic salt, assessment of the predominant mode of precipitation (i.e., heterogeneously on SCU surfaces or homogeneously in the bulk fluid), determination of the salt morphology including the sizes and shapes of particulate clusters, and determination of the dominant mode of transport of salt particles in the presence of an imposed temperature gradient. Initial results from the ISS experiments will be presented and compared to findings from laboratory experiments on the ground.

  13. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .2. DESIGN EVALUATION ON MEASURED DATA

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The construction of a small experimental design for a combination of process and mixture variables is a problem which has not yet been completely solved. In a previous paper we evaluated some designs with theoretical measures. This second paper evaluates the capabilities of the best of these

  14. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .2. DESIGN EVALUATION ON MEASURED DATA

    NARCIS (Netherlands)

    DUINEVELD, CAA; SMILDE, AK; DOORNBOS, DA

    The construction of a small experimental design for a combination of process and mixture variables is a problem which has not yet been completely solved. In a previous paper we evaluated some designs with theoretical measures. This second paper evaluates the capabilities of the best of these

  15. Optimization of soy isoflavone extraction with different solvents using the simplex-centroid mixture design.

    Science.gov (United States)

    Yoshiara, Luciane Yuri; Madeira, Tiago Bervelieri; Delaroza, Fernanda; da Silva, Josemeyre Bonifácio; Ida, Elza Iouko

    2012-12-01

    The objective of this study was to optimize the extraction of different isoflavone forms (glycosidic, malonyl-glycosidic, aglycone and total) from defatted cotyledon soy flour using a simplex-centroid experimental design with four solvents of varying polarity (water, acetone, ethanol and acetonitrile). The obtained extracts were then analysed by high-performance liquid chromatography. The profile of the different soy isoflavone forms varied with the different extraction solvents. By varying the solvent or mixture used, the extraction of the different isoflavones was optimized using the simplex-centroid mixture design. The special cubic model fitted best for the four solvents and their combinations for soy isoflavone extraction. For glycosidic isoflavones, the polar ternary mixture (water, acetone and acetonitrile) achieved the best extraction; malonyl-glycosidic forms were better extracted with mixtures of water, acetone and ethanol; aglycone isoflavones were best extracted with a water and acetone mixture; and for total isoflavones, the best solvent was a ternary mixture of water, acetone and ethanol.

  16. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    Directory of Open Access Journals (Sweden)

    Oberg Ann L

    2012-11-01

    Full Text Available Mass spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.
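
    As a sketch of the normalization assessment step described above (not the case-study code), the following simulates log2 reporter-ion intensities with built-in channel biases and checks the channel medians before and after median centering.

```python
import numpy as np

rng = np.random.default_rng(7)
bias = np.array([0.0, 0.4, -0.3, 0.1])               # assumed channel biases
log2_int = rng.normal(14.0, 1.0, size=(500, 4)) + bias  # 500 proteins x 4 channels

print("medians before:", np.median(log2_int, axis=0).round(2))
normalized = log2_int - np.median(log2_int, axis=0)  # per-channel centering
print("medians after: ", np.median(normalized, axis=0).round(2))
```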

  17. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    Science.gov (United States)

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.

  18. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    Science.gov (United States)

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  19. Optimal Design of Shock Tube Experiments for Parameter Inference

    KAUST Repository

    Bisetti, Fabrizio

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial-based surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP-based approach is used to estimate the expected information gain of the proposed experiments and to select the best experimental set-ups, i.e. those yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and for extending the experimental design methodology to cases where the control parameters are noisy.
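
    For intuition, here is a nested Monte Carlo sketch of the expected-information-gain criterion on a toy Arrhenius-style observable. Note the paper uses a MAP-based estimator on a polynomial chaos surrogate; the prior, noise level, and observable below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.01                                   # assumed observation noise

def eig(T, n_outer=500, n_inner=500):
    # EIG(T) = E_{E, y} [ log p(y | E, T) - log p(y | T) ].
    E = rng.normal(1.0, 0.2, n_outer)          # prior draws of activation energy
    y = np.exp(-E / T) + rng.normal(0.0, sigma, n_outer)
    E_in = rng.normal(1.0, 0.2, n_inner)
    # Log-likelihood of each simulated y under fresh prior draws.
    ll = -0.5 * ((y[:, None] - np.exp(-E_in[None, :] / T)) / sigma) ** 2
    log_evidence = np.logaddexp.reduce(ll, axis=1) - np.log(n_inner)
    log_like = -0.5 * ((y - np.exp(-E / T)) / sigma) ** 2
    return float(np.mean(log_like - log_evidence))   # Gaussian constants cancel

for T in (0.5, 1.0, 2.0):                      # candidate temperatures
    print(f"T = {T}: EIG ~ {eig(T):.2f} nats")
```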

  20. Experimental design in chemistry: A tutorial.

    Science.gov (United States)

    Leardi, Riccardo

    2009-10-12

    In this tutorial the main concepts and applications of experimental design in chemistry will be explained. Unfortunately, nowadays experimental design is not as known and applied as it should be, and many papers can be found in which the "optimization" of a procedure is performed one variable at a time. The goal of this paper is to show the real advantages in terms of reduced experimental effort and of increased quality of information that can be obtained if this approach is followed. To do that, three real examples will be shown. Rather than on the mathematical aspects, this paper will focus on the mental attitude required by experimental design. Readers interested in deepening their knowledge of the mathematical and algorithmic aspects can find very good books and tutorials in the references [G.E.P. Box, W.G. Hunter, J.S. Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, John Wiley & Sons, New York, 1978; R. Brereton, Chemometrics: Data Analysis for the Laboratory and Chemical Plant, John Wiley & Sons, New York, 1978; R. Carlson, J.E. Carlson, Design and Optimization in Organic Synthesis: Second Revised and Enlarged Edition, in: Data Handling in Science and Technology, vol. 24, Elsevier, Amsterdam, 2005; J.A. Cornell, Experiments with Mixtures: Designs, Models and the Analysis of Mixture Data, in: Series in Probability and Statistics, John Wiley & Sons, New York, 1991; R.E. Bruns, I.S. Scarminio, B. de Barros Neto, Statistical Design-Chemometrics, in: Data Handling in Science and Technology, vol. 25, Elsevier, Amsterdam, 2006; D.C. Montgomery, Design and Analysis of Experiments, 7th edition, John Wiley & Sons, Inc., 2009; T. Lundstedt, E. Seifert, L. Abramo, B. Thelin, A. Nyström, J. Pettersen, R. Bergman, Chemolab 42 (1998) 3; Y. Vander Heyden, LC-GC Europe 19 (9) (2006) 469].

  1. Phase equilibria for mixtures containing nonionic surfactant systems: Modeling and experiments

    International Nuclear Information System (INIS)

    Shin, Moon Sam; Kim, Hwayong

    2008-01-01

    Surfactants are important materials with numerous applications in the cosmetic, pharmaceutical, and food industries due to their inter- and intra-associating bonds. We present a lattice fluid equation-of-state that combines the quasi-chemical nonrandom lattice fluid model with Veytsman statistics for (intra + inter) molecular association to calculate the phase behavior of mixtures containing nonionic surfactants. We also measured binary (vapor + liquid) equilibrium data for {2-butoxyethanol (C₄E₁) + n-hexane} and {2-butoxyethanol (C₄E₁) + n-heptane} systems at temperatures ranging from (303.15 to 323.15) K. A static apparatus was used in this study. The presented equation-of-state correlated well with the measured and published data for mixtures containing nonionic surfactant systems.

  2. Quality improvement of melt extruded laminar systems using mixture design.

    Science.gov (United States)

    Hasa, D; Perissutti, B; Campisi, B; Grassi, M; Grabnar, I; Golob, S; Mian, M; Voinovich, D

    2015-07-30

    This study investigates the application of melt extrusion for the development of an oral retard formulation with precise drug release over time. Since adjusting the formulation appears to be of the utmost importance in achieving the desired drug release patterns, different formulations of laminar extrudates were prepared according to the principles of experimental design, using a design for mixtures to assess the influence of formulation composition on the in vitro drug release from the extrudates after 1 h and after 8 h. The effect of each component on the two response variables was also studied. Ternary mixtures of theophylline (model drug), monohydrate lactose and microcrystalline wax (as thermoplastic binder) were extruded in a lab-scale vertical ram extruder in the absence of solvents, at a temperature below the melting point of the binder (so that the crystalline state of the drug could be maintained), through a rectangular die to obtain suitable laminar systems. Thanks to the desirability approach and a reliability study ensuring the quality of the formulation, a very restricted optimal zone was defined within the experimental domain. Among the mixture components, variation of the microcrystalline wax content had the most significant overall influence on the in vitro drug release. The formulation theophylline:lactose:wax, 57:14:29 (by weight), selected based on the desirability zone, was subsequently used for in vivo studies. The plasma profile, obtained after oral administration of the laminar extruded system in hard gelatine capsules, revealed the typical trend of an oral retard formulation. The application of the mixture experimental design associated with a desirability function made it possible to optimize the extruded system and to determine the composition space that ensures final product quality. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Velocity limitations in coaxial plasma gun experiments with gas mixtures

    International Nuclear Information System (INIS)

    Axnaes, I.

    1976-04-01

    The velocity limitations found in many crossed-field plasma experiments with neutral gas present are studied for binary mixtures of H2, He, N2, O2, Ne and Ar. The apparatus used is a coaxial plasma gun with an azimuthal magnetic bias field. The discharge parameters are chosen so that the plasma is weakly ionized. In some of the mixtures it is found that one of the components tends to dominate, in the sense that only a small amount (by volume) of that component is needed for the discharge to adopt a limiting velocity close to that of the pure component. Thus, in a mixture of a heavy and a light component having nearly equal ionization potentials, the heavy component dominates. Also, if there is a considerable difference in ionization potential between the components, the component with the lowest ionization potential tends to dominate. (author)

  4. Yield and competition in barley variety mixtures

    Directory of Open Access Journals (Sweden)

    Kari Jokinen

    1991-09-01

    Competition between spring barley varieties and the yield performance of two-, three- and four-variety mixtures were studied in two replacement series field experiments. In the first experiment, repeated in three successive years (1983–85), the components were the six-row varieties Agneta, Arra, Hja-673 and Pomo. In the second experiment (1984), including two nitrogen doses (50 and 100 kg N/ha), both six-row (Agneta, Pomo) and two-row (Ida, Kustaa) varieties were used. Arra in the first and Agneta in the second experiment were the most competitive varieties. The results suggested that the fast initial growth of Arra promoted its competitive ability. An increase in available nitrogen usually strengthened the competitiveness of Agneta. The observed competitive differences between varieties were related neither to the earliness of a variety, nor to its morphological characters (two- and six-row varieties), nor to the grain yield of the variety grown alone. Competitive ability was not always a stable character, the dominant-suppression relationship varying from one environment to another (e.g. growing season, nitrogen dose). The observed overyielding was not statistically significant. The ratio of actual to expected yield and the relative yield total of several mixtures slightly exceeded one. In conclusion, the yield advantage of mixtures was marginal. As a rule, the mixtures were not more stable than monocultures as determined by the coefficient of variation. However, the yield of some mixtures varied less than the yield of the most stable monoculture.

  5. Design and implementation of new design of numerical experiments for nonlinear models

    International Nuclear Information System (INIS)

    Gazut, St.

    2007-03-01

    This thesis addresses the problem of the construction of surrogate models in numerical simulation. Whenever numerical experiments are costly, the simulation model is complex and difficult to use. It is then important to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation, where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and re-sampling techniques. The surrogate models are constructed using neural networks, and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control over-fitting and extend the concept of leverage to surrogate models that are nonlinear in their parameters. The thesis describes an iterative method called LDR (Learner Disagreement from experiment Re-sampling), based on active learning using several surrogate models constructed on bootstrap samples. The method consists in adding new experiments where the predictors constructed from bootstrap samples disagree most. We compare the LDR method with other methods of experimental design such as D-optimal selection. (author)
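
    The LDR idea, adding the next experiment where predictors trained on bootstrap resamples disagree most, can be sketched in a few lines. Here a cubic polynomial stands in for the neural-network surrogates and the "simulator" is a toy function; everything below is illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      f = lambda x: np.sin(3 * x) + 0.5 * x      # stand-in for a costly simulator
      X = rng.uniform(-2, 2, size=8)             # small initial design
      y = f(X)

      # Fit B surrogates on bootstrap resamples of the design.
      B = 30
      models = []
      for _ in range(B):
          idx = rng.integers(0, len(X), size=len(X))
          models.append(np.polyfit(X[idx], y[idx], deg=3))

      # Next experiment: the candidate where the bootstrap predictors disagree most.
      cand = np.linspace(-2, 2, 201)
      preds = np.array([np.polyval(m, cand) for m in models])
      x_next = cand[preds.std(axis=0).argmax()]
      print(f"next experiment at x = {x_next:.3f}")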

  6. Vertical sorting in bed forms - flume experiments with a natural and a tri-modal sediment mixture

    NARCIS (Netherlands)

    Blom, Astrid; Ribberink, Jan S.; de Vriend, Huib J.

    2003-01-01

    Two sets of flume experiments were conducted to examine grain size selective transport and vertical sorting in conditions with migrating bed forms and bed load transport. In the two sets of experiments we used a sediment mixture from the river Rhine and a trimodal mixture, respectively. The vertical

  7. Using particle packing technology for sustainable concrete mixture design

    NARCIS (Netherlands)

    Fennis, S.A.A.M.; Walraven, J.C.

    2012-01-01

    The annual production of Portland cement, estimated at 3.4 billion tons in 2011, is responsible for about 7% of the total worldwide CO2-emission. To reduce this environmental impact it is important to use innovative technologies for the design of concrete structures and mixtures. In this paper, it

  8. Statistical methods in the mechanical design of fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Radsak, C.; Streit, D.; Muench, C.J. [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    The mechanical design of a fuel assembly is still mainly performed in a deterministic way. This conservative approach is however not suitable to provide a realistic quantification of the design margins with respect to licensing criteria for more and more demanding operating conditions (power upgrades, burnup increase, ...). This quantification can be provided by statistical methods utilizing all available information (e.g. from manufacturing, experience feedback etc.) on the topic under consideration. During optimization, e.g. of the holddown system, certain objectives in the mechanical design of a fuel assembly (FA) can contradict each other, such as holddown forces sufficient to prevent fuel assembly lift-off versus reduced holddown forces that minimize axial loads on the fuel assembly structure so as not to affect control rod movement. By using a statistical method, the fuel assembly design can be optimized much better with respect to these objectives than would be possible with a deterministic approach. This leads to a more realistic assessment and a safer way of operating fuel assemblies. Statistical models are defined on the one hand by the quantile that has to be maintained concerning the design limit requirements (e.g. a one-FA quantile) and on the other hand by the confidence level which has to be met. Using the above example of the holddown force, a feasible quantile can be defined based on the requirement that less than one fuel assembly (quantile > 192/193 = 99.5 %) in the core violates the holddown force limit with a confidence of 95 %. (orig.)
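
    Under a normality assumption (made here only for illustration; the paper does not necessarily make it), a "99.5 % quantile with 95 % confidence" requirement corresponds to a one-sided tolerance limit, computable via the noncentral t distribution. The sample below is simulated, not vendor data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      sample = rng.normal(loc=100.0, scale=8.0, size=30)  # e.g. holddown margins
      n, mean, s = len(sample), sample.mean(), sample.std(ddof=1)

      p, conf = 0.995, 0.95                 # 192/193 quantile, 95 % confidence
      z_p = stats.norm.ppf(p)
      # One-sided normal tolerance factor via the noncentral t distribution.
      k = stats.nct.ppf(conf, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
      print(f"upper tolerance bound: {mean + k * s:.1f}")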

  9. A Variational Statistical-Field Theory for Polar Liquid Mixtures

    Science.gov (United States)

    Zhuang, Bilin; Wang, Zhen-Gang

    Using a variational field-theoretic approach, we derive a molecularly-based theory for polar liquid mixtures. The resulting theory consists of simple algebraic expressions for the free energy of mixing and the dielectric constant as functions of mixture composition. Using only the dielectric constants and the molar volumes of the pure liquid constituents, the theory evaluates the mixture dielectric constants in good agreement with the experimental values for a wide range of liquid mixtures, without using adjustable parameters. In addition, the theory predicts that liquids with similar dielectric constants and molar volumes dissolve well in each other, while sufficient disparity in these parameters results in phase separation. The calculated miscibility map on the dielectric constant-molar volume axes agrees well with known experimental observations for a large number of liquid pairs. Thus the theory provides a quantification of the well-known empirical "like-dissolves-like" rule.

  10. Identifying overrepresented concepts in gene lists from literature: a statistical approach based on Poisson mixture model

    Directory of Open Access Journals (Sweden)

    Zhai Chengxiang

    2010-05-01

    Background: Large-scale genomic studies often identify large gene lists, for example, the genes sharing the same expression patterns. The interpretation of these gene lists is generally achieved by extracting concepts overrepresented in the gene lists. This analysis often depends on manual annotation of genes based on controlled vocabularies, in particular, Gene Ontology (GO). However, the annotation of genes is a labor-intensive process, and the vocabularies are generally incomplete, leaving some important biological domains inadequately covered. Results: We propose a statistical method that uses the primary literature, i.e. free-text, as the source to perform overrepresentation analysis. The method is based on a statistical framework of mixture model and addresses the methodological flaws in several existing programs. We implemented this method within a literature mining system, BeeSpace, taking advantage of its analysis environment, and added features that facilitate the interactive analysis of gene sets. Through experimentation with several datasets, we showed that our program can effectively summarize the important conceptual themes of large gene sets, even when traditional GO-based analysis does not yield informative results. Conclusions: We conclude that the current work will provide biologists with a tool that effectively complements the existing ones for overrepresentation analysis from genomic experiments. Our program, Genelist Analyzer, is freely available at: http://workerbee.igb.uiuc.edu:8080/BeeSpace/Search.jsp
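
    The core of such an analysis is a mixture model on concept counts; a two-component Poisson mixture (background vs. overrepresented concepts) can be fitted with a few lines of EM. The counts below are invented and the initialization is naive; this is a sketch of the idea, not the BeeSpace implementation.

      import numpy as np

      def em_poisson_mixture(counts, n_iter=200):
          """EM for a two-component Poisson mixture."""
          counts = np.asarray(counts, dtype=float)
          lam = np.array([counts.mean() * 0.5, counts.mean() * 2.0])  # rates
          pi = np.array([0.5, 0.5])                                   # weights
          for _ in range(n_iter):
              # E-step: responsibilities (the log k! term cancels between components).
              log_p = counts[:, None] * np.log(lam) - lam + np.log(pi)
              log_p -= log_p.max(axis=1, keepdims=True)
              r = np.exp(log_p)
              r /= r.sum(axis=1, keepdims=True)
              # M-step: update weights and rates.
              pi = r.mean(axis=0)
              lam = (r * counts[:, None]).sum(axis=0) / r.sum(axis=0)
          return pi, lam

      # Hypothetical occurrence counts of one concept across documents.
      pi, lam = em_poisson_mixture([0, 1, 0, 2, 1, 0, 9, 11, 0, 1, 13, 0, 2, 10])
      print("weights:", pi.round(3), "rates:", lam.round(2))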

  11. Geographical and Statistical Analysis on the Relationship between Land-Use Mixture and Home-Based Trip Making and More: Case of Richmond, Virginia

    Directory of Open Access Journals (Sweden)

    Yin-Shan Ma

    2013-06-01

    Richmond, Virginia has implemented numerous mixed land-use policies to encourage non-private-vehicle commuting for decades, based on the best practices of other cities and the assumption that land-use mixture would positively lead to trip reduction. This paper uses both Geographical Information Systems (GIS) and statistical tools to empirically test this hypothesis. With local land use and trip making data as inputs, it first calculates two common indices of land-use mixture, the entropy and dissimilarity indices, using a GIS tool supplemented by Microsoft Excel. Afterwards, it uses the Statistical Package for the Social Sciences (SPSS) to calculate the correlation matrices among land-use mixture indices, socioeconomic variables, and home-based work/other trip rates, followed by a series of regression model runs on these variables. Through this study, it has been found that land-use mixture has some, but weak, effects on the home-based work trip rate, and virtually no effect on the home-based other trip rate. In contrast, socioeconomic variables, especially auto ownership, have larger effects on home-based trip making.
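
    The entropy index used in this record is E = -Σ pⱼ ln pⱼ / ln J, where pⱼ is the proportion of land-use type j and J is the number of types; it runs from 0 (single use) to 1 (perfectly even mix). A minimal sketch with invented acreages:

      import numpy as np

      def entropy_index(areas):
          """Land-use entropy index: 0 = single use, 1 = even mix of all types."""
          p = np.asarray(areas, dtype=float)
          p = p[p > 0] / p.sum()
          return float(-(p * np.log(p)).sum() / np.log(len(areas)))

      # Hypothetical acreage in one zone: residential, commercial, office, institutional.
      print(entropy_index([120.0, 40.0, 25.0, 15.0]))  # about 0.78, moderately mixed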

  12. Intermediate/Advanced Research Design and Statistics

    Science.gov (United States)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs.

  13. A Simple Refraction Experiment for Probing Diffusion in Ternary Mixtures

    Science.gov (United States)

    Coutinho, Cecil A.; Mankidy, Bijith D.; Gupta, Vinay K.

    2010-01-01

    Diffusion is a fundamental phenomenon that is vital in many chemical processes such as mass transport in living cells, corrosion, and separations. We describe a simple undergraduate-level experiment based on Wiener's method to probe diffusion in a ternary aqueous mixture of small molecular-weight molecules. As an illustration, the experiment…

  14. Toxicity of a binary mixture on Daphnia magna: biological effects of uranium and selenium isolated and in mixture

    International Nuclear Information System (INIS)

    Zeman, F.

    2008-10-01

    Among the multiple substances that affect freshwater ecosystems, uranium and selenium are two pollutants found worldwide in the environment, alone and in mixture. The aim of this thesis work was to investigate the effect of a uranium and selenium mixture on daphnia (Daphnia magna). Studying the effects of a mixture requires the assessment of the effect of the single substances. Thus, the first experiments were performed on single substances. Acute toxicity data were obtained: EC50 (48 h) = 0.39 ± 0.04 mg/L for uranium and EC50 (48 h) = 1.86 ± 0.85 mg/L for selenium. Chronic effects were also studied. Data on fecundity showed an EC10 (reproduction) of 14 ± 7 µg/L for uranium and of 215 ± 25 µg/L for selenium. Uranium-selenium mixture toxicity experiments were performed and revealed an antagonistic effect. This study further demonstrates the importance of taking into consideration different elements in binary mixture studies, such as the choice of reference models (concentration addition or independent action), statistical method, exposure time and endpoints. Using integrated parameters like the energy budget was shown to be an interesting way to better understand interactions. An approach including calculation of chemical speciation in the medium and bioaccumulation measurements in the organism permits assumptions to be made on the nature of possible interactions between mixture components (toxico-dynamic and toxico-kinetic interactions). (author)
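
    The two reference models named in this record have simple closed forms: concentration addition predicts 1/EC50_mix = sum_i p_i/EC50_i for mixture fractions p_i, while independent action combines effect fractions as E_mix = 1 - prod_i (1 - E_i). The sketch below plugs in the 48 h EC50 values quoted above; the 50:50 ratio and single-substance effect levels are illustrative.

      import numpy as np

      def ec50_concentration_addition(fractions, ec50s):
          """Mixture EC50 predicted by concentration addition (Loewe additivity)."""
          fractions, ec50s = np.asarray(fractions, float), np.asarray(ec50s, float)
          return 1.0 / (fractions / ec50s).sum()

      def effect_independent_action(effects):
          """Combined effect fraction predicted by independent action."""
          return 1.0 - np.prod(1.0 - np.asarray(effects, float))

      # 48 h EC50s from the record: U = 0.39 mg/L, Se = 1.86 mg/L.
      print(ec50_concentration_addition([0.5, 0.5], [0.39, 1.86]))  # ~0.64 mg/L
      print(effect_independent_action([0.2, 0.3]))                  # 0.44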

  15. Assessment of the recycling potential of fresh concrete waste using a factorial design of experiments.

    Science.gov (United States)

    Correia, S L; Souza, F L; Dienstmann, G; Segadães, A M

    2009-11-01

    Recycling of industrial wastes and by-products can help reduce the cost of waste treatment prior to disposal and eventually preserve natural resources and energy. To assess the recycling potential of a given waste, it is important to select a tool capable of giving clear indications either way, with the least time and work consumption, as is the case of modelling the system properties using the results obtained from statistical design of experiments. In this work, the aggregate reclaimed from the mud that results from washout and cleaning operations of fresh concrete mixer trucks (fresh concrete waste, FCW) was recycled into new concrete with various water/cement ratios, as replacement of natural fine aggregates. A 3² factorial design of experiments was used to model fresh concrete consistency index and hardened concrete water absorption and 7- and 28-day compressive strength, as functions of FCW content and water/cement ratio, and the resulting regression equations and contour plots were validated with confirmation experiments. The results showed that the fresh concrete workability worsened with the increase in FCW content but the water absorption (5-10 wt.%), 7-day compressive strength (26-36 MPa) and 28-day compressive strength (32-44 MPa) remained within the specified ranges, thus demonstrating that the aggregate reclaimed from FCW can be recycled into new concrete mixtures with lower natural aggregate content.
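
    A 3² factorial design and the kind of polynomial regression model described here can be reproduced schematically as follows; the strength values are invented, not the paper's measurements.

      import itertools
      import numpy as np

      # Coded levels of the two factors: FCW content and water/cement ratio.
      runs = np.array(list(itertools.product([-1, 0, 1], repeat=2)), float)  # 9 runs

      # Hypothetical 28-day compressive strengths (MPa) for the 9 runs.
      y = np.array([44.0, 40.5, 36.8, 42.1, 38.9, 35.2, 39.8, 36.1, 32.4])

      # Quadratic response surface: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2.
      x1, x2 = runs[:, 0], runs[:, 1]
      A = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("coefficients:", coef.round(2))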

  16. Research of Deformation of Clay Soil Mixtures

    OpenAIRE

    Romas Girkontas; Tadas Tamošiūnas; Andrius Savickas

    2014-01-01

    The aim of this article is to determine the deformations of clay soils and clay soil mixtures during drying. Experiments consisted of: a) clay and clay-mixture bridges (height ~ 0.30 m, span ~ 1.00 m); b) tiles of clay and of clay, sand and straw (height, length, width); c) cylinders of clay; clay and straw; and clay, straw and sand (diameter; height). Based on the findings, recommendations for the application of clay and clay-mixture drying technology are presented. During the experiment clay bridge bear...

  17. Deciding which chemical mixtures risk assessment methods work best for what mixtures

    International Nuclear Information System (INIS)

    Teuschler, Linda K.

    2007-01-01

    The most commonly used chemical mixtures risk assessment methods involve simple notions of additivity and toxicological similarity. Newer methods are emerging in response to the complexities of chemical mixture exposures and effects. Factors based on both science and policy drive decisions regarding whether to conduct a chemical mixtures risk assessment and, if so, which methods to employ. Scientific considerations are based on positive evidence of joint toxic action, elevated human exposure conditions or the potential for significant impacts on human health. Policy issues include legislative drivers that may mandate action even though adequate toxicity data on a specific mixture may not be available and risk assessment goals that impact the choice of risk assessment method to obtain the amount of health protection desired. This paper discusses three important concepts used to choose among available approaches for conducting a chemical mixtures risk assessment: (1) additive joint toxic action of mixture components; (2) toxicological interactions of mixture components; and (3) chemical composition of complex mixtures. It is proposed that scientific support for basic assumptions used in chemical mixtures risk assessment should be developed by expert panels, risk assessment methods experts, and laboratory toxicologists. This is imperative to further develop and refine quantitative methods and provide guidance on their appropriate applications. Risk assessors need scientific support for chemical mixtures risk assessment methods in the form of toxicological data on joint toxic action for high priority mixtures, statistical methods for analyzing dose-response for mixtures, and toxicological and statistical criteria for determining sufficient similarity of complex mixtures.

  18. Design of experiments for test of fuel element reliability

    International Nuclear Information System (INIS)

    Boehmert, J.; Juettner, C.; Linek, J.

    1989-01-01

    Changes of fuel element design and modifications of the operational conditions have to be tested in experiments and pilot projects for nuclear safety. Experimental design is a useful statistical method for minimizing the costs and risks of this procedure. The main problem of our work was to investigate the connection between the failure rate of fuel elements, sample size, confidence interval, and error probability. Using the statistical model of the binomial distribution, appropriate relations were derived and discussed. A stepwise procedure based on a modified sequential analysis according to Wald was developed as a strategy for introducing modifications of the fuel element design and of the operational conditions. (author)
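
    The binomial relation between failure rate, sample size and confidence reduces, in the simplest zero-failure ("success-run") case, to n = ln(1 - C)/ln(R). A minimal sketch; the 1 %/95 % figures are illustrative, not from the paper.

      import math

      def zero_failure_sample_size(reliability, confidence):
          """Smallest zero-failure sample demonstrating `reliability` at `confidence`,
          from the binomial relation 1 - reliability**n >= confidence."""
          return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

      # Demonstrate a failure rate below 1 % with 95 % confidence:
      print(zero_failure_sample_size(reliability=0.99, confidence=0.95))  # 299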

  19. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    Science.gov (United States)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  20. JET experiments with tritium and deuterium–tritium mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Horton, Lorne (JET Exploitation Unit, Culham Science Centre; European Commission, Brussels); Batistoni, P. (ENEA C.R. Frascati); Boyer, H.; Challis, C.; Ćirić, D. (CCFE, Culham Science Centre); Donné, A.J.H. (EUROfusion Programme Management Unit; FOM Institute DIFFER); Eriksson, L.-G. (European Commission, Brussels); Garcia, J. (CEA, IRFM); Garzotti, L.; Gee, S. (CCFE, Culham Science Centre); Hobirk, J. (Max-Planck-Institut für Plasmaphysik, Garching); Joffrin, E. (CEA, IRFM); and others (all EUROfusion Consortium, JET, Culham Science Centre, Abingdon)

    2016-11-01

    Highlights: • JET is preparing for a series of experiments with tritium and deuterium–tritium mixtures. • Physics objectives include integrated demonstration of ITER operating scenarios, isotope and alpha physics. • Technology objectives include neutronics code validation, material studies and safety investigations. • Strong emphasis on gaining experience in operation of a nuclear tokamak and training scientists and engineers for ITER. Abstract: Extensive preparations are now underway for an experiment in the Joint European Torus (JET) using tritium and deuterium–tritium mixtures. The goals of this experiment are described, as well as the progress that has been made in developing plasma operational scenarios and physics reference pulses for use in deuterium–tritium and full tritium plasmas. At present, the high performance plasmas to be tested with tritium are based on either a conventional ELMy H-mode at high plasma current and magnetic field (operation at up to 4 MA and 4 T is being prepared) or the so-called improved H-mode or hybrid regime of operation, in which high normalised plasma pressure at somewhat reduced plasma current results in enhanced energy confinement. Both of these regimes are being re-developed in conjunction with JET's ITER-like Wall (ILW) of beryllium and tungsten. The influence of the ILW on plasma operation and performance has been substantial. Considerable progress has been made on optimising performance with the all-metal wall. Indeed, operation at the (normalised) ITER reference confinement and pressure has been re-established in JET, albeit not yet at high current. In parallel with the physics development, extensive technical preparations are being made to operate JET with tritium. The state and scope of these preparations are reviewed, including the work being done on the safety case for DT operation and on upgrading machine infrastructure and diagnostics. A specific example of the latter is the planned calibration at

  1. Removing lead from metallic mixture of waste printed circuit boards by vacuum distillation: factorial design and removal mechanism.

    Science.gov (United States)

    Li, Xingang; Gao, Yujie; Ding, Hui

    2013-10-01

    The removal of lead from the metallic mixture of waste printed circuit boards by vacuum distillation was optimized using experimental design, and a mathematical model was established to elucidate the removal mechanism. The variables studied in lead evaporation were the chamber pressure, heating temperature, heating time, particle size and initial mass. The low-level chamber pressure was fixed at 0.1 Pa as the operating pressure. The application of a two-level factorial design generated a first-order polynomial that agreed well with the data for the evaporation efficiency of lead. The heating temperature and heating time exhibited significant effects on the efficiency, which was validated by means of copper-lead mixture experiments. The optimized operating conditions within the region studied were a chamber pressure of 0.1 Pa, a heating temperature of 1023 K and a heating time of 120 min. After these conditions were employed to remove lead from the metallic mixture of waste printed circuit boards, the efficiency was 99.97%. The mechanism of the effects was elucidated by mathematical modeling that deals with evaporation, mass transfer and condensation, and can be applied to a wider range of metal removal by vacuum distillation.

  2. Bayesian D-Optimal Choice Designs for Mixtures

    NARCIS (Netherlands)

    A. Ruseckaite (Aiste); P.P. Goos (Peter); D. Fok (Dennis)

    2014-01-01

    Consumer products and services can often be described as mixtures of ingredients. Examples are the mixture of ingredients in a cocktail and the mixture of different components of waiting time (e.g., in-vehicle and out-of-vehicle travel time) in a transportation

  3. Applying Statistical Design to Control the Risk of Over-Design with Stochastic Simulation

    Directory of Open Access Journals (Sweden)

    Yi Wu

    2010-02-01

    By comparing a hard real-time system and a soft real-time system, this article elicits the risk of over-design in soft real-time system design. To deal with this risk, a novel concept of statistical design is proposed. Statistical design is the process of accurately accounting for and mitigating the effects of variation in part geometry and other environmental conditions, while at the same time optimizing a target performance factor. However, statistical design can be a very difficult and complex task when using classical mathematical methods. Thus, a simulation methodology to optimize the design is proposed in order to bridge the gap between real-time analysis and optimization for robust and reliable system design.

  4. Will the alphabet soup of design criteria affect discrete choice experiment results?

    DEFF Research Database (Denmark)

    Olsen, Søren Bøye; Meyerhoff, Jürgen

    2017-01-01

    Every discrete choice experiment needs one, but the impacts of a statistical design on the results are still not well understood. Comparative studies have found that efficient designs outperform orthogonal designs in particular. What has been little studied is whether efficient designs come at a cost...

  5. Mixture-amount design and response surface modeling to assess the effects of flavonoids and phenolic acids on developmental performance of Anastrepha ludens.

    Science.gov (United States)

    Pascacio-Villafán, Carlos; Lapointe, Stephen; Williams, Trevor; Sivinski, John; Niedz, Randall; Aluja, Martín

    2014-03-01

    Host plant resistance to insect attack and the expansion of insect pests to novel hosts may be modulated by phenolic compounds in host plants. Many studies have evaluated the role of phenolics in host plant resistance and the effect of phenolics on herbivore performance, but few studies have tested the joint effect of several compounds. Here, we used a mixture-amount experimental design and response surface modeling to study the effects of a variety of phenolic compounds on the development and survival of the Mexican fruit fly (Anastrepha ludens [Loew]), a notorious polyphagous pest of fruit crops that is likely to expand its distribution range under climate change scenarios. (+)-Catechin, phloridzin, rutin, chlorogenic acid, and p-coumaric acid were added individually or in mixtures at different concentrations to a laboratory diet used to rear individuals of A. ludens. No effect was observed with any mixture or concentration on percent pupation, pupal weight, adult emergence, or survival from neonate larvae to adults. Larval weight, larval and pupal developmental time, and the prevalence of adult deformities were affected by particular mixtures and concentrations of the compounds tested. We suggest that some combinations/concentrations of phenolic compounds could contribute to the management of A. ludens. We also highlight the importance of testing mixtures of plant secondary compounds when exploring their effects upon insect herbivore performance, and we show that the mixture-amount design is a useful tool for this type of experiment.

  6. Combined application of mixture experimental design and artificial neural networks in the solid dispersion development.

    Science.gov (United States)

    Medarević, Djordje P; Kleinebudde, Peter; Djuriš, Jelena; Djurić, Zorica; Ibrić, Svetlana

    2016-01-01

    This study for the first time demonstrates the combined application of mixture experimental design and artificial neural networks (ANNs) in the development of solid dispersions (SDs). Ternary carbamazepine-Soluplus®-poloxamer 188 SDs were prepared by a solvent casting method to improve the carbamazepine dissolution rate. The influence of the composition of the prepared SDs on the carbamazepine dissolution rate was evaluated using a D-optimal mixture experimental design and multilayer perceptron ANNs. Physicochemical characterization proved the presence of the most stable carbamazepine polymorph III within the SD matrix. Ternary carbamazepine-Soluplus®-poloxamer 188 SDs significantly improved the carbamazepine dissolution rate compared to the pure drug. Models developed by ANNs and mixture experimental design described well the relationship between the proportions of SD components and the percentage of carbamazepine released after 10 (Q10) and 20 (Q20) min, wherein the ANN model exhibited better predictability on the test data set. The proportions of carbamazepine and poloxamer 188 exhibited the highest influence on the carbamazepine release rate. The highest carbamazepine release rate was observed for SDs with the lowest proportions of carbamazepine and the highest proportions of poloxamer 188. ANNs and mixture experimental design can be used as powerful data modeling tools in the systematic development of SDs. Taking into account the advantages and disadvantages of both techniques, their combined application should be encouraged.
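
    A toy version of the ANN half of such a combined approach can be written with scikit-learn's MLPRegressor; the ternary compositions and release values below are invented, and the network is far smaller than anything a real study would tune.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Hypothetical compositions (carbamazepine, Soluplus, poloxamer 188), rows sum
      # to 1, with measured Q10 (% drug released after 10 min) as the response.
      X = np.array([[0.5, 0.4, 0.1], [0.4, 0.4, 0.2], [0.3, 0.5, 0.2],
                    [0.3, 0.4, 0.3], [0.2, 0.5, 0.3], [0.2, 0.4, 0.4]])
      y = np.array([21.0, 34.0, 42.0, 55.0, 63.0, 71.0])

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X, y)
      print(model.predict([[0.25, 0.45, 0.30]]))  # predicted Q10 for a new blend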

  7. Comparative performance of conventional OPC concrete and HPC designed by densified mixture design algorithm

    Science.gov (United States)

    Huynh, Trong-Phuoc; Hwang, Chao-Lung; Yang, Shu-Ti

    2017-12-01

    This experimental study evaluated the performance of normal ordinary Portland cement (OPC) concrete and high-performance concrete (HPC) designed by the conventional (ACI) method and the densified mixture design algorithm (DMDA) method, respectively. Engineering properties and durability performance of both the OPC and HPC samples were studied using tests of workability, compressive strength, water absorption, ultrasonic pulse velocity, and electrical surface resistivity. Test results show that the HPC exhibited good fresh properties and better performance in terms of strength and durability as compared to the OPC.

  8. Demonstration of Pressurizing Coal/Biomass Mixtures Using Posimetric Solids Pump Technology

    Energy Technology Data Exchange (ETDEWEB)

    Westendorf, Tiffany; Acharya, Harish; Cui, Zhe; Furman, Anthony; Giammattei, Mark; Rader, Jeff; Vazquez, Arturo

    2012-12-31

    This document is the Final Technical Report for a project supported by U.S. DOE NETL (Contract No. DE-FE0000507), GE Global Research, GE Energy, and Idaho National Laboratory (INL). This report discusses key project accomplishments for the period beginning August 7, 2009 and ending December 31, 2012. In this project, pressurized delivery of coal/biomass mixtures using GE Posimetric* solids pump technology was achieved in pilot scale experiments. Coal/biomass mixtures containing 10-50 wt% biomass were fed against pressures of 65-450 psi. Pressure capability increased with decreasing biomass content for a given pump design, and was linked to the interaction of highly compressible coal/biomass mixtures with the pump outlet design. Biomass pretreatment specifications for particle size and moisture content were defined based on bench-scale flowability, compressibility, friction, and permeability experiments that mimic the behavior of the Posimetric pump. A preliminary economic assessment of biomass pretreatment and pump operation for coal/biomass mixtures (CBMs) was conducted.

  9. Optimization of primaquine diphosphate tablet formulation for controlled drug release using the mixture experimental design.

    Science.gov (United States)

    Duque, Marcelo Dutra; Kreidel, Rogério Nepomuceno; Taqueda, Maria Elena Santos; Baby, André Rolim; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Consiglieri, Vladi Olga

    2013-01-01

    A tablet formulation based on a hydrophilic matrix with controlled drug release was developed, and the effect of polymer concentrations on the release of primaquine diphosphate was evaluated. To achieve this purpose, a 20-run, four-factor design with multiple constraints on the proportions of the components was employed to obtain tablet compositions. Drug release was determined by an in vitro dissolution study in phosphate buffer solution at pH 6.8. The fitted polynomial functions described the behavior of the mixture on simplex coordinate systems to study the effects of each factor (polymer) on tablet characteristics. Based on response surface methodology, a tablet composition was optimized with the purpose of obtaining a primaquine diphosphate release closer to zero-order kinetics. This formulation released 85.22% of the drug over 8 h and its kinetics were studied with respect to the Korsmeyer-Peppas model (adj. R² = 0.99295), which confirmed that both diffusion and erosion were involved in the mechanism of drug release. The data from the optimized formulation were very close to the predictions from the statistical analysis, demonstrating that mixture experimental design could be used to optimize primaquine diphosphate dissolution from hydroxypropylmethyl cellulose and polyethylene glycol matrix tablets.
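
    The Korsmeyer-Peppas fit reported here (Mt/M∞ = k·tⁿ) is a linear regression on log-log axes, conventionally restricted to the first ~60 % of release. A sketch with invented release data:

      import numpy as np

      t = np.array([0.5, 1, 2, 3, 4, 6, 8])                        # time, h
      frac = np.array([0.12, 0.20, 0.33, 0.44, 0.53, 0.70, 0.85])  # fraction released

      mask = frac <= 0.6                                           # Peppas validity range
      n_exp, log_k = np.polyfit(np.log(t[mask]), np.log(frac[mask]), deg=1)
      print(f"n = {n_exp:.2f}, k = {np.exp(log_k):.3f}")
      # For cylindrical matrices, n ~ 0.45 indicates Fickian diffusion and
      # 0.45 < n < 0.89 anomalous (diffusion + erosion) transport.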

  10. Impact of Chemical Proportions on the Acute Neurotoxicity of a Mixture of Seven Carbamates in Preweanling and Adult Rats

    Science.gov (United States)

    Statistical design and environmental relevance are important aspects of studies of chemical mixtures, such as pesticides. We used a dose-additivity model to test experimentally the default assumptions of dose-additivity for two mixtures of seven N-methylcarbamates (carbaryl, carb...

  11. Performance on Water Stability of Cement-Foamed Asphalt Cold Recycled Mixture

    Directory of Open Access Journals (Sweden)

    Li Junxiao

    2018-01-01

    Through designing the mixture proportion of foamed asphalt cold in-place recycled mixture, combined with a water stability experiment, it is shown that the addition of cement can obviously improve the foamed asphalt mixture's water stability, and that the best cement admixture is between 1% and 2%. Using digital imaging microscopy and SEM, the mechanism by which added cement increases the intensity of the foamed asphalt mixture was analyzed. It revealed that the cement hydration products contained in the foamed asphalt mixture develop into a spatial mesh structure and wrap the aggregate particles; this is the main reason that cement can enhance the mixture's intensity as well as its water stability. This research provides a reference for the cement admixture's formulation in the design of foamed asphalt cold in-place recycled mixtures.

  12. Extracting Insights from Experience Designers to Enhance User Experience Design

    OpenAIRE

    Kremer, Simon; Lindemann, Udo

    2016-01-01

    User Experience (UX) summarizes how a user expects, perceives and assesses an encounter with a product. User Experience Design (UXD) aims at creating meaningful experiences. While UXD is a rather young discipline within product development, where traditional processes predominate, other disciplines have traditionally focused on creating experiences. We engaged with experience designers from the fields of arts, movies, sports, music and event management. By analyzing their working processes via interv...

  13. Optimization of marine waste based-growth media for microbial lipase production using mixture design methodology.

    Science.gov (United States)

    Sellami, Mohamed; Kedachi, Samiha; Frikha, Fakher; Miled, Nabil; Ben Rebah, Faouzi

    2013-01-01

    Lipase production by Staphylococcus xylosus and Rhizopus oryzae was investigated using a culture medium based on a mixture of synthetic medium and supernatants generated from tuna by-products and Ulva rigida biomass. The proportion of the three medium components was optimized using the simplex-centroid mixture design method (SCMD). The experimental data were in good agreement with the predicted values, indicating that SCMD is a reliable method for determining the optimum mixture proportions of the growth medium. Maximal lipase activities of 12.5 and 23.5 IU/mL were obtained with a 50:50 (v:v) mixture of synthetic medium and tuna by-product supernatant for Staphylococcus xylosus and Rhizopus oryzae, respectively. The predicted responses from these mixture proportions were also validated experimentally.
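
    A simplex-centroid design for q mixture components consists of the 2^q - 1 centroids of all non-empty subsets: the pure blends, the 50:50 binaries, and so on up to the overall centroid. A small generator (the three component labels follow this record):

      from itertools import combinations

      def simplex_centroid(q):
          """All 2**q - 1 points of a simplex-centroid mixture design."""
          points = []
          for k in range(1, q + 1):
              for subset in combinations(range(q), k):
                  p = [0.0] * q
                  for i in subset:
                      p[i] = 1.0 / k
                  points.append(tuple(p))
          return points

      # Synthetic medium, tuna by-product supernatant, Ulva rigida supernatant:
      for p in simplex_centroid(3):
          print(p)  # 7 runs: 3 pure, 3 binary 50:50, 1 centroid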

  14. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO RAY MIXTURE.

    Science.gov (United States)

    Risk assessors are becoming increasingly aware of the importance of assessing interactions between chemicals in a mixture. Most traditional designs for evaluating interactions are prohibitive when the number of chemicals in the mixture is large. However, evaluation of interacti...

  15. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO MIXTURE RAY.

    Science.gov (United States)

    Humans are exposed to mixtures of environmental compounds. A regulatory assumption is that the mixtures of chemicals act in an additive manner. However, this assumption requires experimental validation. Traditional experimental designs (full factorial) require a large number of e...

  16. Performance on Water Stability of Cement-Foamed Asphalt Cold Recycled Mixture

    OpenAIRE

    Li Junxiao; Fu Wei; Zang Hechao

    2018-01-01

    Through designing the mixture proportion of foamed asphalt cold in-place recycled mixture, combined with a water stability experiment, it is shown that the addition of cement can obviously improve the foamed asphalt mixture's water stability, and that the best cement admixture is between 1% and 2%. Using digital imaging microscopy and SEM, the mechanism by which added cement increases the intensity of the foamed asphalt mixture was analyzed. It revealed that the cement hydration products ...

  17. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training for use of designs and teaches statistical and non-statistical skills in design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  18. The Roles of Experience, Gender, and Individual Differences in Statistical Reasoning

    Science.gov (United States)

    Martin, Nadia; Hughes, Jeffrey; Fugelsang, Jonathan

    2017-01-01

    We examine the joint effects of gender and experience on statistical reasoning. Participants with various levels of experience in statistics completed the Statistical Reasoning Assessment (Garfield, 2003), along with individual difference measures assessing cognitive ability and thinking dispositions. Although the performance of both genders…

  19. Mixture toxicity revisited from a toxicogenomic perspective.

    Science.gov (United States)

    Altenburger, Rolf; Scholz, Stefan; Schmitt-Jansen, Mechthild; Busch, Wibke; Escher, Beate I

    2012-03-06

    The advent of new genomic techniques has raised expectations that central questions of mixture toxicology, such as the mechanisms of low-dose interactions, can now be answered. This review provides an overview of experimental studies from the past decade that address diagnostic and/or mechanistic questions regarding the combined effects of chemical mixtures using toxicogenomic techniques. From 2002 to 2011, 41 studies were published with a focus on mixture toxicity assessment. Primarily, multiplexed quantification of gene transcripts was performed, though metabolomic and proteomic analyses of joint exposures have also been undertaken. It is now standard to explicitly state criteria for selecting concentrations and provide insight into data transformation and statistical treatment with respect to minimizing sources of undue variability. Bioinformatic analysis of toxicogenomic data, by contrast, is still a field with diverse and rapidly evolving tools. The reported combined effect assessments are discussed in the light of established toxicological dose-response and mixture toxicity models. Receptor-based assays seem to be the most advanced toward establishing quantitative relationships between exposure and biological responses. Often transcriptomic responses are discussed based on the presence or absence of signals, where the interpretation may remain ambiguous due to methodological problems. The majority of mixture studies are designed to compare the recorded mixture outcome against responses for individual components only. This stands in stark contrast to our existing understanding of joint biological activity at the levels of chemical target interactions and apical combined effects. By joining established mixture effect models with toxicokinetic and -dynamic thinking, we suggest a conceptual framework that may help to overcome the current limitation of providing mainly anecdotal evidence on mixture effects. To achieve this we suggest (i) to design studies to

  20. High productivity chromatography refolding process for Hepatitis B Virus X (HBx) protein guided by statistical design of experiment studies.

    Science.gov (United States)

    Basu, Anindya; Leong, Susanna Su Jan

    2012-02-03

    The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacterial host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate its direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process, which delivered bioactive HBx at a productivity of 0.21 mg/ml/h at a refolding yield of 54% (at 10 mg/ml refolding concentration), 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study enabled us to obtain important insights into the effect of different bioprocess parameters, such as the effect of buffer exchange gradients on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, and hence help to resolve the validation and speed-to-market challenges faced by the biopharmaceutical industry today.

  1. Application of simplex-centroid mixture design to optimize stabilizer combinations for ice cream manufacture.

    Science.gov (United States)

    BahramParvar, Maryam; Tehrani, Mostafa Mazaheri; Razavi, Seyed M A; Koocheki, Arash

    2015-03-01

    This study aimed to obtain an optimum formulation for stabilizers in ice cream that could compete with the blends available today. Thus, different mixtures of three stabilizers, i.e. basil seed gum, carboxymethyl cellulose, and guar gum, at two concentrations (0.15% and 0.35%) were studied using mixture design methodology. The influence of these mixtures on some properties of ice cream and the regression models for them were also determined. Generally, high ratios of basil seed gum in the mixture increased the apparent viscosity of the ice cream mixes and decreased the melting rate. Increasing the proportion of this stabilizer, as well as of guar gum, in the mixtures at a concentration of 0.15% enhanced the overrun of the samples. Based on the optimization criteria, the best combination was 84.43% basil seed gum and 15.57% guar gum at a concentration of 0.15%. This research demonstrated the capability of basil seed gum as a novel stabilizer in ice cream.

  2. Transition from hydrodynamic to fast sound in a He-Ne mixture a neutron Brillouin scattering experiment

    CERN Document Server

    Bafile, U; Barocchi, F; Sampoli, M

    2002-01-01

    The presence of a fast-sound mode in the microscopic dynamics of the rare-gas mixture He-Ne, predicted by theoretical studies and molecular-dynamics simulations, was demonstrated by an inelastic neutron scattering experiment. In order to study the transition between the fast and the normal acoustic modes in the hydrodynamic regime, k values about one order of magnitude lower than in the usual experiments have to be probed. We describe here the results of the first neutron Brillouin scattering experiment performed with this purpose on the same system already investigated at larger k. The results of both experiments, together with those of a new molecular-dynamics simulation, provide a complete and consistent description, so far missing, of the onset of fast-sound propagation in a binary mixture. (orig.)

  3. Bayesian optimal experimental design for the Shock-tube experiment

    International Nuclear Information System (INIS)

    Terejanu, G; Bryant, C M; Miki, K

    2013-01-01

    The sequential optimal experimental design, formulated as an information-theoretic sensitivity analysis, is applied to the ignition delay problem using real experimental data. The optimal design is obtained by maximizing the statistical dependence between the model parameters and observables, which is quantified in this study using mutual information. This is naturally posed in the Bayesian framework. The study shows that by monitoring the information gain after each measurement update, one can define a stopping criterion for the experimental process which gives a minimal set of experiments to efficiently learn the Arrhenius parameters.
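
    The mutual-information criterion can be prototyped on a toy model with a discretized parameter grid; the exponential-decay observable below is a stand-in for the real ignition-delay model, and all numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      thetas = np.linspace(0.1, 2.0, 200)          # discretized parameter grid
      prior = np.ones_like(thetas) / len(thetas)   # flat prior
      sigma = 0.05                                 # measurement noise std

      def information_gain(x, n_mc=2000):
          """Monte Carlo estimate of I(theta; y) for a measurement at design x."""
          total = 0.0
          for _ in range(n_mc):
              th = rng.choice(thetas, p=prior)
              y = np.exp(-th * x) + rng.normal(0.0, sigma)
              like = np.exp(-0.5 * ((y - np.exp(-thetas * x)) / sigma) ** 2)
              post = like * prior
              post /= post.sum()
              # Expected KL(posterior || prior) over data = mutual information.
              total += np.sum(post * np.log(post / prior + 1e-300))
          return total / n_mc

      for x in [0.5, 1.0, 2.0, 4.0]:
          print(f"design x = {x}: estimated information gain = {information_gain(x):.3f}")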

  4. Partial least squares analysis and mixture design for the study of the influence of composition variables on lipidic nanoparticle characteristics.

    Science.gov (United States)

    Malzert-Fréon, A; Hennequin, D; Rault, S

    2010-11-01

    Lipidic nanoparticles (NP), formulated from a phase inversion temperature process, have been studied with chemometric techniques to emphasize the influence of the four major components (Solutol®, Labrasol®, Labrafac®, water) on their average diameter and their size distribution. Typically, these NP present a monodisperse size lower than 200 nm, as determined by dynamic light scattering measurements. From the application of the partial least squares (PLS) regression technique to the experimental data collected during definition of the feasibility zone, it was established that the NP present a core-shell structure in which Labrasol® is well encapsulated and contributes to the structuring of the NP. Even if this solubility enhancer is regarded as a pure surfactant in the literature, it appears that the oil moieties of this macrogolglyceride mixture significantly influence its properties. Furthermore, results have shown that the PLS technique can also be used to predict sizes for given relative proportions of components, and it was established that, from a mixture design, the quantitative mixture composition to use in order to reach a targeted size and a targeted polydispersity index (PDI) can be easily predicted. Hence, statistical models can be a useful tool to control and optimize the size characteristics of NP.
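
    The PLS step itself is compact; a minimal sketch with invented formulation data (the real proportions and measured sizes are not reproduced from the paper):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # Hypothetical proportions of Solutol, Labrasol, Labrafac, water per
      # formulation, with the measured mean particle diameter (nm) as response.
      X = np.array([[0.20, 0.10, 0.10, 0.60], [0.25, 0.10, 0.05, 0.60],
                    [0.15, 0.15, 0.10, 0.60], [0.20, 0.15, 0.05, 0.60],
                    [0.30, 0.05, 0.05, 0.60], [0.10, 0.20, 0.10, 0.60]])
      y = np.array([180.0, 150.0, 205.0, 165.0, 120.0, 230.0])

      pls = PLSRegression(n_components=2)
      pls.fit(X, y)
      print("predicted size:", pls.predict([[0.22, 0.12, 0.06, 0.60]]).ravel())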

  5. Introduction to the special section on mixture modeling in personality assessment.

    Science.gov (United States)

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Latent variable models offer a conceptual and statistical framework for evaluating the underlying structure of psychological constructs, including personality and psychopathology. Complex structures that combine or compare categorical and dimensional latent variables can be accommodated using mixture modeling approaches, which provide a powerful framework for testing nuanced theories about psychological structure. This special series includes introductory primers on cross-sectional and longitudinal mixture modeling, in addition to empirical examples applying these techniques to real-world data collected in clinical settings. This group of articles is designed to introduce personality assessment scientists and practitioners to a general latent variable framework that we hope will stimulate new research and application of mixture models to the assessment of personality and its pathology.

  6. Frontiers in statistical quality control 11

    CERN Document Server

    Schmid, Wolfgang

    2015-01-01

    The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientifi...

  7. Towards evidence-based computational statistics: lessons from clinical research on the role and design of real-data benchmark studies.

    Science.gov (United States)

    Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander

    2017-09-09

    The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies (a method of assessment of statistical methods using real-world datasets) might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.

  8. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

    Statistical models for integrated circuits (ICs) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's is the statistical model most widely accepted in the industry; however, it was derived from a micrometer technology, which does not guarantee reliability in nanometric manufacturing processes. This work considers three of the most relevant statistical models in the industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the stages and design purposes.
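
    Pelgrom's model referred to here states that the standard deviation of a matched-pair parameter difference scales inversely with the square root of device area, e.g. σ(ΔVt) = A_Vt/√(W·L). A one-line sketch; the matching coefficient is an assumed, process-dependent value.

      import numpy as np

      def mismatch_sigma(A_vt, W, L):
          """Pelgrom area law: sigma(dVt) in mV, for A_vt in mV*um and W, L in um."""
          return A_vt / np.sqrt(W * L)

      # Assumed matching coefficient of 3 mV*um (illustrative only):
      for W, L in [(1.0, 1.0), (2.0, 2.0), (0.5, 0.2)]:
          print(f"W={W} um, L={L} um -> sigma(dVt) = {mismatch_sigma(3.0, W, L):.2f} mV")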

  9. Designing of Synergistic Waste Mixtures for Multiphase Reactive Smelting

    Directory of Open Access Journals (Sweden)

    Vaso Manojlović

    2016-06-01

    Electric arc furnace (EAF) dust, together with mill scale and coke, was smelted in a laboratory electric arc furnace. These metallurgical wastes consist of many different phases and elements, making the reaction process complex. Thermo-chemical analysis of the reactions in the metal, slag, and gas phases was carried out and used for modeling the mixture composition and the energy consumption required for smelting. Modeling was performed with the software named RikiAlC. The crude ZnO, slag, and metal phases were analyzed using atomic absorption spectrometry (AAS), optical emission spectrometry with inductively coupled plasma (ICP-OES), X-ray diffraction (XRD), scanning electron microscopy (SEM) equipped with energy dispersive spectrometry (EDS), and reflected and transmitted light microscopy. Also, in order to follow the behavior of the process, the exhaust gases were monitored. The synergetic effects of the designed mixture may be recognized in minimized energy consumption for the smelting process, improved product yield efficiency, and reduced negative environmental effects.

  10. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters: basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect-gas theory applied to liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of high-polymer systems; and quantum statistics.

  11. Statistical phenomena - experiments results. II

    International Nuclear Information System (INIS)

    Schnell, W.

    1977-01-01

    The stochastic cooling of proton and antiproton beams is discussed. Stochastic cooling is the gradual reduction of emittance of a coasting beam by a feedback system, sensing and correcting the statistical fluctuations of the beam's position or momentum. The correction at every turn can be partial or complete. Transverse and longitudinal emittance of the beam are considered and the systems designed to cool the beams are described. (B.D.)

  12. Phase equilibria for mixtures containing very many components. development and application of continuous thermodynamics for chemical process design

    International Nuclear Information System (INIS)

    Cotterman, R.L.; Bender, R.; Prausnitz, J.M.

    1984-01-01

    For some multicomponent mixtures, where detailed chemical analysis is not feasible, the composition of the mixture may be described by a continuous distribution function of some convenient macroscopic property such as normal boiling point or molecular weight. To attain a quantitative description of phase equilibria for such mixtures, this work has developed thermodynamic procedures for continuous systems; this procedure is called continuous thermodynamics. To illustrate, continuous thermodynamics is used to calculate dew points for natural-gas mixtures, solvent loss in a high-pressure absorber, and liquid-liquid phase equilibria in a polymer fractionation process. Continuous thermodynamics provides a rational method for calculating phase equilibria for those mixtures where complete chemical analysis is not available but where the composition can be given by some statistical description. While continuous thermodynamics is only the logical limit of the well-known pseudo-component method, it is more efficient than that method because it is less arbitrary and often requires less computer time.
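
    To make the central idea concrete: instead of a list of discrete species, the composition is carried around as a distribution, and mixture properties become moments or integrals of that distribution. A minimal sketch, assuming (purely for illustration) a gamma distribution over molecular weight; all parameters are invented, not taken from the paper:

        import numpy as np
        from scipy.stats import gamma

        # Describe a poorly characterized mixture by a continuous distribution
        # F(M) over molecular weight M instead of discrete pseudo-components.
        shape, loc, scale = 2.0, 80.0, 30.0      # illustrative parameters
        F = gamma(a=shape, loc=loc, scale=scale)

        mean_M = F.mean()                         # first moment of the distribution
        # Mole fraction of material in a molecular-weight "cut" [100, 150]:
        cut_fraction = F.cdf(150.0) - F.cdf(100.0)
        print(f"mean molecular weight: {mean_M:.1f}")
        print(f"fraction with M in [100, 150]: {cut_fraction:.3f}")

        # The pseudo-component method approximates the same mixture by a few
        # discrete components; quantiles of F can play that role.
        nodes = F.ppf(np.linspace(0.1, 0.9, 5))   # 5 representative components
        print("pseudo-component molecular weights:", np.round(nodes, 1))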

  13. NMRI Measurements of Flow of Granular Mixtures

    Science.gov (United States)

    Nakagawa, Masami; Waggoner, R. Allen; Fukushima, Eiichi

    1996-01-01

    We investigate the complex 3D behavior of granular mixtures in shaking and shearing devices. NMRI can non-invasively measure the concentration, velocity, and velocity fluctuations of flows of suitable particles. We investigate the origins of wall-shear-induced convection flow of single-component particles by measuring the flow and fluctuating motion of particles near rough boundaries. We also investigate whether a mixture of different-size particles segregates into its own species under the influence of external shaking and shearing disturbances. These non-invasive measurements will reveal the true nature of convective flow properties and wall disturbances. For experiments in a reduced-gravity environment, we will design a lightweight NMR imager. The proof-of-principle development will prepare for the construction of a complete spaceborne system to perform experiments in space.

  14. Precise Composition Tailoring of Mixed-Cation Hybrid Perovskites for Efficient Solar Cells by Mixture Design Methods.

    Science.gov (United States)

    Li, Liang; Liu, Na; Xu, Ziqi; Chen, Qi; Wang, Xindong; Zhou, Huanping

    2017-09-26

    Mixed anion/cation perovskite absorbers have recently been implemented to construct highly efficient single-junction solar cells and tandem devices. However, considerable efforts are still required to map the composition-property relationship of the mixed perovskite absorbers, which is essential to facilitate device design. Here we report an intensive exploration of mixed-cation perovskites in their compositional space with the assistance of rational mixture design (MD) methods. Different from the previous linear search of the cation ratios, it is found that by employing the MD methods, the ternary composition can be tuned simultaneously following simplex-lattice or simplex-centroid designs, which enables a significantly reduced experiment/sampling size to unveil the composition-property relationship for mixed perovskite materials and to boost the resultant device efficiency. We illustrated the composition-property relationship of the mixed perovskites in multiple dimensions and achieved an optimized power conversion efficiency of 20.99% in the corresponding device. Moreover, the method is demonstrated to be feasible for adjusting the bandgap through rational materials design, and it can be further extended to other materials systems, not limited to polycrystalline perovskite films for photovoltaic applications.
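
    For readers unfamiliar with these designs: a {q, m} simplex-lattice design takes every q-component composition whose proportions are multiples of 1/m and sum to one, so a ternary {3, 3} lattice needs only ten blends. A hedged sketch of enumerating such candidate cation ratios (illustrative only, not the authors' code):

        from itertools import product
        from fractions import Fraction

        def simplex_lattice(q=3, m=3):
            """All q-component mixtures whose proportions are multiples of 1/m
            and sum to one -- the {q, m} simplex-lattice design points."""
            return [tuple(Fraction(c, m) for c in combo)
                    for combo in product(range(m + 1), repeat=q)
                    if sum(combo) == m]

        for point in simplex_lattice(q=3, m=3):   # e.g. FA/MA/Cs cation fractions
            print([str(x) for x in point])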

  15. Time-of-flight experiments using a pseudo-statistical chopper

    International Nuclear Information System (INIS)

    Aizawa, Otohiko; Kanda, Keiji

    1975-01-01

    A ''pseudo-statistical'' chopper was manufactured and used for experiments on neutron transmission and scattering. The characteristics of the chopper and the experimental results are discussed in comparison with those of the time-of-flight technique using a conventional chopper. Which of the two methods is superior depends on the form of the time-of-flight distribution to be measured. Pseudo-statistical pulsing may be especially advantageous for scattering experiments with a single- or few-line time-of-flight spectrum. (auth.)
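
    The idea behind pseudo-statistical (correlation) chopping is to open and close the beam following a pseudo-random binary sequence and to recover the time-of-flight spectrum by cross-correlating the detector counts with that sequence; the two-valued autocorrelation of a maximum-length sequence makes the deconvolution exact up to a known factor. A minimal numerical sketch of this recovery step (not the authors' apparatus; all numbers are invented):

        import numpy as np

        def mls(nbits=7, taps=(7, 6)):
            """0/1 maximum-length sequence (length 2**nbits - 1) from an LFSR."""
            state = [1] * nbits
            out = []
            for _ in range(2**nbits - 1):
                out.append(state[-1])
                fb = state[taps[0] - 1] ^ state[taps[1] - 1]
                state = [fb] + state[:-1]
            return np.array(out, dtype=float)

        rng = np.random.default_rng(0)
        a = mls()                                   # chopper pattern (open = 1)
        N = len(a)                                  # 127 time channels
        true = np.zeros(N); true[30] = 100.0        # single-line TOF spectrum

        # Detector signal: circular convolution of spectrum with chopper pattern
        counts = np.real(np.fft.ifft(np.fft.fft(true) * np.fft.fft(a)))
        counts += rng.normal(0.0, 1.0, N)           # counting noise (illustrative)

        # Decode by circular cross-correlation with the +/-1 version of the
        # sequence; an m-sequence returns the spectrum up to a factor (N+1)/2.
        b = 2.0 * a - 1.0
        rec = np.real(np.fft.ifft(np.fft.fft(counts) * np.conj(np.fft.fft(b))))
        rec /= (N + 1) / 2.0
        print("recovered peak channel:", np.argmax(rec))   # -> 30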

  16. Towards evidence-based computational statistics: lessons from clinical research on the role and design of real-data benchmark studies

    Directory of Open Access Journals (Sweden)

    Anne-Laure Boulesteix

    2017-09-01

    Full Text Available Background The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly “evidence-based”. Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. Main message In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of “evidence-based” statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. Conclusion We suggest that benchmark studies—a method of assessment of statistical methods using real-world datasets—might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.

  17. Accounting for variation in designing greenhouse experiments with special reference to greenhouses containing plants on conveyor systems

    Science.gov (United States)

    2013-01-01

    Background There are a number of unresolved issues in the design of experiments in greenhouses. They include whether statistical designs should be used and, if so, which designs should be used. Also, are there thigmomorphogenic or other effects arising from the movement of plants on conveyor belts within a greenhouse? A two-phase, single-line wheat experiment involving four tactics was conducted in a conventional greenhouse and a fully-automated phenotyping greenhouse (Smarthouse) to investigate these issues. Results and discussion Analyses of our experiment show that there was a small east–west trend in total area of the plants in the Smarthouse. Analyses of the data from three multiline experiments reveal a large north–south trend. In the single-line experiment, there was no evidence of differences between trios of lanes, nor of movement effects. Swapping plant positions during the trial was found to decrease the east–west trend, but at the cost of increased error variance. The movement of plants in a north–south direction, through a shaded area for an equal amount of time, nullified the north–south trend. An investigation of alternative experimental designs for equally-replicated experiments revealed that generally designs with smaller blocks performed best, but that (nearly) trend-free designs can be effective when blocks are larger. Conclusions To account for variation in microclimate in a greenhouse, using statistical design and analysis is better than rearranging the position of plants during the experiment. For the relocation of plants to be successful requires that plants spend an equal amount of time in each microclimate, preferably during comparable growth stages. Even then, there is no evidence that this will be any more precise than statistical design and analysis of the experiment, and the risk is that it will not be successful at all. As for statistical design and analysis, it is best to use either (i) smaller blocks, (ii) (nearly) trend

  18. Optimisation of synergistic biomass-degrading enzyme systems for efficient rice straw hydrolysis using an experimental mixture design.

    Science.gov (United States)

    Suwannarangsee, Surisa; Bunterngsook, Benjarat; Arnthong, Jantima; Paemanee, Atchara; Thamchaipenet, Arinthip; Eurwilaichitr, Lily; Laosiripojana, Navadol; Champreda, Verawat

    2012-09-01

    A synergistic enzyme system for the hydrolysis of alkali-pretreated rice straw was optimised based on the synergy of crude fungal enzyme extracts with a commercial cellulase (Celluclast™). Among 13 enzyme extracts, the enzyme preparation from Aspergillus aculeatus BCC 199 exhibited the highest level of synergy with Celluclast™. This synergy was based on the complementary cellulolytic and hemicellulolytic activities of the BCC 199 enzyme extract. A mixture design was used to optimise the ternary enzyme complex based on the synergistic enzyme mixture with Bacillus subtilis expansin. Using the full cubic model, the optimal formulation of the enzyme mixture was predicted to be Celluclast™:BCC 199:expansin = 41.4:37.0:21.6 (%), which produced 769 mg reducing sugar/g biomass using 2.82 FPU/g of enzymes. This work demonstrated the use of a systematic approach for the design and optimisation of a synergistic enzyme mixture of fungal enzymes and expansin for lignocellulose degradation. Copyright © 2012 Elsevier Ltd. All rights reserved.
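
    The full cubic model mentioned here is the ten-term Scheffé canonical polynomial for three components. A hedged sketch of fitting it and picking the best blend; the design points and reducing-sugar responses below are invented, not the paper's data:

        import numpy as np

        def scheffe_full_cubic(X):
            """Model matrix of the 10-term Scheffe full cubic for 3 components."""
            x1, x2, x3 = X.T
            return np.column_stack([
                x1, x2, x3,                        # linear blending
                x1*x2, x1*x3, x2*x3,               # binary synergy/antagonism
                x1*x2*(x1 - x2), x1*x3*(x1 - x3), x2*x3*(x2 - x3),
                x1*x2*x3,                          # ternary blending
            ])

        # Ten blend points (fractions of Celluclast / BCC 199 / expansin); with
        # only 10 runs the 10-term model is saturated, so real studies add
        # replicates and check points.
        X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                      [.5, .5, 0], [.5, 0, .5], [0, .5, .5], [1/3, 1/3, 1/3],
                      [2/3, 1/6, 1/6], [1/6, 2/3, 1/6], [1/6, 1/6, 2/3]], float)
        y = np.array([520, 450, 90, 640, 420, 380, 700, 660, 610, 350], float)

        beta, *_ = np.linalg.lstsq(scheffe_full_cubic(X), y, rcond=None)
        best = X[np.argmax(scheffe_full_cubic(X) @ beta)]
        print("best blend among the design points:", best)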

  19. Design and Statistics in Quantitative Translation (Process) Research

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

    2015-01-01

    Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars since this way of thinking is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product.

  20. On the application of design of experiments to accelerated life testing

    International Nuclear Information System (INIS)

    Hakim-Mashhadi, M.

    1992-01-01

    Today, there is an increasing demand for improved quality and reliability due to increasing system complexity and increasing demands from customers. Continuous improvement of quality is not only a means of competition but also a matter of staying in the market. Accelerated life testing and statistical design of experiments are two methods needed for the improvement of quality, and their combined use is very advantageous and increases test efficiency. Accelerated life testing is a quick way to provide information on the life distribution of materials and products. By subjecting the test unit to conditions more severe than those of normal usage, the test time can be greatly reduced. Estimates of life at normal stress levels are obtained by extrapolating the available information through a reasonable acceleration model. Accelerated life testing has mostly been used to measure reliability, but it is high time to use it for the improvement of quality. Design of experiments serves to find out the effect of design parameters and other interesting factors on a performance measure and its variability. The information obtained is essential for a continuous improvement of quality. As an illustration, two sets of experiments were designed and performed at highly increased stress levels. The results are analysed and discussed and a time-saving alternative is proposed. The combination of experimental design and accelerated life testing is discussed and illustrated. The combined use of these methods can be argued for in two different cases: one is an exploratory improvement investigation and the other is verification of reliability. In either case, the combined use is advantageous and improves testing efficiency. Some general conclusions are drawn to be used for the planning and performance of statistically designed accelerated life testing experiments. (70 refs.) (au)
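
    A standard acceleration model of the kind alluded to is the Arrhenius relationship, in which log-life is linear in inverse absolute temperature: lifetimes measured at elevated stress are fitted and extrapolated to use conditions. A hedged sketch with made-up data, not results from the thesis:

        import numpy as np

        # Hypothetical accelerated-life data: mean lifetimes (h) at high temperature
        temps_C = np.array([120.0, 140.0, 160.0])
        life_h  = np.array([4200.0, 1300.0, 480.0])

        # Arrhenius model: ln(life) = a + b / T, with T in kelvin and the slope
        # b proportional to the activation energy.
        inv_T = 1.0 / (temps_C + 273.15)
        b, a = np.polyfit(inv_T, np.log(life_h), 1)

        use_T = 1.0 / (55.0 + 273.15)             # normal usage temperature
        pred = np.exp(a + b * use_T)
        AF = pred / life_h[0]                     # acceleration factor vs. 120 C
        print(f"predicted life at 55 C: {pred:,.0f} h (AF = {AF:.1f} vs 120 C)")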

  1. Usage of a statistical method of designing factorial experiments in the mechanical activation of a complex CuPbZn sulphide concentrate

    Directory of Open Access Journals (Sweden)

    BalហPeter

    2003-09-01

    Full Text Available Mechanical activation belongs to innovative procedures which intensify technological processes by creating new surfaces and producing a defective structure in the solid phase. Mechanical impact on the solid phase is a suitable procedure for ensuring the mobility of its structural elements and for accumulating mechanical energy that is later used in the processes of leaching. The aim of this study was to carry out the mechanical activation of a complex CuPbZn sulphide concentrate (Slovak deposit) in an attritor, using statistical methods for the design of factorial experiments, and to determine the conditions for preparing the optimum mechanically activated sample of the studied concentrate. The following parameters of the attritor were studied as variables: the weight of sample/steel balls (degree of mill filling), the number of revolutions of the milling shaft, and the time of mechanical activation. Interpretation of the chosen variables inducing the mechanical activation of the complex CuPbZn concentrate was also carried out using statistical methods of factorial design of experiments. The presented linear model (a 2^3 factorial experiment) does not directly support the search for an optimum, so it was extended to a nonlinear model by means of a second-order orthogonal polynomial. This nonlinear model does not adequately describe the process of new-surface formation during the mechanical activation of the studied concentrate; it would be necessary to extend it to a nonlinear model of third order or to choose another model. With regard to economy, in terms of minimal energy input consumption, the sample with an energy input of 524 kWh t-1 and the maximum value of specific surface area, 8.59 m2 g-1 (the response of the factorial experiment), was chosen as the optimum mechanically activated sample of the studied concentrate. The optimum mechanically activated sample of the complex CuPbZn sulphide concentrate was prepared.
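
    The 2^3 layout mentioned here is small enough to write out: each of the three attritor settings is run at a low and a high coded level, and main effects and two-factor interactions are estimated from just eight runs. A sketch with invented responses (specific surface areas), not the paper's measurements:

        import numpy as np
        from itertools import product

        # 2^3 factorial design in coded units (-1/+1) for the three settings:
        # mill filling, shaft speed, milling time (factor names from the record).
        X = np.array(list(product([-1, 1], repeat=3)), dtype=float)
        y = np.array([2.1, 3.0, 2.4, 3.8, 4.0, 6.2, 4.9, 8.6])  # invented, m2/g

        # Fit the linear 2^3 model with all two-factor interactions
        A = np.column_stack([np.ones(8), X,
                             X[:, 0]*X[:, 1], X[:, 0]*X[:, 2], X[:, 1]*X[:, 2]])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        labels = ["mean", "fill", "speed", "time",
                  "fill*speed", "fill*time", "speed*time"]
        for name, c in zip(labels, coef):
            print(f"{name:>11}: {c:+.3f}")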

  2. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  3. Use of a mixture statistical model in studying malaria vectors density.

    Directory of Open Access Journals (Sweden)

    Olayidé Boussari

    Full Text Available Vector control is a major step in the process of malaria control and elimination. This requires vector counts and appropriate statistical analyses of these counts. However, vector counts are often overdispersed. A non-parametric mixture of Poisson model (NPMP) is proposed to allow for overdispersion and better describe the vector distribution. Mosquito collections using Human Landing Catches, as well as collection of environmental and climatic data, were carried out from January to December 2009 in 28 villages in Southern Benin. A NPMP regression model with "village" as a random effect is used to test statistical correlations between malaria vector density and environmental and climatic factors. Furthermore, the villages were ranked using the latent classes derived from the NPMP model. Based on this classification of the villages, the impacts of four vector control strategies implemented in the villages were compared. Vector counts were highly variable and overdispersed with an important proportion of zeros (75%). The NPMP model predicted the observed values well and showed that: (i) proximity to a freshwater body, market gardening, and high levels of rain were associated with high vector density; (ii) water conveyance, cattle breeding, and a high vegetation index were associated with low vector density. The 28 villages could then be ranked according to the mean vector number as estimated by the random part of the model after adjustment for all covariates. The NPMP model made it possible to describe the distribution of the vector across the study area. The villages were ranked according to the mean vector density after taking into account the most important covariates. This study demonstrates the necessity and possibility of adapting methods of vector counting and sampling to each setting.
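
    The mixture-of-Poissons idea is easy to demonstrate in miniature. Below is a hedged sketch of an EM fit for a finite Poisson mixture, a simplified parametric stand-in for the non-parametric mixture used in the paper; the counts are simulated, not the Benin field data:

        import numpy as np
        from scipy.stats import poisson

        def poisson_mixture_em(y, k=3, iters=300):
            """EM for a k-component Poisson mixture."""
            lam = np.quantile(y[y > 0], np.linspace(0.1, 0.9, k))  # spread starts
            w = np.full(k, 1.0 / k)
            for _ in range(iters):
                # E-step: responsibility of each component for each count
                logp = poisson.logpmf(y[:, None], lam[None, :]) + np.log(w)
                logp -= logp.max(axis=1, keepdims=True)
                r = np.exp(logp)
                r /= r.sum(axis=1, keepdims=True)
                # M-step: update mixing weights and component means
                w = r.mean(axis=0)
                lam = np.maximum((r * y[:, None]).sum(axis=0) / r.sum(axis=0), 1e-9)
            return w, lam

        # Simulated overdispersed counts with many zeros (not the field data)
        rng = np.random.default_rng(1)
        y = np.concatenate([np.zeros(300, dtype=int),
                            rng.poisson(2.0, 60), rng.poisson(25.0, 40)])
        w, lam = poisson_mixture_em(y)
        print("weights:", np.round(w, 2), "| means:", np.round(lam, 2))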

  4. Designing the KNK II-TOAST irradiation experiment with the saturn-FS code

    International Nuclear Information System (INIS)

    Ritzhaupt-Kleissl, H.J.; Elbel, H.; Heck, M.

    1991-01-01

    In order to study the existing specification of FBR fuel with respect to allowable fabrication tolerances, with the objective of reducing the expense of fabrication and quality control, the TOAST irradiation experiment will be carried out in the 3rd core of the KNK II. This experiment shall investigate the influence of the following fuel specification parameters on operational behaviour: fuel diameter, stoichiometry, sintering atmosphere, and fill gas in the fuel pin. The combination of these test parameters led to the fabrication of six types of fuel pellets, giving, together with two fill gas mixtures, a total of nine fuel pin types. Design calculations in the frame of the standard licensing procedure have been performed with the SATURN-FS fuel pin behaviour code. These calculations have been done for the steady-state behaviour as well as for some defined design transients, such as startup procedures and overpower ramps.

  5. Optimizing an experimental design for an electromagnetic experiment

    Science.gov (United States)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the discussion on the acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and to monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of a given design via the objective function) and stochastic optimization methods like genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique for designing an experiment is that it does not require the acquisition of any data and can thus easily be conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well-distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of the CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.
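
    The general recipe (linearized sensitivities scored by an information criterion, searched stochastically) can be sketched compactly. The toy below is single-objective for brevity and uses a random matrix in place of a real CSEM Jacobian, so it illustrates the mechanics rather than the authors' multi-objective algorithm:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical linearized sensitivities (Jacobian) of 40 candidate
        # receiver positions to 5 reservoir parameters; in practice these come
        # from the forward model, not from random numbers.
        J = rng.normal(size=(40, 5))

        def d_criterion(idx):
            """Log-determinant of the Fisher information of a receiver subset."""
            Js = J[np.asarray(idx)]
            sign, logdet = np.linalg.slogdet(Js.T @ Js)
            return logdet if sign > 0 else -np.inf

        def ga_design(n_pick=8, pop=30, gens=80):
            """Toy genetic search over receiver subsets."""
            popn = [np.sort(rng.permutation(40)[:n_pick]) for _ in range(pop)]
            for _ in range(gens):
                popn.sort(key=d_criterion, reverse=True)
                elite = popn[: pop // 3]
                new = list(elite)
                while len(new) < pop:
                    p = elite[rng.integers(len(elite))]
                    q = elite[rng.integers(len(elite))]
                    genes = np.union1d(p, q)        # crossover: pool parents' picks
                    child = rng.choice(genes, size=n_pick, replace=False)
                    if rng.random() < 0.3:          # mutation: swap in a receiver
                        child[rng.integers(n_pick)] = rng.integers(40)
                    child = np.unique(child)
                    while child.size < n_pick:      # repair accidental duplicates
                        child = np.unique(np.append(child, rng.integers(40)))
                    new.append(np.sort(child))
                popn = new
            best = max(popn, key=d_criterion)
            return best, d_criterion(best)

        design, score = ga_design()
        print("selected receivers:", design, "| log|J'J| =", round(score, 2))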

  6. Combustion Mode Design with High Efficiency and Low Emissions Controlled by Mixtures Stratification and Fuel Reactivity

    Directory of Open Access Journals (Sweden)

    Hu eWang

    2015-08-01

    Full Text Available This paper presents a review of combustion mode designs with high efficiency and low emissions controlled by fuel reactivity and mixture stratification that have been conducted in the authors' group, including charge-reactivity-controlled homogeneous charge compression ignition (HCCI) combustion, stratification-controlled premixed charge compression ignition (PCCI) combustion, and dual-fuel combustion concepts controlled by both fuel reactivity and mixture stratification. The review starts with charge-reactivity-controlled HCCI combustion, and the work on HCCI fuelled with both high-cetane-number fuels, such as DME and n-heptane, and high-octane-number fuels, such as methanol, natural gas, gasoline and mixtures of gasoline/alcohols, is reviewed and discussed. Since a single fuel cannot meet the reactivity requirements under different loads to control the combustion process, studies related to concentration stratification and dual-fuel charge-reactivity-controlled HCCI combustion are then presented, which have been shown to have the potential to achieve effective combustion control. The efforts of using both mixture and thermal stratification to achieve auto-ignition and combustion control are also discussed. Thereafter, both charge reactivity and mixture stratification are applied to control the combustion process. The potential and capability of the thermal-atmosphere-controlled compound combustion mode and the dual-fuel reactivity controlled compression ignition (RCCI)/highly premixed charge combustion (HPCC) mode to achieve clean and high-efficiency combustion are then presented and discussed. Based on these results and discussions, a combustion mode design with high efficiency and low emissions controlled by fuel reactivity and mixture stratification over the whole operating range is proposed.

  7. Design of experiments (DoE) in pharmaceutical development.

    Science.gov (United States)

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-06-01

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product, by adopting Deming's profound knowledge approach, comprising system thinking, understanding of variation, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation. As such, DoE is presented in detail since it represents the first choice for rational pharmaceutical development.

  8. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of study designs used, statistical tests applied and the use of statistical analysis programmes help determine research activity profile and trends in the country. In this descriptive study, all original articles published by Journal of Pakistan Medical Association (JPMA) and Journal of the College of Physicians and Surgeons Pakistan (JCPSP), in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and use of statistical software programme SPSS to be the most common study design, inferential statistical analysis, and statistical analysis software programmes, respectively. These results echo previously published assessment of these two journals for the year 2014.

  9. Optimization of Soluble Expression and Purification of Recombinant Human Rhinovirus Type-14 3C Protease Using Statistically Designed Experiments: Isolation and Characterization of the Enzyme.

    Science.gov (United States)

    Antoniou, Georgia; Papakyriacou, Irineos; Papaneophytou, Christos

    2017-10-01

    Human rhinovirus (HRV) 3C protease is widely used in recombinant protein production for various applications such as biochemical characterization and structural biology projects to separate recombinant fusion proteins from their affinity tags in order to prevent interference between these tags and the target proteins. Herein, we report the optimization of expression and purification conditions of glutathione S-transferase (GST)-tagged HRV 3C protease by statistically designed experiments. Soluble expression of GST-HRV 3C protease was initially optimized by response surface methodology (RSM), and a 5.5-fold increase in enzyme yield was achieved. Subsequently, we developed a new incomplete factorial (IF) design that examines four variables (bacterial strain, expression temperature, induction time, and inducer concentration) in a single experiment. The new design called Incomplete Factorial-Strain/Temperature/Time/Inducer (IF-STTI) was validated using three GST-tagged proteins. In all cases, IF-STTI resulted in only 10% lower expression yields than those obtained by RSM. Purification of GST-HRV 3C was optimized by an IF design that examines simultaneously the effect of the amount of resin, incubation time of cell lysate with resin, and glycerol and DTT concentration in buffers, and a further 15% increase in protease recovery was achieved. Purified GST-HRV 3C protease was active at both 4 and 25 °C in a variety of buffers.
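
    Response surface methodology of the kind used here boils down to fitting a second-order polynomial to a small designed set of runs and then optimizing the fitted surface. A hedged sketch with invented data (coded temperature and inducer levels, fabricated yields; not the paper's design or measurements):

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical central-composite-style layout in two coded factors
        X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                      [0, 0], [0, 0], [1.41, 0], [-1.41, 0], [0, 1.41], [0, -1.41]])
        y = np.array([52, 61, 55, 58, 74, 72, 60, 50, 63, 57], float)

        def quad(X):
            """Second-order (quadratic) response-surface model matrix."""
            x1, x2 = np.atleast_2d(X).T
            return np.column_stack([np.ones(len(np.atleast_2d(X))), x1, x2,
                                    x1 * x2, x1**2, x2**2])

        beta, *_ = np.linalg.lstsq(quad(X), y, rcond=None)
        res = minimize(lambda x: -(quad(x) @ beta)[0], x0=[0, 0],
                       bounds=[(-1.41, 1.41)] * 2)
        print("optimal coded settings:", np.round(res.x, 2),
              "| predicted yield:", round(-res.fun, 1))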

  10. Statistically designed optimisation of enzyme catalysed starch removal from potato pulp

    DEFF Research Database (Denmark)

    Thomassen, Lise Vestergaard; Meyer, Anne S.

    2010-01-01

    to obtain dietary fibers is usually accomplished via a three-step, sequential enzymatic treatment procedure using a heat-stable alpha-amylase, protease, and amyloglucosidase. Statistically designed experiments were performed to investigate the influence of enzyme dose, amount of dry matter, incubation time, and temperature on the amount of starch released from the potato pulp. The data demonstrated that all the starch could be released from potato pulp in one step when 8% (w/w) dry potato pulp was treated with 0.2% (v/w) (enzyme/substrate (E/S)) of a thermostable Bacillus licheniformis alpha-amylase (Termamyl(R) SC

  11. Consistency of the MLE under mixture models

    OpenAIRE

    Chen, Jiahua

    2016-01-01

    The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...

  12. Optimization of poorly compactable drug tablets manufactured by direct compression using the mixture experimental design.

    Science.gov (United States)

    Martinello, Tiago; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Taqueda, Maria Elena Santos; Consiglieri, Vladi O

    2006-09-28

    The poor flowability and bad compressibility characteristics of paracetamol are well known. As a result, the production of paracetamol tablets is almost exclusively by wet granulation, a disadvantageous method when compared to direct compression. The development of a new tablet formulation is still based on a large number of experiments and often relies merely on the experience of the analyst. The purpose of this study was to apply experimental design methodology (DOE) to the development and optimization of tablet formulations containing high amounts of paracetamol (more than 70%) and manufactured by direct compression. Nineteen formulations, screened by DOE methodology, were produced with different proportions of Microcel 102, Kollydon VA 64, Flowlac, Kollydon CL 30, PEG 4000, Aerosil, and magnesium stearate. Tablet properties, except friability, were in accordance with the USP 28th ed. requirements. These results were used to generate plots for optimization, mainly for friability. The physico-chemical data found for the optimized formulation were very close to those from the regression analysis, demonstrating that the mixture design is a great tool for the research and development of new formulations.

  13. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    Science.gov (United States)

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  14. Use of demonstrations and experiments in teaching business statistics

    OpenAIRE

    Johnson, D. G.; John, J. A.

    2003-01-01

    The aim of a business statistics course should be to help students think statistically and to interpret and understand data, rather than to focus on mathematical detail and computation. To achieve this students must be thoroughly involved in the learning process, and encouraged to discover for themselves the meaning, importance and relevance of statistical concepts. In this paper we advocate the use of experiments and demonstrations as aids to achieving these goals. A number of demonstrations...

  15. Design of the forward straw tube tracker for the PANDA experiment

    Science.gov (United States)

    Smyrski, J.; Apostolou, A.; Biernat, J.; Czyżycki, W.; Filo, G.; Fioravanti, E.; Fiutowski, T.; Gianotti, P.; Idzik, M.; Korcyl, G.; Korcyl, K.; Lisowski, E.; Lisowski, F.; Płażek, J.; Przyborowski, D.; Przygoda, W.; Ritman, J.; Salabura, P.; Savrie, M.; Strzempek, P.; Swientek, K.; Wintz, P.; Wrońska, A.

    2017-06-01

    The design of the Forward Tracker for the Forward Spectrometer of the PANDA experiment is described. The tracker consists of 6 tracking stations, each comprising 4 planar double layers of straw tube detectors, and has a total material budget of only 2% X0. The straws are made self-supporting by a 1 bar over-pressure of the working gas mixture (Ar/CO2). This allows the use of lightweight and compact rectangular support frames for the double layers and the splitting of the frames into pairs of C-shaped half-frames for easier installation on the beam line.

  16. DETERMINING A ROBUST D-OPTIMAL DESIGN FOR TESTING FOR DEPARTURE FROM ADDITIVITY IN A MIXTURE OF FOUR PFAAS

    Science.gov (United States)

    Our objective was to determine an optimal experimental design for a mixture of perfluoroalkyl acids (PFAAs) that is robust to the assumption of additivity. Of particular focus to this research project is whether an environmentally relevant mixture of four PFAAs with long half-liv...
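
    Although the record is truncated, the stated question (a D-optimal design robust to the additivity assumption) lends itself to a compact illustration: score a candidate set of mixture-dose runs under both an additive main-effects model and a model with pairwise departures, and keep the design whose worse score is best. Everything below (dose grid, run size, random search) is invented for illustration:

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(0)
        candidates = np.array(list(product([0.0, 0.5, 1.0], repeat=4)))  # dose grid

        def logdet_information(X, model):
            """log|F'F| for an additive or interaction model matrix."""
            if model == "additive":
                F = np.column_stack([np.ones(len(X)), X])
            else:  # allow pairwise departures from additivity
                pairs = [X[:, i] * X[:, j]
                         for i in range(4) for j in range(i + 1, 4)]
                F = np.column_stack([np.ones(len(X)), X] + pairs)
            sign, ld = np.linalg.slogdet(F.T @ F)
            return ld if sign > 0 else -np.inf

        def robust_score(idx):
            """Maximin criterion: stay informative under both candidate models."""
            X = candidates[idx]
            return min(logdet_information(X, "additive"),
                       logdet_information(X, "interaction"))

        best, best_score = None, -np.inf
        for _ in range(5000):               # crude random search over 16-run designs
            idx = rng.choice(len(candidates), size=16, replace=False)
            if robust_score(idx) > best_score:
                best, best_score = idx, robust_score(idx)
        print("maximin log|F'F| =", round(best_score, 2))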

  17. Modeling of asphalt-rubber rotational viscosity by statistical analysis and neural networks

    Directory of Open Access Journals (Sweden)

    Luciano Pivoto Specht

    2007-03-01

    Full Text Available It is of great importance to know binder viscosity in order to perform the handling, mixing and application processes and the compaction of asphalt mixes in highway surfacing. This paper presents the results of viscosity measurements on asphalt-rubber binders prepared in the laboratory. The binders were prepared varying the rubber content, the rubber particle size, and the duration and temperature of mixing, all following a statistical design plan. Statistical analysis and artificial neural networks were used to create mathematical models for predicting the binders' viscosity. The comparison between the experimental data and the results simulated with the generated models showed a better performance of the neural network analysis in contrast to the statistical models. The results indicated that the rubber content and the duration of mixing have the major influence on the observed viscosity within the considered interval of parameter variation.

  18. Introductory statistics for engineering experimentation

    CERN Document Server

    Nelson, Peter R; Coffin, Marie

    2003-01-01

    The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy those needs in a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...

  19. Optimal Design and Related Areas in Optimization and Statistics

    CERN Document Server

    Pronzato, Luc

    2009-01-01

    This edited volume, dedicated to Henry P. Wynn, reflects his broad range of research interests, focusing in particular on the applications of optimal design theory in optimization and statistics. It covers algorithms for constructing optimal experimental designs, general gradient-type algorithms for convex optimization, majorization and stochastic ordering, algebraic statistics, Bayesian networks and nonlinear regression. Written by leading specialists in the field, each chapter contains a survey of the existing literature along with substantial new material. This work will appeal to both the

  20. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  1. A Mechanistic Design Approach for Graphite Nanoplatelet (GNP) Reinforced Asphalt Mixtures for Low-Temperature Applications

    Science.gov (United States)

    2018-01-01

    This report explores the application of a discrete computational model for predicting the fracture behavior of asphalt mixtures at low temperatures based on the results of simple laboratory experiments. In this discrete element model, coarse aggregat...

  2. Optimising mechanical strength and bulk density of dry ceramic bodies through mixture design

    Directory of Open Access Journals (Sweden)

    Correia, S. L.

    2005-02-01

    Full Text Available In industrial practice, it is desirable to be able to predict, in an expeditious way, what the effects of a change in raw materials or the proportions thereof might be on the various processing steps towards the final product. When the property of interest is basically determined by the combination (or mixture) of raw materials, an optimisation methodology specific to the design of mixture experiments can be successfully used. In the present study, dry bending strength and bulk density were selected as the properties to model, given the simplicity of their experimental determination and because they are frequently used as quality control parameters in the development and manufacturing stages of floor and wall ceramic tiles. Ten formulations of three raw materials (a clay mixture, potash feldspar and quartz sand) were processed in the laboratory under fixed conditions, similar to those used in the ceramics industry, and characterised. The use of this methodology enabled the calculation of valid regression models (equations) relating dry bending strength and bulk density to the contents, in the starting mixture, of the particular raw materials used.


  3. Formulation optimization of transdermal meloxicam potassium-loaded mesomorphic phases containing ethanol, oleic acid and mixture surfactant using the statistical experimental design methodology.

    Science.gov (United States)

    Huang, Chi-Te; Tsai, Chia-Hsun; Tsou, Hsin-Yeh; Huang, Yaw-Bin; Tsai, Yi-Hung; Wu, Pao-Chu

    2011-01-01

    Response surface methodology (RSM) was used to develop and optimize a mesomorphic-phase formulation for a meloxicam transdermal dosage form. A mixture design was applied to prepare formulations consisting of three independent variables: oleic acid (X(1)), distilled water (X(2)) and ethanol (X(3)). The flux and lag time (LT) were selected as dependent variables. The results showed that using mesomorphic phases as vehicles can significantly increase the flux and shorten the LT of the drug. The analysis of variance showed that the permeation parameters of meloxicam from the formulations were significantly influenced by the independent variables and their interactions. X(3) (ethanol) had the greatest potential influence on the flux and LT, followed by X(1) and X(2). A new formulation was prepared according to the independent-variable levels provided by RSM. The observed responses were in close agreement with the predicted values, demonstrating that RSM can be successfully used to optimize mesomorphic-phase formulations.

  4. Monitoring and optimizing the co-composting of dewatered sludge: a mixture experimental design approach.

    Science.gov (United States)

    Komilis, Dimitrios; Evangelou, Alexandros; Voudrias, Evangelos

    2011-09-01

    The management of dewatered wastewater sludge is a major issue worldwide. Sludge disposal to landfills is not sustainable, and thus alternative treatment techniques are being sought. The objective of this work was to determine optimal mixing ratios of dewatered sludge with other organic amendments in order to maximize the degradability of the mixtures during composting. This objective was achieved using mixture experimental design principles. An additional objective was to study the impact of the initial C/N ratio and moisture content on the co-composting process of dewatered sludge. The composting process was monitored through measurements of O(2) uptake rates, CO(2) evolution, temperature profile and solids reduction. Eight (8) runs were performed in 100 L insulated air-tight bioreactors under a dynamic air flow regime. The initial mixtures were prepared using dewatered wastewater sludge, mixed paper wastes, food wastes, tree branches and sawdust at various initial C/N ratios and moisture contents. According to empirical modeling, mixtures of sludge and food waste at a 1:1 ratio (w/w, wet weight) maximize degradability. Structural amendments should be kept below 30% to reach thermophilic temperatures. The initial C/N ratio and initial moisture content of the mixture were not found to influence the decomposition process. The bio-C/bio-N ratio started at around 10 for all runs, decreased during the middle of the process and increased up to 20 at the end of the process. The solid carbon reduction of the mixtures without the branches ranged from 28% to 62%, whilst solid N reductions ranged from 30% to 63%. Respiratory quotients had a decreasing trend throughout the composting process. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Fuel rod design by statistical methods for MOX fuel

    International Nuclear Information System (INIS)

    Heins, L.; Landskron, H.

    2000-01-01

    Statistical methods in fuel rod design have received more and more attention during recent years. One possible way of using statistical methods in fuel rod design can be described as follows: Monte Carlo calculations are performed using the fuel rod code CARO. For each run with CARO, the set of input data is modified: parameters describing the design of the fuel rod (geometrical data, density, etc.) and modeling parameters are randomly selected according to their individual distributions. Power histories are varied systematically in such a way that each power history of the relevant core management calculation is represented in the Monte Carlo calculations with equal frequency. The frequency distributions of results such as rod internal pressure and cladding strain, which are generated by the Monte Carlo calculation, are evaluated and compared with the design criteria. Up to now, this methodology has been applied to licensing calculations for PWRs and BWRs, and for UO2 and MOX fuel, in three countries. Especially for the insertion of MOX fuel, resulting in power histories with relatively high linear heat generation rates at higher burnup, the statistical methodology is an appropriate approach to demonstrating compliance with licensing requirements. (author)
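
    The Monte Carlo bookkeeping described here is straightforward to sketch. The toy below replaces the CARO code with a made-up closed-form response and invents all distributions and limits, so it only illustrates the sampling-and-comparison loop:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10_000                                    # Monte Carlo runs

        # Made-up stand-in for the fuel-rod code: rod internal pressure as a
        # simple function of sampled inputs (the real code is a full simulator).
        def rod_pressure(gap_um, density_pct, power_kw_m):
            return 2.0 + 0.08 * power_kw_m + 0.5 * (96.0 - density_pct) + 0.01 * gap_um

        # Sample fabrication and modelling parameters from tolerance distributions
        gap = rng.normal(170.0, 10.0, N)              # pellet-cladding gap, um
        rho = rng.uniform(94.5, 96.5, N)              # fuel density, % TD
        pw = rng.choice([25.0, 30.0, 35.0], N)        # representative power histories

        p = rod_pressure(gap, rho, pw)
        limit = 6.5                                   # design criterion (illustrative)
        print(f"P(pressure > {limit}) = {(p > limit).mean():.4f}")
        print(f"95th percentile: {np.percentile(p, 95):.2f}")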

  6. Evaluation of factors that affect rutting resistance of asphalt mixes by orthogonal experiment design

    Directory of Open Access Journals (Sweden)

    Guilian Zou

    2017-05-01

    Full Text Available Rutting has been one of the major distresses observed on asphalt pavements in China, due to increasing traffic volume, heavy axle loads, continuous hot weather, etc., especially on long, steep slopes and at bus stops. Many factors affect the rutting resistance of asphalt pavement, including material properties, climatic conditions, traffic volume, speed, axle types, and construction quality. The orthogonal experimental design method was used in this study to reduce the number of tests required without compromising the validity of the test results. The testing variables and their levels were selected according to investigations and field test results. The effects of various factors on asphalt pavement rutting performance were evaluated, including the asphalt binder, mixture type (aggregate gradation), axle load, vehicle speed and temperature. In this study, the wheel tracking test was used to evaluate the rutting performance, as represented by the parameter dynamic stability (DS), of the various asphalt mixes. Test results were analyzed using range analysis and analysis of variance (ANOVA). All four factors evaluated in this study had significant effects on pavement rutting performance. The ranking of significance was asphalt mixture type, temperature, loading frequency, and tire-pavement contact pressure. Asphalt mixture type was the most important factor affecting rutting resistance. Within the asphalt mixtures, the asphalt binder had a more significant effect on rutting performance than the aggregate gradation. The rutting resistance of SBS-modified asphalt mixes was significantly better than that of neat asphalt mixes, and skeleton-dense structure mixes were better than suspended-dense structure mixes. Keywords: Asphalt mixes, Rutting resistance, Effect factor, Orthogonal experiment design
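
    An orthogonal design of the kind used here can be shown in full: the standard L9(3^4) array accommodates four three-level factors in only nine runs, and range analysis ranks the factors by the spread of their level means. A sketch with invented dynamic-stability values, not the paper's data:

        import numpy as np

        # Standard L9(3^4) orthogonal array: 9 runs, 4 three-level factors
        L9 = np.array([
            [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
            [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
            [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
        ])
        factors = ["mixture type", "temperature", "load frequency", "contact pressure"]
        DS = np.array([5200, 3100, 1800, 4100, 2500, 6800, 2100, 5600, 3900], float)

        # Range analysis: mean response at each level of each factor; the range
        # of those means ranks how strongly each factor matters.
        for j, name in enumerate(factors):
            means = [DS[L9[:, j] == lvl].mean() for lvl in range(3)]
            print(f"{name:>16}: level means {np.round(means, 0)}, "
                  f"range {max(means) - min(means):.0f}")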

  7. Scalable Algorithms for Adaptive Statistical Designs

    Directory of Open Access Journals (Sweden)

    Robert Oehmke

    2000-01-01

    Full Text Available We present a scalable, high-performance solution to multidimensional recurrences that arise in adaptive statistical designs. Adaptive designs are an important class of learning algorithms for a stochastic environment, and we focus on the problem of optimally assigning patients to treatments in clinical trials. While adaptive designs have significant ethical and cost advantages, they are rarely utilized because of the complexity of optimizing and analyzing them. Computational challenges include massive memory requirements, few calculations per memory access, and multiply-nested loops with dynamic indices. We analyze the effects of various parallelization options, and while standard approaches do not work well, with effort an efficient, highly scalable program can be developed. This allows us to solve problems thousands of times more complex than those solved previously, which helps make adaptive designs practical. Further, our work applies to many other problems involving neighbor recurrences, such as generalized string matching.
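
    The "multiply-nested recurrences" at issue can be seen in miniature in the classic two-armed Bernoulli bandit: the optimal allocation of each successive patient satisfies a backward recursion over the four success/failure counts. A tiny serial version with uniform priors (the paper's contribution is making such recursions scale to vastly larger horizons on parallel machines):

        from functools import lru_cache

        N = 20  # horizon: total number of patients to allocate

        @lru_cache(maxsize=None)
        def V(s1, f1, s2, f2):
            """Maximum expected number of future successes given the observed
            (success, failure) counts on each arm, with uniform Beta(1,1)
            priors on both success probabilities."""
            if s1 + f1 + s2 + f2 == N:
                return 0.0
            p1 = (s1 + 1) / (s1 + f1 + 2)   # posterior mean of arm 1
            p2 = (s2 + 1) / (s2 + f2 + 2)   # posterior mean of arm 2
            arm1 = p1 * (1 + V(s1 + 1, f1, s2, f2)) + (1 - p1) * V(s1, f1 + 1, s2, f2)
            arm2 = p2 * (1 + V(s1, f1, s2 + 1, f2)) + (1 - p2) * V(s1, f1, s2, f2 + 1)
            return max(arm1, arm2)

        print(f"optimal adaptive design: {V(0, 0, 0, 0):.3f} expected successes")
        print(f"50/50 randomization:     {N * 0.5:.1f} expected successes")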

  8. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  9. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  10. Determination of tolerance dose uncertainties and optimal design of dose response experiments with small animal numbers

    International Nuclear Information System (INIS)

    Karger, C.P.; Hartmann, G.H.

    2001-01-01

    Background: Dose-response experiments aim to determine the complication probability as a function of dose. Adjusting the parameters of the frequently used dose-response model P(D) = 1/[1 + (D50/D)^k] to the experimental data yields two intuitive quantities: the tolerance dose D50 and the slope parameter k. For mathematical reasons, however, standard statistics software uses a different set of parameters. Therefore, the resulting fit parameters of the statistics software, as well as their standard errors, have to be transformed to obtain D50 and k and their standard errors. Material and Methods: The influence of the number of dose levels on the uncertainty of the fit parameters is studied by a simulation for a fixed number of animals. For experiments with small animal numbers, statistical artifacts may prevent the determination of the standard errors of the fit parameters. Consequences for the design of dose-response experiments are investigated. Results: Explicit formulas are presented which allow the parameters D50 and k, as well as their standard errors, to be calculated from the output of standard statistics software. The simulation shows that the standard errors of the resulting parameters are independent of the number of dose levels, as long as the total number of animals involved in the experiment remains constant. Conclusion: Statistical artifacts in experiments with small animal numbers may be prevented by an adequate design of the experiment. For this, it is suggested to select a higher number of dose levels rather than a higher number of animals per dose level. (orig.)
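
    The transformation alluded to is short enough to state: fitting a standard logistic model in x = ln(D), P = 1/(1 + exp(-(a + b*x))), is equivalent to the model above with k = b and D50 = exp(-a/b), and the delta method carries the covariance of (a, b) over to standard errors for (D50, k). A sketch with invented fit output, not the paper's explicit formulas:

        import numpy as np

        def to_d50_k(a, b, cov):
            """Transform logistic-fit parameters (intercept a, slope b on ln D)
            to (D50, k) with delta-method standard errors; cov is the 2x2
            covariance matrix of (a, b)."""
            d50 = np.exp(-a / b)
            k = b
            # Gradient of D50 w.r.t. (a, b): dD50/da = -D50/b, dD50/db = D50*a/b^2
            g = np.array([-d50 / b, d50 * a / b**2])
            se_d50 = np.sqrt(g @ cov @ g)
            se_k = np.sqrt(cov[1, 1])
            return d50, se_d50, k, se_k

        # Illustrative output of a statistics package (not real experimental values)
        a, b = -35.0, 9.0
        cov = np.array([[16.0, -4.0], [-4.0, 1.1]])
        d50, se_d50, k, se_k = to_d50_k(a, b, cov)
        print(f"D50 = {d50:.1f} +/- {se_d50:.1f},  k = {k:.1f} +/- {se_k:.1f}")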

  11. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students’ Statistical Reasoning and Quantitative Literacy Skills †

    Science.gov (United States)

    Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549

  12. Mixtures of endocrine disrupting contaminants modelled on human high end exposures

    DEFF Research Database (Denmark)

    Christiansen, Sofie; Kortenkamp, A.; Petersen, Marta Axelstad

    2012-01-01

    … exceeding 1 is expected to lead to effects in the rat, a total dose more than 62 times higher than human exposures should lead to responses. Considering the high uncertainty of this estimate, experience on lowest-observed-adverse-effect-level (LOAEL)/NOAEL ratios and statistical power of rat studies, we … expected that combined doses 150 times higher than high end human intake estimates should give no, or only borderline, effects, whereas doses 450 times higher should produce significant responses. Experiments indeed showed clear developmental toxicity of the 450-fold dose in terms of increased nipple … though each individual chemical is present at low, ineffective doses, but the effects of mixtures modelled based on human intakes have not previously been investigated. To address this issue for the first time, we selected 13 chemicals for a developmental mixture toxicity study in rats where data about …

  13. Design and initial characterization of the SC-200 proteomics standard mixture.

    Science.gov (United States)

    Bauman, Andrew; Higdon, Roger; Rapson, Sean; Loiue, Brenton; Hogan, Jason; Stacy, Robin; Napuli, Alberto; Guo, Wenjin; van Voorhis, Wesley; Roach, Jared; Lu, Vincent; Landorf, Elizabeth; Stewart, Elizabeth; Kolker, Natali; Collart, Frank; Myler, Peter; van Belle, Gerald; Kolker, Eugene

    2011-01-01

    High-throughput (HTP) proteomics studies generate large amounts of data. Interpretation of these data requires effective approaches to distinguish noise from biological signal, particularly as instrument and computational capacity increase and studies become more complex. Resolving this issue requires validated and reproducible methods and models, which in turn requires complex experimental and computational standards. The absence of appropriate standards and data sets for validating experimental and computational workflows hinders the development of HTP proteomics methods. Most protein standards are simple mixtures of proteins or peptides, or undercharacterized reference standards in which the identity and concentration of the constituent proteins is unknown. The Seattle Children's 200 (SC-200) proposed proteomics standard mixture is the next step toward developing realistic, fully characterized HTP proteomics standards. The SC-200 exhibits a unique modular design to extend its functionality, and consists of 200 proteins of known identities and molar concentrations from 6 microbial genomes, distributed into 10 molar concentration tiers spanning a 1,000-fold range. We describe the SC-200's design, potential uses, and initial characterization. We identified 84% of SC-200 proteins with an LTQ-Orbitrap and 65% with an LTQ-Velos (false discovery rate = 1% for both). There were obvious trends in success rate, sequence coverage, and spectral counts with protein concentration; however, protein identification, sequence coverage, and spectral counts vary greatly within concentration levels.

  14. Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing

    Science.gov (United States)

    Murphy, Patrick C.; Landman, Drew

    2015-01-01

    Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run-efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.

  15. Study on the thermal ignition of gasoline-air mixture in underground oil depots based on experiment and numerical simulation

    Science.gov (United States)

    Ou, Yihong; Du, Yang; Jiang, Xingsheng; Wang, Dong; Liang, Jianjun

    2010-04-01

    The study of the special phenomena, occurrence process and control mechanism of gasoline-air mixture thermal ignition in underground oil depots is of important academic and applied value for enriching the scientific theory of explosion safety, developing protective technology against fire and decreasing the number of fire accidents. In this paper, the thermal ignition process of a gasoline-air mixture in a model underground oil depot tunnel has been investigated using both experiments and numerical simulation. The calculated results have been validated against the experimental data. The five stages of the thermal ignition course, namely slow oxidation, rapid oxidation, fire, flameout and quench, have been defined and accurately described for the first time. According to the order of magnitude of their concentrations, the species have been divided into six categories, which lays the foundation for explosion-proof design based on the role of the different species. The influence of space scale on thermal ignition in small-scale spaces has been identified; the mechanism that makes ignition more difficult is that wall reflection causes the fluid to recirculate and changes the distribution of heat and mass, so that the progress of the chemical reactions in the whole space is also changed. The novel mathematical model established in this paper, unifying chemical kinetics and thermodynamics, provides a supplementary means for analysing the process and mechanism of thermal ignition.
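
    As a toy illustration of the slow-oxidation-to-ignition transition described above (not the paper's unified kinetic-thermodynamic model), a zero-dimensional Semenov-type energy balance can be integrated directly; all parameter values below are placeholders.

```python
# Minimal sketch: Semenov-type thermal ignition, balancing Arrhenius heat
# release against wall heat loss. Illustrative parameters only.
import numpy as np

R = 8.314        # gas constant, J/(mol K)
E = 1.2e5        # activation energy, J/mol (placeholder)
A = 1.0e11       # pre-exponential factor, 1/s (placeholder)
Q = 2.0e6        # volumetric heat release, J/m^3 (placeholder)
h_loss = 80.0    # wall heat-loss coefficient, W/(m^3 K) (placeholder)
T_wall = 450.0   # wall temperature, K
C_v = 900.0      # volumetric heat capacity, J/(m^3 K) (placeholder)

def simulate(T0, dt=1e-3, t_end=30.0):
    """Integrate dT/dt = [Q*A*exp(-E/RT) - h*(T - T_wall)] / C_v."""
    T = T0
    for i in range(1, int(t_end / dt) + 1):
        heat_gen = Q * A * np.exp(-E / (R * T))
        heat_loss = h_loss * (T - T_wall)
        T += dt * (heat_gen - heat_loss) / C_v
        if T > 1500.0:              # crude ignition criterion
            return i * dt, True
    return t_end, False

t_ign, ignited = simulate(T0=450.0)
print("ignited" if ignited else "no ignition", f"at t = {t_ign:.2f} s")
```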

  16. Modular design of metabolic network for robust production of n-butanol from galactose-glucose mixtures.

    Science.gov (United States)

    Lim, Hyun Gyu; Lim, Jae Hyung; Jung, Gyoo Yeol

    2015-01-01

    Refactoring microorganisms for efficient production of advanced biofuels such as n-butanol from a mixture of sugars in cheap feedstock is a prerequisite to achieving economic feasibility in biorefinery. However, production of biofuel from inedible and cheap feedstock is highly challenging due to the slower utilization of biomass-derived sugars arising from complex assimilation pathways, difficulties in amplifying biosynthetic pathways for heterologous metabolites, and redox imbalance caused by consuming intracellular reducing power to produce highly reduced biofuel. Even with these problems, the microorganisms should show robust production of biofuel to achieve industrial feasibility. Thus, refactoring microorganisms for efficient conversion is highly desirable in biofuel production. In this study, we engineered robust Escherichia coli to accomplish high production of n-butanol from galactose-glucose mixtures via the design of modular pathways, an efficient and systematic way to reconstruct an entire metabolic pathway with many target genes. Three modular pathways designed using predictable genetic elements were assembled for efficient galactose utilization, n-butanol production, and redox re-balancing to robustly produce n-butanol from a sugar mixture of galactose and glucose. Specifically, the engineered strain showed dramatically increased n-butanol production (3.3-fold increased to 6.2 g/L after 48-h fermentation) compared to the parental strain (1.9 g/L) in galactose-supplemented medium. Moreover, fermentation with mixtures of galactose and glucose at various ratios from 2:1 to 1:2 confirmed that our engineered strain was able to robustly produce n-butanol regardless of sugar composition, with simultaneous utilization of galactose and glucose. Collectively, modular pathway engineering of the metabolic network can be an effective approach in strain development for optimal biofuel production with cost-effective fermentable sugars. To the best of our…

  17. Optimization of natural lipstick formulation based on pitaya (Hylocereus polyrhizus) seed oil using D-optimal mixture experimental design.

    Science.gov (United States)

    Kamairudin, Norsuhaili; Gani, Siti Salwa Abd; Masoumi, Hamid Reza Fard; Hashim, Puziah

    2014-10-16

    The D-optimal mixture experimental design was employed to optimize the melting point of natural lipstick based on pitaya (Hylocereus polyrhizus) seed oil. The influence of the main lipstick components-pitaya seed oil (10%-25% w/w), virgin coconut oil (25%-45% w/w), beeswax (5%-25% w/w), candelilla wax (1%-5% w/w) and carnauba wax (1%-5% w/w)-were investigated with respect to the melting point properties of the lipstick formulation. The D-optimal mixture experimental design was applied to optimize the properties of lipstick by focusing on the melting point with respect to the above influencing components. The D-optimal mixture design analysis showed that the variation in the response (melting point) could be depicted as a quadratic function of the main components of the lipstick. The best combination of each significant factor determined by the D-optimal mixture design was established to be pitaya seed oil (25% w/w), virgin coconut oil (37% w/w), beeswax (17% w/w), candelilla wax (2% w/w) and carnauba wax (2% w/w). With respect to these factors, the 46.0 °C melting point property was observed experimentally, similar to the theoretical prediction of 46.5 °C. Carnauba wax is the most influential factor on this response (melting point) with its function being with respect to heat endurance. The quadratic polynomial model sufficiently fit the experimental data.

  18. Optimization of Natural Lipstick Formulation Based on Pitaya (Hylocereus polyrhizus Seed Oil Using D-Optimal Mixture Experimental Design

    Directory of Open Access Journals (Sweden)

    Norsuhaili Kamairudin

    2014-10-01

    Full Text Available The D-optimal mixture experimental design was employed to optimize the melting point of natural lipstick based on pitaya (Hylocereus polyrhizus) seed oil. The influence of the main lipstick components—pitaya seed oil (10%–25% w/w), virgin coconut oil (25%–45% w/w), beeswax (5%–25% w/w), candelilla wax (1%–5% w/w) and carnauba wax (1%–5% w/w)—were investigated with respect to the melting point properties of the lipstick formulation. The D-optimal mixture experimental design was applied to optimize the properties of lipstick by focusing on the melting point with respect to the above influencing components. The D-optimal mixture design analysis showed that the variation in the response (melting point) could be depicted as a quadratic function of the main components of the lipstick. The best combination of each significant factor determined by the D-optimal mixture design was established to be pitaya seed oil (25% w/w), virgin coconut oil (37% w/w), beeswax (17% w/w), candelilla wax (2% w/w) and carnauba wax (2% w/w). With respect to these factors, the 46.0 °C melting point property was observed experimentally, similar to the theoretical prediction of 46.5 °C. Carnauba wax is the most influential factor on this response (melting point) with its function being with respect to heat endurance. The quadratic polynomial model sufficiently fit the experimental data.
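
    The model form behind such D-optimal mixture designs is the Scheffé quadratic polynomial. The sketch below (with illustrative blends and responses, not the study's data) shows how its terms are built and fitted by least squares.

```python
# Sketch: fitting a Scheffé quadratic mixture polynomial,
# y = sum_i b_i x_i + sum_{i<j} b_ij x_i x_j  (no intercept; sum x_i = 1).
# Blends and responses below are illustrative placeholders.
import numpy as np
from itertools import combinations

X = np.array([  # proportions of 3 components; each row sums to 1
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
    [1/3, 1/3, 1/3],
])
y = np.array([46.0, 52.0, 61.0, 47.5, 55.0, 58.5, 51.0])  # e.g. melting point

def scheffe_quadratic(X):
    """Expand mixture proportions into Scheffé quadratic model terms."""
    pairs = [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack([X] + pairs)

M = scheffe_quadratic(X)
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print("blend coefficients:", coef[:3])
print("binary interaction coefficients:", coef[3:])

# Predict the response for a candidate blend
blend = np.array([[0.25, 0.37, 0.38]])
print("predicted response:", scheffe_quadratic(blend) @ coef)
```

    In an actual D-optimal design, candidate blends like these are screened by an exchange algorithm to maximize the determinant of the information matrix before any experiments are run.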

  19. Experimental study of hydrocarbon mixtures to replace HFC-134a in a domestic refrigerator

    International Nuclear Information System (INIS)

    Wongwises, Somchai; Chimres, Nares

    2005-01-01

    This work presents an experimental study on the application of hydrocarbon mixtures to replace HFC-134a in a domestic refrigerator. The hydrocarbons investigated are propane (R290), butane (R600) and isobutane (R600a). A refrigerator designed to work with HFC-134a with a gross capacity of 239 l is used in the experiment. The consumed energy, compressor power and refrigerant temperature and pressure at the inlet and outlet of the compressor are recorded and analysed, as well as the distributions of temperature at various positions in the refrigerator. The refrigerant mixtures used are divided into three groups: the mixture of three hydrocarbons, the mixture of two hydrocarbons and the mixture of two hydrocarbons and HFC-134a. The experiments are conducted with the refrigerants under the same no-load condition at a surrounding temperature of 25 °C. The results show that propane/butane 60%/40% is the most appropriate alternative refrigerant to HFC-134a.

  20. An investigation of the opacity of high-Z mixture and implications for inertial confinement fusion hohlraum design

    International Nuclear Information System (INIS)

    Wang, P.; MacFarlane, J.J.; Orzechowski, T.J.

    1997-01-01

    We use an unresolved transition array model to investigate the opacities of high-Z materials and their mixtures which are of interest to indirect-drive inertial confinement fusion hohlraum design. In particular, we report on calculated opacities for pure Au, Gd, and Sm, as well as Au–Sm and Au–Gd mixtures. Our results indicate that mixtures of Au–Gd and Au–Sm can produce a significant enhancement in the Rosseland mean opacity. Radiation hydrodynamics simulations of Au radiation burnthrough are also presented, and compared with NOVA experimental data. copyright 1997 American Institute of Physics
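
    For reference, the standard textbook definition of the Rosseland mean opacity (not specific to this paper) shows why mixing helps: the harmonic weighting lets spectral windows where one element is transparent dominate, so a second element whose lines fill those windows raises the mean.

```latex
% Rosseland mean opacity: harmonic average of the spectral opacity
% \kappa_\nu, weighted by the temperature derivative of the Planck
% function B_\nu(T). Transparent "windows" in \kappa_\nu dominate 1/\kappa_R.
\[
  \frac{1}{\kappa_R}
  = \frac{\displaystyle\int_0^\infty \frac{1}{\kappa_\nu}\,
          \frac{\partial B_\nu(T)}{\partial T}\, d\nu}
         {\displaystyle\int_0^\infty \frac{\partial B_\nu(T)}{\partial T}\, d\nu}
\]
```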

  1. Assessment of competition and yield advantage in addition series of barley variety mixtures

    Directory of Open Access Journals (Sweden)

    Kari Jokinen

    1991-09-01

    Full Text Available In an addition series experiment, the competition between three barley varieties (Agneta, Arra and Porno) and the yield performance of their mixtures were evaluated. Two levels of nitrogen fertilization (50 and 100 kg N/ha) were also applied. Two approaches (the replacement series and the linear regression equation) were used to analyse the competitive relationship based on grain yields in two-component mixtures. In three-component mixtures the replacement series approach was applied. Both methods showed a similar dominance order of the varieties, with Arra always being dominant and Agneta subordinate. The relationship between varieties was independent of the number of varieties in the mixture. An increase in available nitrogen strengthened the competitiveness of Arra, especially in the dense, two-variety mixtures. Some mixtures overyielded, but the differences were not statistically significant. The yield advantage based on relative yield total or on the ratio of actual and expected yield was greatest when the density and nitrogen fertilization were low, and especially when one component in the mixture was a rather low yielding variety (Agneta). The land equivalent ratios (LER; the reference pure culture yield was the maximum yield of each variety) were close to one, suggesting that under optimal growing conditions the yield advantage of barley varietal mixtures is marginal.

  2. The Reliability of Single Subject Statistics for Biofeedback Studies.

    Science.gov (United States)

    Bremner, Frederick J.; And Others

    To test the usefulness of single subject statistical designs for biofeedback, three experiments were conducted comparing biofeedback to meditation, and to a compound stimulus recognition task. In a statistical sense, this experimental design is best described as one experiment with two replications. The apparatus for each of the three experiments…

  3. Deriving guidelines for the design of plate evaporators in heat pumps using zeotropic mixtures

    DEFF Research Database (Denmark)

    Mancini, Roberta; Zühlsdorf, Benjamin; Jensen, Jonas Kjær

    2018-01-01

    This paper presents a derivation of design guidelines for plate heat exchangers used for evaporation of zeotropic mixtures in heat pumps. A mapping of combined heat exchanger and cycle calculations for different combinations of geometrical parameters and working fluids allowed estimating the trade… It was found that the pressure drop limit leading to infeasible designs was dependent on the working fluid, thereby making it impossible to define a guideline based on maximum allowable pressure drops. It was found that economically feasible designs could be obtained by correlating the vapour Reynolds number…

  4. Preliminary results of Resistive Plate Chambers operated with eco-friendly gas mixtures for application in the CMS experiment

    International Nuclear Information System (INIS)

    Abbrescia, M.; Muhammad, S.; Saviano, G.; Auwegem, P. Van; Cauwenbergh, S.; Tytgat, M.; Benussi, L.; Bianco, S.; Passamonti, L.; Pierluigi, D.; Piccolo, D.; Primavera, F.; Russo, A.; Ferrini, M.

    2016-01-01

    The operation of Resistive Plate Chambers in LHC experiments requires Fluorine-based (F-based) gases for optimal performance. Recent European regulations demand that the use of these environmentally unfriendly F-based gases be limited or banned. In view of the CMS experiment upgrade, several tests are ongoing to measure the performance of the detector with new ecological gas mixtures, in terms of efficiency, streamer probability, induced charge and time resolution. Prototype chambers with readout pads and with the standard CMS electronic setup are under test. In this paper, preliminary results on the performance of RPCs operated with a potential eco-friendly gas candidate, 1,3,3,3-Tetrafluoropropene (commercially known as HFO-1234ze), with CO2 and CF3I based gas mixtures are presented and discussed for possible application in the CMS experiment.

  5. Preliminary results of Resistive Plate Chambers operated with eco-friendly gas mixtures for application in the CMS experiment

    CERN Document Server

    Abbrescia, M.

    2016-01-01

    The operation of Resistive Plate Chambers in LHC experiments requires Fluorine-based (F-based) gases for optimal performance. Recent European regulations demand that the use of these environmentally unfriendly F-based gases be limited or banned. In view of the CMS experiment upgrade, several tests are ongoing to measure the performance of the detector with new ecological gas mixtures, in terms of efficiency, streamer probability, induced charge and time resolution. Prototype chambers with readout pads and with the standard CMS electronic setup are under test. In this paper, preliminary results on the performance of RPCs operated with a potential eco-friendly gas candidate, 1,3,3,3-Tetrafluoropropene (commercially known as HFO-1234ze), with CO2 and CF3I based gas mixtures are presented and discussed for possible application in the CMS experiment.

  6. Cost Optimal System Identification Experiment Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    A structural system identification experiment design method is formulated in the light of decision theory, structural reliability theory and optimization theory. The experiment design is based on a preposterior analysis, well-known from the classical decision theory; i.e. the decisions concerning the experiment design are not based on obtained experimental data. Instead the decisions are based on the expected experimental data assumed to be obtained from the measurements, estimated based on prior information and engineering judgement. The design method provides a system identification experiment design reflecting the cost of the experiment and the value of obtained additional information. An example concerning design of an experiment for parametric identification of a single degree of freedom structural system shows the applicability of the experiment design method.

  7. Characterization of Magnetic Field Immersed Photomultipliers from Double Chooz Experiment. Design and Construction of their Magnetic Shields

    International Nuclear Information System (INIS)

    Valdivia Valero, F. J.

    2007-01-01

    Flavour oscillations of neutrinos are a widely demonstrated quantum-mechanical effect. They are explained through interference of the neutrino mass eigenstates and therefore belong to physics beyond the Standard Model. This work is part of the CIEMAT collaboration's contribution to the Double Chooz neutrino experiment. The experiment aims to measure the mixing angle θ13 of the PMNS leptonic mixing matrix with unprecedented sensitivity, achieved by reducing systematic errors. For this, two identical scintillator detectors, equipped with PMTs, will be sited at different distances from the two reactors of the CHOOZ B nuclear power plant (France). The electron antineutrino fluxes from these reactors will be compared, with any deficit explained by flavour oscillations of these particles. The identity of the two detectors would be degraded by magnetic field effects on the PMT response. This study therefore serves both to quantify such effects and to refine the design of the magnetic shields that minimize them. Shielding measurements, the final design of the magnetic shields, and the effects of a monitored magnetic field on the response of immersed PMTs are presented. (Author) 85 refs.

  8. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs.

  9. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)

  10. Radiation counting statistics

    International Nuclear Information System (INIS)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs

  11. Mixtures Estimation and Applications

    CERN Document Server

    Mengersen, Kerrie; Titterington, Mike

    2011-01-01

    This book uses the EM (expectation maximization) algorithm to simultaneously estimate the missing data and unknown parameter(s) associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions along with MCMC computational methods, together with a range of detailed discussions covering the applications of the methods and features chapters from the leading experts on the subject
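
    A minimal sketch of the EM iteration the book centres on, here for a two-component Gaussian mixture with synthetic data (illustrative, not from the book):

```python
# Sketch: EM for a two-component Gaussian mixture. E-step computes each
# component's responsibility for each point; M-step re-estimates the
# weights, means, and standard deviations from responsibility-weighted data.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1.5, 200)])

# Initial guesses for weights, means, standard deviations
w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: posterior probability that each point came from each component
    dens = np.stack([w[k] * normal_pdf(x, mu[k], sd[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # M-step: closed-form updates from the weighted data
    n_k = resp.sum(axis=1)
    w = n_k / len(x)
    mu = (resp * x).sum(axis=1) / n_k
    sd = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / n_k)

print("weights:", w.round(3), "means:", mu.round(2), "sds:", sd.round(2))
```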

  12. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  13. Influence of special attributes of zeotropic refrigerant mixtures on design and operation of vapour compression refrigeration and heat pump systems

    International Nuclear Information System (INIS)

    Rajapaksha, Leelananda

    2007-01-01

    The use of zeotropic refrigerant mixtures introduces a number of novel issues that affect the established design and operational practices of vapour compression systems used in refrigeration, air conditioning and heat pump applications. Two attributes associated with the phase-change process of zeotropic mixtures, composition shift and temperature glide, are the primary phenomena behind these issues. However, ongoing research is uncovering how careful system design and selection of operational parameters can improve the energy efficiency and the capacity of vapour compression refrigeration systems. Most of these concepts exploit the presence of composition shift and temperature glide. This paper qualitatively discusses how the mixture attributes influence established heat exchanger design practices and the performance and operation of conventional vapour compression systems. How temperature glide and composition shift can be exploited to improve system performance and efficiency is also discussed.
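
    To make the temperature-glide attribute concrete, the sketch below estimates the glide of a zeotropic blend as the dew-point minus bubble-point temperature at fixed pressure. It assumes the CoolProp property library is installed and uses its mixture-string interface; the propane/isobutane blend and its composition are illustrative.

```python
# Sketch: temperature glide of a zeotropic mixture = dew-point minus
# bubble-point temperature at constant pressure. Assumes CoolProp is
# installed; the 60/40 propane/isobutane blend is an illustrative choice.
from CoolProp.CoolProp import PropsSI

mixture = "HEOS::Propane[0.6]&IsoButane[0.4]"   # mole fractions
for p_bar in (2.0, 4.0, 8.0):
    p = p_bar * 1e5                             # Pa
    t_bubble = PropsSI("T", "P", p, "Q", 0, mixture)  # saturated liquid
    t_dew = PropsSI("T", "P", p, "Q", 1, mixture)     # saturated vapour
    print(f"P = {p_bar:.0f} bar: glide = {t_dew - t_bubble:.2f} K")
```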

  14. Biocompatible Nanoemulsions for Improved Aceclofenac Skin Delivery: Formulation Approach Using Combined Mixture-Process Experimental Design.

    Science.gov (United States)

    Isailović, Tanja; Ðorđević, Sanela; Marković, Bojan; Ranđelović, Danijela; Cekić, Nebojša; Lukić, Milica; Pantelić, Ivana; Daniels, Rolf; Savić, Snežana

    2016-01-01

    We aimed to develop lecithin-based nanoemulsions intended for effective aceclofenac (ACF) skin delivery, utilizing sucrose esters [sucrose palmitate (SP) and sucrose stearate (SS)] as additional stabilizers and penetration enhancers. To find the surfactant mixtures and levels of process variables (homogenization pressure and number of cycles in the high-pressure homogenization manufacturing method) that result in drug-loaded nanoemulsions with minimal droplet size and narrow size distribution, a combined mixture-process experimental design was employed. Based on the optimization data, selected nanoemulsions were evaluated regarding morphology, surface charge, drug-excipient interactions, physical stability, and in vivo skin performance (skin penetration and irritation potential). The predicted physicochemical properties and storage stability proved satisfactory for ACF-loaded nanoemulsions containing 2% of SP in a blend with 0%-1% of SS and 1%-2% of egg lecithin (produced at 50°C/20 cycles/800 bar). Additionally, in vivo tape stripping demonstrated superior ACF skin absorption from these nanoemulsions, particularly from those containing 2% of SP, 0.5% of SS, and 1.5% of egg lecithin, compared with a sample costabilized by the conventional surfactant polysorbate 80. In summary, the combined mixture-process experimental design was shown to be a feasible tool for formulation development of multisurfactant-based nanosized delivery systems with potentially improved overall product performance.
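
    A combined mixture-process design crosses candidate mixture blends with process-factor levels before an optimal subset is selected. The sketch below builds such a candidate set; the blends and levels are illustrative stand-ins, not the study's actual design.

```python
# Sketch: candidate set for a combined mixture-process design, crossing
# mixture blends (three stabilizer proportions summing to 1) with process
# factor levels (homogenization pressure, number of cycles). Illustrative.
from itertools import product

blends = [
    (0.50, 0.25, 0.25),
    (0.50, 0.00, 0.50),
    (0.50, 0.12, 0.38),
    (0.67, 0.00, 0.33),
]
pressures = [500, 800]   # bar
cycles = [10, 20]

design = [blend + (p, c) for blend, p, c in product(blends, pressures, cycles)]
print(f"{len(design)} candidate runs")
for row in design[:4]:
    print(row)
```

    From such a candidate list, DOE software typically selects a D-optimal subset with an exchange algorithm so that the mixture and process effects remain estimable with few runs.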

  15. Statistical Engineering in Air Traffic Management Research

    Science.gov (United States)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  16. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  17. Densities and viscosities of the mixtures (formamide + 2-alkanol): Experimental and theoretical approaches

    International Nuclear Information System (INIS)

    Almasi, Mohammad

    2014-01-01

    Graphical abstract: viscosity deviations Δη vs. mole fraction of FA for binary mixtures of FA with 2-PrOH, 2-BuOH, 2-PenOH, 2-HexOH, and 2-HepOH at T = 298.15 K; the solid curves were calculated from a Redlich–Kister type equation. -- Highlights: • Densities and viscosities of the mixtures (formamide + 2-alkanols) were measured. • Experiments were performed over the entire mole fraction range at four temperatures. • SAFT and PC-SAFT were applied to predict the volumetric behavior of the mixtures. • The PRSV equation of state (EOS) was used to predict the binary viscosities. -- Abstract: Densities and viscosities of binary liquid mixtures of formamide (FA) with the polar solvents 2-PrOH, 2-BuOH, 2-PenOH, 2-HexOH, and 2-HepOH have been measured over the entire composition range at temperatures (298.15, 303.15, 308.15, 313.15) K and ambient pressure. From the experimental data, excess molar volumes, V_m^E, and viscosity deviations, Δη, were calculated and correlated by a Redlich–Kister type function. The effects of temperature and chain length of the 2-alkanols on the excess molar volumes and viscosity deviations are discussed in terms of molecular interactions between unlike molecules. The statistical associating fluid theory (SAFT) and perturbed chain statistical associating fluid theory (PC-SAFT) were applied to correlate and predict the volumetric behavior of the mixtures. The best predictions were achieved with the PC-SAFT equation of state. The Peng–Robinson–Stryjek–Vera equation of state was also used to predict the viscosity of the binary mixtures.
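
    The Redlich–Kister correlation used above is a short polynomial fit of an excess property. A minimal sketch with placeholder data points:

```python
# Sketch: correlating an excess property with a Redlich-Kister expansion,
# Y^E = x1*x2 * sum_k A_k (x1 - x2)^k, as used for V_m^E and delta-eta.
# The data points are illustrative placeholders, not measured values.
import numpy as np

x1 = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
yE = np.array([-0.10, -0.18, -0.24, -0.27, -0.28, -0.26, -0.21, -0.15, -0.08])

def rk_matrix(x1, order=3):
    """Design matrix whose columns multiply coefficients A_0..A_{order-1}."""
    x2 = 1.0 - x1
    return np.column_stack([x1 * x2 * (x1 - x2) ** k for k in range(order)])

A, *_ = np.linalg.lstsq(rk_matrix(x1), yE, rcond=None)
print("Redlich-Kister coefficients:", A.round(4))

# Smooth curve for plotting or interpolation
xs = np.linspace(0.0, 1.0, 101)
ys = rk_matrix(xs) @ A
print("minimum of fitted curve:", ys.min().round(4))
```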

  18. Prospective Power Calculations for the Four Lab Study of A Multigenerational Reproductive/Developmental Toxicity Rodent Bioassay Using A Complex Mixture of Disinfection By-Products in the Low-Response Region

    Directory of Open Access Journals (Sweden)

    Jane Ellen Simmons

    2011-10-01

    Full Text Available In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA’s Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss.

  19. Prospective Power Calculations for the Four Lab Study of A Multigenerational Reproductive/Developmental Toxicity Rodent Bioassay Using A Complex Mixture of Disinfection By-Products in the Low-Response Region

    Science.gov (United States)

    Dingus, Cheryl A.; Teuschler, Linda K.; Rice, Glenn E.; Simmons, Jane Ellen; Narotsky, Michael G.

    2011-01-01

    In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA’s Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss. PMID:22073030
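
    The key complication, non-independent observations from pups clustered within litters, can be illustrated with a design-effect power approximation. This is a simplified stand-in for the paper's methodology; all numbers below are illustrative.

```python
# Sketch: approximate power for a two-group mean comparison when pups are
# clustered within litters, using the classic design effect
# DEFF = 1 + (m - 1) * icc to inflate the variance. Illustrative inputs.
import math
from scipy.stats import norm

def clustered_power(n_ctrl, n_trt, m, icc, effect_sd, alpha=0.05):
    """Two-sided z-test power for a standardized effect, litter-clustered."""
    deff = 1.0 + (m - 1) * icc                 # variance inflation factor
    var = deff * (1.0 / n_ctrl + 1.0 / n_trt)  # variance of mean difference
    z_alpha = norm.ppf(1 - alpha / 2)
    return norm.cdf(abs(effect_sd) / math.sqrt(var) - z_alpha)

# 2 blocks of 100 dams, 40:60 control:treated, ~10 pups per litter
n_ctrl, n_trt = 2 * 40 * 10, 2 * 60 * 10       # total pup counts
p = clustered_power(n_ctrl, n_trt, m=10, icc=0.3, effect_sd=0.25)
print(f"approximate power: {p:.2f}")
```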

  20. Single versus mixture Weibull distributions for nonparametric satellite reliability

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Long recognized as a critical design attribute for space systems, satellite reliability has not yet received the proper attention, as limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on a mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture Weibull distribution is significantly more accurate in capturing all the failure trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
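
    A sketch of the single-versus-mixture Weibull comparison on synthetic failure times (standing in for the on-orbit data), with AIC as the fit-versus-complexity criterion:

```python
# Sketch: maximum-likelihood fits of a single Weibull vs a two-component
# Weibull mixture, compared by AIC. Synthetic failure times mimic early
# "infant mortality" plus later wear-out; values are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
t = np.concatenate([
    weibull_min.rvs(0.7, scale=2.0, size=150, random_state=rng),
    weibull_min.rvs(3.0, scale=12.0, size=100, random_state=rng),
])

def nll_single(p):
    k, lam = np.exp(p)                      # log-params keep k, lam > 0
    return -weibull_min.logpdf(t, k, scale=lam).sum()

def nll_mixture(p):
    k1, l1, k2, l2 = np.exp(p[:4])
    w = 1.0 / (1.0 + np.exp(-p[4]))         # logit keeps weight in (0, 1)
    pdf = (w * weibull_min.pdf(t, k1, scale=l1)
           + (1 - w) * weibull_min.pdf(t, k2, scale=l2))
    return -np.log(pdf + 1e-300).sum()

r1 = minimize(nll_single, x0=np.log([1.0, 5.0]), method="Nelder-Mead")
r2 = minimize(nll_mixture,
              x0=[np.log(0.5), np.log(1.0), np.log(2.0), np.log(10.0), 0.0],
              method="Nelder-Mead", options={"maxiter": 5000})

# Lower AIC indicates the better trade-off between fit and complexity
aic1 = 2 * 2 + 2 * r1.fun
aic2 = 2 * 5 + 2 * r2.fun
print(f"single Weibull AIC: {aic1:.1f}   mixture AIC: {aic2:.1f}")
```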

  1. Oral bioavailability enhancement of raloxifene by developing microemulsion using D-optimal mixture design: optimization and in-vivo pharmacokinetic study.

    Science.gov (United States)

    Shah, Nirmal; Seth, Avinashkumar; Balaraman, R; Sailor, Girish; Javia, Ankur; Gohil, Dipti

    2018-04-01

    The objective of this work was to utilize the potential of microemulsions for the improvement of the oral bioavailability of raloxifene hydrochloride, a BCS class-II drug with 2% bioavailability. The drug-loaded microemulsion was prepared by the water titration method using Capmul MCM C8, Tween 20, and Polyethylene glycol 400 as oil, surfactant, and co-surfactant respectively. The pseudo-ternary phase diagram was constructed between oil and the surfactant mixture to obtain the appropriate components and their concentration ranges that result in a large microemulsion existence area. A D-optimal mixture design was utilized as a statistical tool for optimization of the microemulsion, considering oil, Smix, and water as independent variables with percentage transmittance and globule size as dependent variables. The optimized formulation showed 100 ± 0.1% transmittance and a 17.85 ± 2.78 nm globule size, in close agreement with the values of the dependent variables predicted by the Design-Expert software. The optimized microemulsion showed pronounced enhancement of release rate compared to plain drug suspension, following a diffusion-controlled release mechanism according to the Higuchi model. The formulation showed a zeta potential of -5.88 ± 1.14 mV, which imparts good stability to the drug-loaded microemulsion dispersion. Surface morphology study with a transmission electron microscope showed discrete spherical nano-sized globules with a smooth surface. An in-vivo pharmacokinetic study of the optimized microemulsion formulation in Wistar rats showed a 4.29-fold enhancement in bioavailability. A stability study showed adequate results for the various parameters checked over six months. These results reveal the potential of microemulsions for significant improvement of the oral bioavailability of poorly soluble raloxifene hydrochloride.

  2. Statistical physics of human beings in games: Controlled experiments

    Science.gov (United States)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  3. Application of Taguchi methods to dual mixture ratio propulsion system optimization for SSTO vehicles

    Science.gov (United States)

    Stanley, Douglas O.; Unal, Resit; Joyner, C. R.

    1992-01-01

    The application of advanced technologies to future launch vehicle designs would allow the introduction of a rocket-powered, single-stage-to-orbit (SSTO) launch system early in the next century. For a selected SSTO concept, a dual mixture ratio, staged combustion cycle engine that employs a number of innovative technologies was selected as the baseline propulsion system. A series of parametric trade studies is presented to optimize both a dual mixture ratio engine and a single mixture ratio engine of similar design and technology level. The effect of varying lift-off thrust-to-weight ratio, engine mode transition Mach number, mixture ratios, area ratios, and chamber pressure values on overall vehicle weight is examined. The sensitivity of the advanced SSTO vehicle to variations in each of these parameters is presented, taking into account the interaction of each of the parameters with the others. This parametric optimization and sensitivity study employs a Taguchi design method. The Taguchi method is an efficient approach for determining near-optimum design parameters using orthogonal matrices from design of experiments (DOE) theory. Using orthogonal matrices significantly reduces the number of experimental configurations to be studied. The effectiveness and limitations of the Taguchi method for propulsion/vehicle optimization studies, as compared to traditional single-variable parametric trade studies, are also discussed.
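
    The orthogonal matrices referred to above are Taguchi arrays such as the standard L9(3^4). A sketch of that array and a main-effects analysis, with placeholder responses:

```python
# Sketch: the standard L9(3^4) orthogonal array (9 runs, 4 factors at
# 3 levels) and a simple main-effects analysis. Responses are placeholders.
import numpy as np

L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
y = np.array([7.2, 6.8, 6.5, 6.9, 6.1, 6.6, 6.4, 6.7, 5.9])  # e.g. weight

# Main effect of each factor: mean response at each of its levels
for f in range(L9.shape[1]):
    means = [y[L9[:, f] == lvl].mean() for lvl in range(3)]
    print(f"factor {f}: level means = {np.round(means, 2)}")

# Per factor, pick the level with the best (here, lowest) mean response
best = [int(np.argmin([y[L9[:, f] == lvl].mean() for lvl in range(3)]))
        for f in range(L9.shape[1])]
print("suggested settings (levels):", best)
```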

  4. Multichannel microformulators for massively parallel machine learning and automated design of biological experiments

    Science.gov (United States)

    Wikswo, John; Kolli, Aditya; Shankaran, Harish; Wagoner, Matthew; Mettetal, Jerome; Reiserer, Ronald; Gerken, Gregory; Britt, Clayton; Schaffer, David

    Genetic, proteomic, and metabolic networks describing biological signaling can have 10^2 to 10^3 nodes. Transcriptomics and mass spectrometry can quantify 10^4 different dynamical experimental variables recorded from in vitro experiments with a time resolution approaching 1 s. It is difficult to infer metabolic and signaling models from such massive data sets, and it is unlikely that causality can be determined simply from observed temporal correlations. There is a need to design and apply specific system perturbations, which will be difficult to perform manually with 10 to 10^2 externally controlled variables. Machine learning and optimal experimental design can select an experiment that best discriminates between multiple conflicting models, but a remaining problem is to control in real time multiple variables in the form of concentrations of growth factors, toxins, nutrients and other signaling molecules. With time-division multiplexing, a microfluidic MicroFormulator (μF) can create in real time complex mixtures of reagents in volumes suitable for biological experiments. Initial 96-channel μF implementations control the exposure profile of cells in a 96-well plate to different temporal profiles of drugs; future experiments will include challenge compounds. Funded in part by AstraZeneca, NIH/NCATS HHSN271201600009C and UH3TR000491, and VIIBRE.

  5. Application of mechanistic empirical approach to predict rutting of superpave mixtures in Iraq

    Directory of Open Access Journals (Sweden)

    Qasim Zaynab

    2018-01-01

    Full Text Available In Iraq, rutting is considered a serious distress in flexible pavements as a result of high summer temperatures and increased axle loads. This distress severely affects asphalt pavement performance, shortens the pavement's useful service life and creates serious hazards for highway users. The rutting performance of HMA mixtures is predicted using a mechanistic-empirical approach, based on the wheel-tracking test and the Superpave mix design requirements. A roller wheel compactor was locally manufactured to prepare slab specimens. On the basis of laboratory outcomes judged to be representative of field loading conditions, models were developed for predicting the permanent strain of compacted samples of local asphalt concrete mixtures, accounting for stress level, local material properties and environmental variables. The laboratory results were analysed statistically with the aid of SPSS software. Permanent strain models for asphalt concrete mixtures were developed as a function of number of passes, temperature, asphalt content, viscosity, air voids and additive content. The mechanistic-empirical design approach, through the MnPAVE software, was applied to characterize rutting in HMA and to predict the allowable number of loading repetitions of mixtures as a function of expected traffic loads, material properties, and environmental temperature.

  6. DMA Friends: the mobilization of statistics in a media innovation experiment in the museum sector

    Directory of Open Access Journals (Sweden)

    GERMAN Ronan

    2016-07-01

    Full Text Available On January 23, 2013, the Dallas Museum of Art (DMA) returned to free general admission. This announcement coincided with the official launch of two programs: DMA Friends, a loyalty program, and DMA Partners, a free membership program. The aim of this article is to propose an analysis, at the crossroads of media semiology, political science and the historical sociology of statistical rationality, of the ways in which the DMA Friends program designers have mobilized the statistical argument to justify the soundness of their approach. To do so, they leaned on a range of media forms generated by a sophisticated techno-semiotic apparatus which represents, in statistical form, the behavior of visitors inside the museum. The program (and the whole instrumentation that sustains it) illustrates a media innovation experiment in the museum sector that questions the ways in which statistical work is mediated according to the communicational situations in which it is mobilized and enhanced.

  7. Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials.

    Science.gov (United States)

    Potter, Christine E; Wang, Tianlin; Saffran, Jenny R

    2017-04-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, 6 months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, whereas both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. Copyright © 2016 Cognitive Science Society, Inc.

  8. Mine-by experiment final design report

    International Nuclear Information System (INIS)

    Read, R.S.; Martin, C.D.

    1991-12-01

    The Underground Research Laboratory (URL) Mine-by Experiment is designed to provide information on rock mass response to excavation that will be used to assess important aspects of the design of a nuclear fuel waste disposal vault in a granitic pluton. The final experiment design is the result of a multidisciplinary approach, drawing on experience gained at other sites as well as the URL, and using both internal expertise and the external consultants. The final experiment design, including details on characterization, construction, instrumentation, and numerical modelling, is presented along with final design drawings

  9. Evaluation of statistical designs in phase I expansion cohorts: the Dana-Farber/Harvard Cancer Center experience.

    Science.gov (United States)

    Dahlberg, Suzanne E; Shapiro, Geoffrey I; Clark, Jeffrey W; Johnson, Bruce E

    2014-07-01

    Phase I trials have traditionally been designed to assess toxicity and establish phase II doses with dose-finding studies and expansion cohorts, but they frequently exceed the traditional sample size to further assess endpoints in specific patient subsets. The scientific objectives of phase I expansion cohorts and their evolving role in the current era of targeted therapies have yet to be systematically examined. Adult therapeutic phase I trials opened within Dana-Farber/Harvard Cancer Center (DF/HCC) from 1988 to 2012 were identified for sample size details. Statistical designs and study objectives of those submitted in 2011 were reviewed for expansion cohort details. Five hundred twenty-two adult therapeutic phase I trials were identified during the 25 years. The average sample size of a phase I study increased from 33.8 patients to 73.1 patients over that time. The proportion of trials with planned enrollment of 50 or fewer patients dropped from 93.0% during the period 1988 to 1992 to 46.0% between 2008 and 2012; at the same time, the proportion of trials enrolling 51 to 100 patients and more than 100 patients increased from 5.3% and 1.8%, respectively, to 40.5% and 13.5% (χ2 test, two-sided P < .001). Sixteen of the 60 trials (26.7%) in 2011 enrolled patients to three or more sub-cohorts in the expansion phase. Sixty percent of studies provided no statistical justification of the sample size, although 91.7% of trials stated response as an objective. Our data suggest that phase I studies have changed dramatically in size and scientific scope within the last decade. Additional studies addressing the implications of this trend on research processes, ethical concerns, and resource burden are needed. © The Author 2014. Published by Oxford University Press. All rights reserved.
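
    One way an expansion cohort's size can be statistically justified when response is an objective is by the precision it buys for the response-rate estimate. A sketch using the exact (Clopper–Pearson) binomial interval, with illustrative cohort sizes:

```python
# Sketch: exact (Clopper-Pearson) 95% CI width for an observed response
# rate at several candidate expansion-cohort sizes. Illustrative numbers,
# not taken from the paper.
from scipy.stats import beta

def clopper_pearson(successes, n, alpha=0.05):
    """Exact two-sided confidence interval for a binomial proportion."""
    lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lo, hi

for n in (14, 20, 40):
    k = round(0.2 * n)            # suppose a 20% response rate is observed
    lo, hi = clopper_pearson(k, n)
    print(f"n={n:3d}, {k}/{n} responses: 95% CI = ({lo:.2f}, {hi:.2f})")
```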

  10. Performance of an organic Rankine cycle with multicomponent mixtures

    International Nuclear Information System (INIS)

    Chaitanya Prasad, G.S.; Suresh Kumar, C.; Srinivasa Murthy, S.; Venkatarathnam, G.

    2015-01-01

    There is a renewed interest in ORC (organic Rankine cycle) systems for power generation using solar thermal energy. Many authors have studied the performance of ORC with different pure fluids as well as binary zeotropic mixtures in order to improve the thermal efficiency. It has not been well appreciated that zeotropic mixtures can also be used to reduce the size and cost of an ORC system. The main objective of this paper is to present mixtures that help reduce the cost while maintaining high thermal efficiency. The proposed method also allows us to design an optimum mixture for a given expander. This new approach is particularly beneficial for designing mixtures for small ORC systems operating with solar thermal energy. A number of examples are presented to demonstrate this concept. - Highlights: • The performance of an ORC operating with different zeotropic multicomponent mixtures is presented. • A thermodynamic method is proposed for the design of multicomponent mixtures for ORC power plants. • High exergy efficiency as well as high volumetric expander work can be achieved with appropriate mixtures. • The method allows design of mixtures that can be used with off-the-shelf positive displacement expanders

  11. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue.

    Science.gov (United States)

    Diestelkamp, Wiebke S; Krane, Carissa M; Pinnell, Margaret F

    2011-05-20

    Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types), resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized, and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. The design and implementation of a split-plot experimental test matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance.
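
    A sketch of an analogous split-plot analysis in Python's statsmodels (a stand-in for the SAS PROC MIXED analysis the paper used), with a random intercept for the whole-plot unit and synthetic placeholder data:

```python
# Sketch: split-plot-style mixed model. Sessions act as whole-plot units
# (the randomization restriction); temperature is the whole-plot factor;
# solution and vessel type vary within sessions. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for session in range(12):                       # whole-plot units
    temp = ["low", "high"][session % 2]         # whole-plot factor
    session_effect = rng.normal(0, 50)          # shared within a session
    for solution in ["saline", "albumin", "blood"]:
        for vessel in ["carotid", "renal"]:
            burst = (900 + (150 if temp == "high" else 0)
                     + (80 if vessel == "renal" else 0)
                     + session_effect + rng.normal(0, 100))
            rows.append((session, temp, solution, vessel, burst))
df = pd.DataFrame(rows, columns=["session", "temp", "solution",
                                 "vessel", "burst_mmHg"])

# Random intercept per session accounts for the randomization restriction
model = smf.mixedlm("burst_mmHg ~ temp * solution * vessel",
                    df, groups=df["session"])
print(model.fit().summary())
```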

  12. Model-based experimental design for assessing effects of mixtures of chemicals

    Energy Technology Data Exchange (ETDEWEB)

    Baas, Jan, E-mail: jan.baas@falw.vu.n [Vrije Universiteit of Amsterdam, Dept of Theoretical Biology, De Boelelaan 1085, 1081 HV Amsterdam (Netherlands); Stefanowicz, Anna M., E-mail: anna.stefanowicz@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Klimek, Beata, E-mail: beata.klimek@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Laskowski, Ryszard, E-mail: ryszard.laskowski@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Kooijman, Sebastiaan A.L.M., E-mail: bas@bio.vu.n [Vrije Universiteit of Amsterdam, Dept of Theoretical Biology, De Boelelaan 1085, 1081 HV Amsterdam (Netherlands)

    2010-01-15

We exposed flour beetles (Tribolium castaneum) to a mixture of four polycyclic aromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for effects on survival. The behavior of the threshold concentration was one of the key features of this research. We showed that the threshold concentration is shared by toxicants with the same mode of action, which gives a mechanistic explanation for the observation that toxic effects in mixtures may occur in concentration ranges where the individual components do not show effects. Our approach gives reliable predictions of partial effects on survival and allows for a reduction of experimental effort in assessing effects of mixtures, extrapolations to other mixtures, other points in time, or in a wider perspective to other organisms. - We show a mechanistic approach to assess effects of mixtures in low concentrations.
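
The key mechanism described here, a threshold concentration shared by toxicants with the same mode of action, can be illustrated with a minimal hazard-rate sketch. The single-compartment form and all parameter values below are illustrative assumptions, not the authors' fitted model.

```python
# Minimal sketch of a threshold hazard model for mixture toxicity:
# above a shared threshold c0, the hazard rate grows with the summed
# toxic pressure of components sharing a mode of action.
# All parameter values are illustrative, not fitted values from the paper.
import numpy as np

def survival(conc, weights, c0=1.0, killing_rate=0.05, t=21.0):
    """Survival probability at time t for a mixture.

    conc    : array of component concentrations
    weights : toxic equivalency of each component (same mode of action)
    c0      : shared threshold (no-effect) concentration
    """
    effective = np.dot(weights, conc)            # additive toxic pressure
    hazard = killing_rate * max(0.0, effective - c0)
    return np.exp(-hazard * t)

# Each component alone is below the threshold, but the mixture is not:
single = survival(np.array([0.4, 0.0, 0.0, 0.0]), np.ones(4))
mix = survival(np.array([0.4, 0.4, 0.4, 0.4]), np.ones(4))
print(f"single component: {single:.3f}, four-component mixture: {mix:.3f}")
```

This reproduces the qualitative observation in the abstract: components that individually sit below the shared threshold can jointly exceed it and reduce survival.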

  13. Model-based experimental design for assessing effects of mixtures of chemicals

    International Nuclear Information System (INIS)

    Baas, Jan; Stefanowicz, Anna M.; Klimek, Beata; Laskowski, Ryszard; Kooijman, Sebastiaan A.L.M.

    2010-01-01

We exposed flour beetles (Tribolium castaneum) to a mixture of four polycyclic aromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for effects on survival. The behavior of the threshold concentration was one of the key features of this research. We showed that the threshold concentration is shared by toxicants with the same mode of action, which gives a mechanistic explanation for the observation that toxic effects in mixtures may occur in concentration ranges where the individual components do not show effects. Our approach gives reliable predictions of partial effects on survival and allows for a reduction of experimental effort in assessing effects of mixtures, extrapolations to other mixtures, other points in time, or in a wider perspective to other organisms. - We show a mechanistic approach to assess effects of mixtures in low concentrations.

  14. Improvement of a Mixture Experiment Model Relating the Component Proportions to the Size of Nanonized Itraconazole Particles in Extemporary Suspensions

    Energy Technology Data Exchange (ETDEWEB)

    Pattarino, Franco; Piepel, Gregory F.; Rinaldi, Maurizio

    2018-05-01

    The Foglio Bonda et al. (2016) (henceforth FB) paper discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (the response variable of interest). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). After publication of the FB paper, the second author of this corrigendum (not an author of the original paper) contacted the corresponding author to point out some errors as well as insufficient explanations in parts of the paper. This corrigendum was prepared to address these issues. The authors of the original paper apologize for any inconveniences to readers.

  15. Statistical imitation system using relational interest points and Gaussian mixture models

    CSIR Research Space (South Africa)

    Claassens, J

    2009-11-01

Full Text Available The author proposes an imitation system that uses relational interest points (RIPs) and Gaussian mixture models (GMMs) to characterize a behaviour. The system's structure is inspired by the Robot Programming by Demonstration (RPD) paradigm...
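
A minimal sketch of the GMM component of such a system, using scikit-learn: fit a mixture to demonstrated feature points, then score how well new observations match the learned behaviour. The two-dimensional features are simulated placeholders, not the paper's relational interest points.

```python
# Sketch: fit a Gaussian mixture model to (simulated) interest-point
# features, then score how well new observations match the behaviour.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Pretend each demonstration yields 2-D relative-position features
demos = np.vstack([
    rng.normal([0.0, 1.0], 0.1, size=(100, 2)),
    rng.normal([1.0, 0.0], 0.1, size=(100, 2)),
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(demos)

new_obs = np.array([[0.05, 0.95], [3.0, 3.0]])
# Log-likelihood is high for the on-behaviour point, low for the outlier
print(gmm.score_samples(new_obs))
```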

  16. Smashing UX design foundations for designing online user experiences

    CERN Document Server

    Allen, Jesmond

    2012-01-01

The ultimate guide to UX from the world's most popular resource for web designers and developers. Smashing Magazine is the world's most popular resource for web designers and developers, and with this book the authors provide the definitive resource for becoming savvy with User Experience Design (UX). The authors first provide an overview of UX and chart its rise to becoming a valuable and necessary practice for narrowing the gap between Web sites, applications, and users in order to make a user's experience a happy, easy, and successful one. Examines the essential aspects of User Experience Design

  17. Long-term development of nursing mixtures of Sitka spruce and larch species in an experiment in northern Scotland

    Directory of Open Access Journals (Sweden)

    William L. Mason

    2014-12-01

Full Text Available Aim of the study: An experiment was established in 1966 to compare the growth and development of 50:50 mixtures of Sitka spruce (Picea sitchensis) with either Japanese larch (Larix kaempferi) or tamarack (L. laricina) with that found in pure plots of Sitka spruce. The site was one of moderate nitrogen availability where the presence of heather (Calluna vulgaris) could be expected to limit the growth of Sitka spruce. Area of the study: North-east Scotland. Material and methods: There were different patterns of spruce growth in the pure plots and in the mixtures, with faster spruce growth in mixture in the years approaching and immediately following canopy closure (i.e. ages 15-25). Foliage analysis suggested that this was linked with improved nitrogen status of spruce trees in the mixed compared to the pure plots. Main results: At years 20 and 25 there were significant differences in height, diameter, and basal area between treatments, with the largest basal area being found in the Japanese larch/Sitka spruce mixtures, indicative of overyielding in the mixed plots. However, when the experiment was clearfelled at 41 years of age, all treatments had self-thinned to produce spruce-dominated stands of similar height with only an occasional larch tree surviving in plots that were originally 50:50 mixtures. Research highlights: There were no differences between treatments in basal area, harvested volume or sawlog outturn after 41 years. These results can be interpreted as showing facilitation between the larch and the spruce during the establishment phase followed by competition for light once canopy closure had occurred. Keywords: Mixed stand dynamics; facilitation; nitrogen status; product outturn.

  18. Identifying main factors of capacity fading in lithium ion cells using orthogonal design of experiments

    International Nuclear Information System (INIS)

    Su, Laisuo; Zhang, Jianbo; Wang, Caijuan; Zhang, Yakun; Li, Zhe; Song, Yang; Jin, Ting; Ma, Zhao

    2016-01-01

Highlights: • The effect of seven principal factors on the aging behavior of lithium ion cells is studied. • Orthogonal design of experiments is used to reduce the experiment units. • Capacity fades linearly during the initial 10% capacity fading period. • Statistical methods are used to compare the significance of each principal factor. • A multi-factor statistical model is developed to predict the aging rate of cells. - Abstract: The aging rate under cycling conditions for lithium-ion cells is affected by many factors. Seven principal factors are systematically examined using orthogonal design of experiments, and statistical analysis was used to identify the order of principal factors in terms of strength in causing capacity fade. These seven principal factors are: the charge and discharge currents (i_1, i_2) during the constant current regime, the charge and discharge cut-off voltages (V_1, V_2) and the corresponding durations (t_1, t_2) during the constant voltage regime, and the ambient temperature (T). An orthogonal array with 18 test units was selected for the experiments. The test results show that (1) during the initial 10% capacity fading period, the capacity faded linearly with Wh-throughput for all the test conditions; (2) after the initial period, certain cycling conditions exacerbated aging rates, while the others remained the same. The statistical results show that: (1) except for t_1, the other six principal factors significantly affect the aging rate; (2) the strength of the principal factors was ranked as: i_1 > V_1 > T > t_2 > V_2 > i_2 > t_1. Finally, a multi-factor statistical aging model is developed to predict the aging rate, and the accuracy of the model is validated.
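
A common way to rank factors from an orthogonal-array test is to compare, for each factor, the range of the response means at its levels (a Taguchi-style main-effects analysis). The 18-run layout and aging-rate values in this sketch are synthetic placeholders, not the paper's design or data.

```python
# Sketch: rank factors from an orthogonal-array test by the range of
# their level means. Design matrix and responses are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
factors = ["i1", "i2", "V1", "V2", "t1", "t2", "T"]
df = pd.DataFrame(rng.choice([1, 2, 3], size=(18, 7)), columns=factors)
df["aging_rate"] = (0.5 * df["i1"] + 0.3 * df["V1"] + 0.2 * df["T"]
                    + rng.normal(0, 0.1, 18))  # synthetic response

effect_range = {
    f: df.groupby(f)["aging_rate"].mean().pipe(lambda m: m.max() - m.min())
    for f in factors
}
for f, r in sorted(effect_range.items(), key=lambda kv: -kv[1]):
    print(f"{f}: range of level means = {r:.3f}")
```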

  19. Four Papers on Contemporary Software Design Strategies for Statistical Methodologists

    OpenAIRE

    Carey, Vincent; Cook, Dianne

    2014-01-01

Software design impacts much of statistical analysis, and as technology has changed dramatically in recent years, it is exciting to learn how statistical software is adapting and changing. This led to the collection of papers published here, written by John Chambers, Duncan Temple Lang, Michael Lawrence, Martin Morgan, Yihui Xie, Heike Hofmann and Xiaoyue Cheng.

  20. Emergence of life from multicomponent mixtures of chemicals: the case for experiments with cycling physicochemical gradients.

    Science.gov (United States)

    Spitzer, Jan

    2013-04-01

    The emergence of life from planetary multicomponent mixtures of chemicals is arguably the most complicated and least understood natural phenomenon. The fact that living cells are non-equilibrium systems suggests that life can emerge only from non-equilibrium chemical systems. From an astrobiological standpoint, non-equilibrium chemical systems arise naturally when solar irradiation strikes rotating surfaces of habitable planets: the resulting cycling physicochemical gradients persistently drive planetary chemistries toward "embryonic" living systems and an eventual emergence of life. To better understand the factors that lead to the emergence of life, I argue for cycling non-equilibrium experiments with multicomponent chemical systems designed to represent the evolving chemistry of Hadean Earth ("prebiotic soups"). Specifically, I suggest experimentation with chemical engineering simulators of Hadean Earth to observe and analyze (i) the appearances and phase separations of surface active and polymeric materials as precursors of the first "cell envelopes" (membranes) and (ii) the accumulations, commingling, and co-reactivity of chemicals from atmospheric, oceanic, and terrestrial locations.

  1. Statistical core design methodology using the VIPRE thermal-hydraulics code

    International Nuclear Information System (INIS)

    Lloyd, M.W.; Feltus, M.A.

    1995-01-01

An improved statistical core design methodology for developing a computational departure from nucleate boiling ratio (DNBR) correlation has been developed and applied in order to analyze the nominal 1.3 DNBR limit on Westinghouse Pressurized Water Reactor (PWR) cores. This analysis, although limited in scope, found that the DNBR limit can be reduced from 1.3 to some lower value and be accurate within an adequate confidence level of 95%, for three particular FSAR operational transients: turbine trip, complete loss of flow, and inadvertent opening of a pressurizer relief valve. The VIPRE-01 thermal-hydraulics code, the SAS/STAT statistical package, and the EPRI/Columbia University DNBR experimental data base were used in this research to develop the Pennsylvania State Statistical Core Design Methodology (PSSCDM). The VIPRE code was used to perform the necessary sensitivity studies and generate the EPRI correlation-calculated DNBR predictions. The SAS package used these EPRI correlation-calculated DNBR predictions from VIPRE as a data set to determine the best fit for the empirical model and to perform the statistical analysis. (author)

  2. Statistical considerations of graphite strength for assessing design allowable stresses

    International Nuclear Information System (INIS)

    Ishihara, M.; Mogi, H.; Ioka, I.; Arai, T.; Oku, T.

    1987-01-01

Several aspects of statistics need to be considered to determine design allowable stresses for graphite structures. These include: 1) Statistical variation of graphite material strength. 2) Uncertainty of calculated stress. 3) Reliability (survival probability) required from operational and safety performance of graphite structures. This paper deals with some statistical considerations of structural graphite for assessing design allowable stress. Firstly, probability distribution functions of tensile and compressive strengths are investigated for candidate graphites of the experimental Very High Temperature Reactor. Normal, logarithmic normal and Weibull distribution functions are compared in terms of coefficient of correlation to measured strength data. This leads to the adoption of the normal distribution function. Then, the relation between factor of safety and fracture probability is discussed on the following items: 1) As graphite strength is more variable than that of metallic materials, the effect of strength variation on the fracture probability is evaluated. 2) Fracture probability depending on survival probability of 99 ∼ 99.9 (%) with confidence level of 90 ∼ 95 (%) is discussed. 3) As the material properties used in the design analysis are usually the mean values of their variation, the additional effect of these variations on the fracture probability is discussed. Finally, the way to assure the minimum ultimate strength with required survival probability and confidence level is discussed in view of statistical treatment of the strength data from varying sample numbers in a material acceptance test. (author)
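
The comparison of candidate distributions by correlation coefficient can be sketched as a probability-plot correlation test: fit each distribution, then correlate its fitted quantiles with the ordered data. The strength sample below is simulated, not measured graphite data.

```python
# Sketch: compare normal, lognormal and Weibull fits to strength data
# by the correlation coefficient of fitted quantiles vs. ordered data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
strength = rng.normal(25.0, 2.5, 60)   # simulated tensile strengths (MPa)
strength.sort()
probs = (np.arange(1, 61) - 0.5) / 60  # plotting positions

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
}
for name, dist in candidates.items():
    params = dist.fit(strength)                 # maximum-likelihood fit
    quantiles = dist.ppf(probs, *params)        # fitted quantiles
    r = np.corrcoef(quantiles, strength)[0, 1]
    print(f"{name}: probability-plot correlation r = {r:.4f}")
```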

  3. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  4. Development of seasonal heat storage based on stable supercooling of a sodium acetate water mixture

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Andersen, Elsa

    2012-01-01

    A number of heat storage modules for seasonal heat storages based on stable supercooling of a sodium acetate water mixture have been tested by means of experiments in a heat storage test facility. The modules had different volumes and designs. Further, different methods were used to transfer heat...... to and from the sodium acetate water mixture in the modules. By means of the experiments: • The heat exchange capacity rates to and from the sodium acetate water mixture in the heat storage modules were determined for different volume flow rates. • The heat content of the heat storage modules were determined....... • The reliability of the supercooling was elucidated for the heat storage modules for different operation conditions. • The reliability of a cooling method used to start solidification of the supercooled sodium acetate water mixture was elucidated. The method is making use of boiling CO2 in a small tank in good...

  5. Optimization of the fabrication of novel stealth PLA-based nanoparticles by dispersion polymerization using D-optimal mixture design.

    Science.gov (United States)

    Adesina, Simeon K; Wight, Scott A; Akala, Emmanuel O

    2014-11-01

Nanoparticle size is important in drug delivery. Clearance of nanoparticles by cells of the reticuloendothelial system has been reported to increase with increasing particle size. Further, nanoparticles should be small enough to avoid lung or spleen filtering effects. Endocytosis and accumulation in tumor tissue by the enhanced permeability and retention effect are also processes that are influenced by particle size. We present the results of studies designed to optimize cross-linked biodegradable stealth polymeric nanoparticles fabricated by dispersion polymerization. Nanoparticles were fabricated using different amounts of macromonomer, initiators, crosslinking agent and stabilizer in a dioxane/DMSO/water solvent system. Confirmation of nanoparticle formation was by scanning electron microscopy (SEM). Particle size was measured by dynamic light scattering (DLS). D-optimal mixture statistical experimental design was used for the experimental runs, followed by model generation (Scheffe polynomial) and optimization with the aid of computer software. Model verification was done by comparing particle size data of some suggested solutions to the predicted particle sizes. Data showed that average particle sizes follow the same trend as predicted by the model. Negative terms in the model corresponding to the cross-linking agent and stabilizer indicate the important factors for minimizing particle size.
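
The model-generation step can be sketched as fitting a quadratic Scheffe polynomial, which omits the intercept because mixture proportions sum to one. Component identities and particle-size data below are synthetic placeholders, not the study's formulation data.

```python
# Sketch: fit a quadratic Scheffe polynomial
#   z = sum(b_i * x_i) + sum(b_ij * x_i * x_j)
# to mixture data where proportions x1 + x2 + x3 = 1.
import numpy as np

rng = np.random.default_rng(4)
X = rng.dirichlet(alpha=[2, 2, 2], size=30)   # e.g. macromonomer, crosslinker, stabilizer
size = 120 - 60 * X[:, 2] + 40 * X[:, 0] * X[:, 1] + rng.normal(0, 3, 30)

# Scheffe model has no intercept: linear terms + pairwise blending terms
terms = np.column_stack([
    X[:, 0], X[:, 1], X[:, 2],
    X[:, 0] * X[:, 1], X[:, 0] * X[:, 2], X[:, 1] * X[:, 2],
])
coef, *_ = np.linalg.lstsq(terms, size, rcond=None)
labels = ["x1", "x2", "x3", "x1*x2", "x1*x3", "x2*x3"]
for lab, b in zip(labels, coef):
    print(f"{lab}: {b:+.1f}")
```

As the abstract notes, negative coefficients on particular terms flag the components most useful for driving particle size down.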

  6. Experience-based design for integrating the patient care experience into healthcare improvement: Identifying a set of reliable emotion words.

    Science.gov (United States)

    Russ, Lauren R; Phillips, Jennifer; Brzozowicz, Keely; Chafetz, Lynne A; Plsek, Paul E; Blackmore, C Craig; Kaplan, Gary S

    2013-12-01

Experience-based design is an emerging method used to capture the emotional content of patient and family member healthcare experiences, and can serve as the foundation for patient-centered healthcare improvement. However, a core tool, the experience-based design questionnaire, requires words with consistent emotional meaning. Our objective was to identify and evaluate an emotion word set reliably categorized across the demographic spectrum as expressing positive, negative, or neutral emotions for experience-based design improvement work. We surveyed 407 patients, family members, and healthcare workers in 2011. Participants designated each of 67 potential emotion words as positive, neutral, or negative based on their emotional perception of the word. Overall agreement was assessed using the kappa statistic. Words were selected for retention in the final emotion word set based on 80% simple agreement on classification of meaning across subgroups. The participants were 47.9% (195/407) patients, 19.4% (79/407) family members and 32.7% (133/407) healthcare staff. Overall agreement adjusted for chance was moderate (κ=0.55). However, agreement for positive (κ=0.69) and negative emotions (κ=0.68) was substantially higher, while agreement in the neutral category was low (κ=0.11). There were 20 positive, 1 neutral, and 14 negative words retained for the final experience-based design emotion word set. We identified a reliable set of emotion words for experience questionnaires to serve as the foundation for patient-centered, experience-based redesign of healthcare. Incorporation of patient and family member perspectives in healthcare requires reliable tools to capture the emotional content of care touch points. Copyright © 2013 Elsevier Inc. All rights reserved.
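
Chance-adjusted agreement among many raters classifying items into three categories is commonly quantified with Fleiss' kappa; the abstract says only "the kappa statistic", so treating it as Fleiss' variant is an assumption. A direct implementation on a tiny synthetic count table:

```python
# Sketch: Fleiss' kappa for many raters assigning each word to one of
# three emotion categories. The count table is a tiny synthetic example.
import numpy as np

def fleiss_kappa(counts):
    """counts[i, j] = number of raters assigning item i to category j."""
    n = counts.sum(axis=1)[0]                     # raters per item (assumed equal)
    p_j = counts.sum(axis=0) / counts.sum()       # overall category proportions
    P_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))  # per-item agreement
    P_bar, P_e = P_i.mean(), np.sum(p_j**2)       # observed vs. chance agreement
    return (P_bar - P_e) / (1 - P_e)

# 5 words rated by 10 raters into (positive, neutral, negative)
counts = np.array([
    [9, 1, 0],
    [8, 2, 0],
    [1, 2, 7],
    [0, 1, 9],
    [3, 4, 3],
])
print(f"Fleiss' kappa = {fleiss_kappa(counts):.2f}")
```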

  7. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

In the framework of safety analysis for the Belgian nuclear power plants and for the reload compatibility studies, Tractebel Energy Engineering (TEE) has developed, to define a 95/95 DNBR criterion, a statistical thermal design method based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal hydraulic analysis. The uncertainties of the correlation are represented by the statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value adjusted with some penalties and deterministic factors. The search for a realistic value for the SDL is the objective of the statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion or SDL (Statistical Design Limit) with the strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
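
The 95/95 idea (a limit satisfied with 95% probability at 95% confidence) can be sketched, for a normally distributed quantity, with the exact non-central-t tolerance factor. The sample below is synthetic, and the actual STDP propagates uncertainties through a full sub-channel analysis rather than this one-sample shortcut.

```python
# Sketch: one-sided 95/95 lower tolerance limit for a normally
# distributed quantity (e.g. a DNBR-like ratio), using the exact
# non-central-t factor. Sample values are synthetic, not plant data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.normal(1.45, 0.08, 40)   # synthetic DNBR-like sample
n, mean, s = len(x), x.mean(), x.std(ddof=1)

p, conf = 0.95, 0.95
nc = stats.norm.ppf(p) * np.sqrt(n)                  # non-centrality parameter
k = stats.nct.ppf(conf, df=n - 1, nc=nc) / np.sqrt(n)  # tolerance factor
limit = mean - k * s
print(f"95/95 lower tolerance limit: {limit:.3f} (k = {k:.3f})")
```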

  8. Effect of the key mixture parameters on shrinkage of reactive powder concrete.

    Science.gov (United States)

    Ahmad, Shamsad; Zubair, Ahmed; Maslehuddin, Mohammed

    2014-01-01

Reactive powder concrete (RPC) mixtures are reported to have excellent mechanical and durability characteristics. However, such concrete mixtures having a high amount of cementitious materials may have high early shrinkage, causing cracking of the concrete. In the present work, an attempt has been made to study the simultaneous effects of three key mixture parameters on shrinkage of the RPC mixtures. Considering three different levels of the three key mixture factors, a total of 27 mixtures of RPC were prepared according to a 3(3) factorial experiment design. The specimens belonging to all 27 mixtures were monitored for shrinkage at different ages over a total period of 90 days. The test results were plotted to observe the variation of shrinkage with time and to see the effects of the key mixture factors. The experimental data pertaining to 90-day shrinkage were used to conduct analysis of variance to identify the significance of each factor and to obtain an empirical equation correlating the shrinkage of RPC with the three key mixture factors. The rate of development of shrinkage at early ages was higher. The water to binder ratio was found to be the most prominent factor, followed by cement content, with silica fume content having the least effect.
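
Generating a 3(3) factorial layout and running the analysis of variance can be sketched with itertools and statsmodels. The factor levels and shrinkage responses below are illustrative placeholders, not the paper's data.

```python
# Sketch: build a 3^3 full factorial design and run ANOVA on a
# synthetic 90-day shrinkage response.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

levels = {
    "wb": [0.3, 0.35, 0.4],       # water/binder ratio (illustrative)
    "cement": [800, 900, 1000],   # cement content, kg/m3 (illustrative)
    "sf": [10, 15, 20],           # silica fume, % of cement (illustrative)
}
rows = list(itertools.product(*levels.values()))
df = pd.DataFrame(rows, columns=list(levels))    # 27 runs

rng = np.random.default_rng(6)
df["shrinkage"] = (500 + 800 * df["wb"] + 0.2 * df["cement"]
                   + 2 * df["sf"] + rng.normal(0, 10, len(df)))  # microstrain

model = smf.ols("shrinkage ~ C(wb) + C(cement) + C(sf)", data=df).fit()
print(anova_lm(model, typ=2))   # F-tests rank the factors' significance
```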

  9. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  10. Examining the Internal Validity and Statistical Precision of the Comparative Interrupted Time Series Design by Comparison with a Randomized Experiment

    Science.gov (United States)

    St.Clair, Travis; Cook, Thomas D.; Hallberg, Kelly

    2014-01-01

    Although evaluators often use an interrupted time series (ITS) design to test hypotheses about program effects, there are few empirical tests of the design's validity. We take a randomized experiment on an educational topic and compare its effects to those from a comparative ITS (CITS) design that uses the same treatment group as the experiment…
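
The building block of an ITS analysis is a segmented regression with level-change and slope-change terms at the interruption; a comparative ITS adds a comparison series and interaction terms on top of this. A minimal single-series sketch with synthetic data:

```python
# Sketch: segmented regression for an interrupted time series.
# 'post' is 1 after the interruption; the coefficients on 'post' and
# 'time_after' estimate the level and slope change. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
t = np.arange(24)
post = (t >= 12).astype(int)
y = 10 + 0.3 * t + 2.0 * post + 0.5 * post * (t - 12) + rng.normal(0, 0.5, 24)

df = pd.DataFrame({"y": y, "time": t, "post": post,
                   "time_after": post * (t - 12)})
fit = smf.ols("y ~ time + post + time_after", data=df).fit()
print(fit.params)   # level change ~2.0, slope change ~0.5
```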

  11. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

Full Text Available For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as an efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.

  12. Induced polarization of clay-sand mixtures: experiments and modeling

    International Nuclear Information System (INIS)

    Okay, G.; Leroy, P.; Tournassat, C.; Ghorbani, A.; Jougnot, D.; Cosenza, P.; Camerlynck, C.; Cabrera, J.; Florsch, N.; Revil, A.

    2012-01-01

Document available in extended abstract form only. Frequency-domain induced polarization (IP) measurements consist of imposing an alternative sinusoidal electrical current (AC) at a given frequency and measuring the resulting electrical potential difference between two other non-polarizing electrodes. The magnitude of the conductivity and the phase lag between the current and the difference of potential can be expressed as a complex conductivity, with the in-phase part representing electro-migration and a quadrature conductivity representing the reversible storage of electrical charges (capacitive effect) of the porous material. Induced polarization has become an increasingly popular geophysical method for hydrogeological and environmental applications. These applications include for instance the characterization of clay materials used as permeability barriers in landfills or to contain various types of contaminants including radioactive wastes. The goal of our study is to get a better understanding of the influence of the clay content, clay mineralogy, and pore water salinity upon complex conductivity measurements of saturated clay-sand mixtures in the frequency range ∼1 mHz-12 kHz. The complex conductivity of saturated unconsolidated sand-clay mixtures was experimentally investigated using two types of clay minerals, kaolinite and smectite, in the frequency range 1.4 mHz - 12 kHz. Four different types of samples were used: two containing mainly kaolinite (one with 80% kaolinite by mass, the remainder being 15% smectite and 5% illite/muscovite; the other with 95% kaolinite and 5% illite/muscovite), and two others containing mainly Na-smectite or Na-Ca-smectite (95% of the mass; bentonite). The experiments were performed with various clay contents (1, 5, 20, and 100% in volume of the sand-clay mixture) and salinities (distilled water, 0.1 g/L, 1 g/L, and 10 g/L NaCl solution). In total, 44 saturated clay or clay-sand mixtures were prepared. Induced polarization measurements

  13. Increasing the statistical significance of entanglement detection in experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)

    2010-07-01

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.

  14. Statistical evaluation of design-error related accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1980-01-01

In a recently published paper (Campbell and Ott, 1979), a general methodology was proposed for the statistical evaluation of design-error related accidents. The evaluation aims at an estimate of the combined residual frequency of yet unknown types of accidents lurking in a certain technological system. Here, the original methodology is extended so as to apply to a variety of systems that evolve during the development of large-scale technologies. A special categorization of incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of the nuclear power reactor technology, considering serious accidents whose progression involves a particular design inadequacy

  15. Research design and statistical methods in Indian medical journals: a retrospective survey.

    Science.gov (United States)

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were: study design types and their frequencies, error/defects proportion in study design, statistical analyses, and implementation of CONSORT checklist in RCT (randomized clinical trials). From 2003 to 2013, the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418): 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013: the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p < 0.05) and the proportion of errors/defects in study design decreased significantly (χ2=16.783, Φ=0.12, p < 0.05). The proportion of randomized clinical trial designs has remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p < 0.05) and interpretation. Indian medical research seems to have made no major progress regarding the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of errors.

  16. Design and implementation of new design of numerical experiments for non linear models; Conception et mise en oeuvre de nouvelles methodes d'elaboration de plans d'experiences pour l'apprentissage de modeles non lineaires

    Energy Technology Data Exchange (ETDEWEB)

    Gazut, St

    2007-03-15

This thesis addresses the problem of the construction of surrogate models in numerical simulation. Whenever numerical experiments are costly, the simulation model is complex and difficult to use. It is important then to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation, where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and re-sampling techniques. The surrogate models are constructed using neural networks and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control the over-fitting and extend the concept of leverage to surrogate models that are non-linear in their parameters. The thesis describes an iterative method called LDR (Learner Disagreement from experiment Re-sampling), based on active learning using several surrogate models constructed on bootstrap samples. The method consists in adding new experiments where the predictors constructed from bootstrap samples disagree most. We compare the LDR method with other methods of experimental design such as D-optimal selection. (author)
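
The LDR idea can be sketched as: train several surrogate models on bootstrap resamples of the available runs, then propose the candidate input where their predictions disagree most. The stand-in simulator, network size, and candidate grid below are illustrative, not the thesis implementation.

```python
# Sketch of the LDR idea: a bootstrap ensemble of neural-network
# surrogates proposes the next experiment at the point of maximum
# prediction spread. Simplified stand-in, not the thesis code.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
f = lambda x: np.sin(3 * x)                  # expensive simulator (stand-in)
X = rng.uniform(-1, 1, 8).reshape(-1, 1)     # initial experiments
y = f(X).ravel()

candidates = np.linspace(-1, 1, 201).reshape(-1, 1)
preds = []
for b in range(10):                          # bootstrap ensemble
    idx = rng.integers(0, len(X), len(X))    # resample with replacement
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                       random_state=b).fit(X[idx], y[idx])
    preds.append(net.predict(candidates))

spread = np.std(preds, axis=0)               # disagreement among learners
x_next = candidates[np.argmax(spread)]
print(f"next experiment proposed at x = {x_next[0]:.3f}")
```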

  17. The National Environmental Respiratory Center (NERC) experiment in multi-pollutant air quality health research: IV. Vascular effects of repeated inhalation exposure to a mixture of five inorganic gases.

    Science.gov (United States)

    Mauderly, J L; Kracko, D; Brower, J; Doyle-Eisele, M; McDonald, J D; Lund, A K; Seilkop, S K

    2014-09-01

    An experiment was conducted to test the hypothesis that a mixture of five inorganic gases could reproduce certain central vascular effects of repeated inhalation exposure of apolipoprotein E-deficient mice to diesel or gasoline engine exhaust. The hypothesis resulted from preceding multiple additive regression tree (MART) analysis of a composition-concentration-response database of mice exposed by inhalation to the exhausts and other complex mixtures. The five gases were the predictors most important to MART models best fitting the vascular responses. Mice on high-fat diet were exposed 6 h/d, 7 d/week for 50 d to clean air or a mixture containing 30.6 ppm CO, 20.5 ppm NO, 1.4 ppm NO₂, 0.5 ppm SO₂, and 2.0 ppm NH₃ in air. The gas concentrations were below the maxima in the preceding studies but in the range of those in exhaust exposure levels that caused significant effects. Five indicators of stress and pro-atherosclerotic responses were measured in aortic tissue. The exposure increased all five response indicators, with the magnitude of effect and statistical significance varying among the indicators and depending on inclusion or exclusion of an apparent outlying control. With the outlier excluded, three responses approximated predicted values and two fell below predictions. The results generally supported evidence that the five gases drove the effects of exhaust, and thus supported the potential of the MART approach for identifying putative causal components of complex mixtures.

  18. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    Science.gov (United States)

    Zhang, Xia; Hu, Changqin

    2017-09-08

Penicillins are typical of complex ionic samples which are likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development of cloxacillin. The structures, retention and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously and a response surface methodology (RSM) was used to achieve the optimal experiment parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the critical quality attributes (CQAs). A Plackett-Burman design was performed to test the robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study using MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Experimental Design for Hanford Low-Activity Waste Glasses with High Waste Loading

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Vienna, John D.; Crum, Jarrod V.

    2015-01-01

This report discusses the development of an experimental design for the initial phase of the Hanford low-activity waste (LAW) enhanced glass study. This report is based on a manuscript written for an applied statistics journal. Appendices A, B, and E include additional information relevant to the LAW enhanced glass experimental design that is not included in the journal manuscript. The glass composition experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC involving 15 LAW glass components. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this report. One of the glass components, SO3, has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture model expressed in the relative proportions of the 14 other components. The partial quadratic mixture model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This report describes how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study. A layered design consists of points on an outer layer, an inner layer, and a center point. There were 18 outer-layer glasses chosen using optimal experimental design software to augment 147 existing glass compositions that were within the LAW glass composition experimental region. Then 13 inner-layer glasses were chosen with the software to augment the existing and outer-layer glasses. The experimental
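
Generating candidate points that satisfy SCCs, linear MCCs, and a nonlinear MCC is the step that breaks traditional mixture-design software. A toy three-component filtering sketch is shown below; the bounds and the nonlinear "solubility" constraint are invented for illustration (the real problem has 15 components), and optimal-design software would then select design points from such a feasible candidate set.

```python
# Sketch: generate candidate mixture points satisfying single-component
# constraints, a linear multi-component constraint, and a nonlinear one.
# All bounds and constraint forms are illustrative toys.
import numpy as np

rng = np.random.default_rng(9)
cand = rng.dirichlet(alpha=[1, 1, 1], size=20000)   # x1 + x2 + x3 = 1

lower = np.array([0.05, 0.10, 0.00])    # single-component constraints
upper = np.array([0.60, 0.70, 0.20])
ok = ((cand >= lower) & (cand <= upper)).all(axis=1)
ok &= cand[:, 0] + 0.5 * cand[:, 1] <= 0.75            # linear MCC
# Nonlinear MCC: x3 must stay below a "solubility" model in x1, x2
ok &= cand[:, 2] <= 0.05 + 0.3 * cand[:, 0] * cand[:, 1]

feasible = cand[ok]
print(f"{len(feasible)} of {len(cand)} candidates satisfy all constraints")
```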

  20. Design for experience where technology meets design and strategy

    CERN Document Server

    Kim, Jinwoo

    2015-01-01

    Presents a strategic perspective and design methodology that guide the process of developing digital products and services that provide 'real experience' to users. Only when the material experienced runs its course to fulfilment is it then regarded as 'real experience' that is distinctively senseful, evaluated as valuable, and harmoniously related to others. Based on the theoretical background of human experience, the book focuses on these three questions: How can we understand the current dominant designs of digital products and services? What are the user experience factor

  1. Subdomain sensitive statistical parsing using raw corpora

    NARCIS (Netherlands)

    Plank, B.; Sima'an, K.

    2008-01-01

    Modern statistical parsers are trained on large annotated corpora (treebanks). These treebanks usually consist of sentences addressing different subdomains (e.g. sports, politics, music), which implies that the statistics gathered by current statistical parsers are mixtures of subdomains of language

  2. Statistical modeling of static strengths of nuclear graphites with relevance to structural design

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-02-01

Use of graphite materials for structural members poses a problem as to how to take into account the statistical properties of static strength, especially tensile fracture stresses, in component structural design. The present study concerns comprehensive examinations of the statistical data base and modeling of nuclear graphites. First, the report provides individual samples and their analyses of strengths of IG-110 and PGX graphites for HTTR components. The statistical characteristics of other HTGR graphites are also exemplified from the literature. Most statistical distributions of individual samples are found to be approximately normal. The goodness of fit to normal distributions is more satisfactory with larger sample sizes. Molded and extruded graphites, however, possess a variety of statistical properties depending on samples from different within-log locations and/or different orientations. Second, the previous statistical models including the Weibull theory are assessed from the viewpoint of applicability to design procedures. This leads to a conclusion that the Weibull theory and its modified ones are satisfactory only for limited parts of tensile fracture behavior. They are not consistent for whole observations. Only normal statistics are justifiable as practical approaches to discuss specified minimum ultimate strengths as statistical confidence limits for individual samples. Third, the assessment of various statistical models emphasizes the need to develop advanced analytical ones which should involve modeling of microstructural features of actual graphite materials. Improvements of other structural design methodologies are also presented. (author)

  3. Study of decolorisation of binary dye mixture by response surface methodology.

    Science.gov (United States)

    Khamparia, Shraddha; Jaspal, Dipika

    2017-10-01

Decolorisation of a complex mixture of two different classes of textile dyes, Direct Red 81 (DR81) and Rhodamine B (RHB), simulating one of the most important conditions in real textile effluent, was investigated onto deoiled Argemone Mexicana seeds (A. Mexicana). The adsorption behaviour of DR81 and RHB dyes was simultaneously analyzed in the mixture using a derivative spectrophotometric method. Central composite design (CCD) was employed for designing the experiments for this complex binary mixture, where the significance of important parameters and possible interactions were analyzed by response surface methodology (RSM). Maximum adsorption of DR81 and RHB by A. Mexicana was obtained at 53 °C after 63.33 min with 0.1 g of adsorbent and 8 × 10⁻⁶ M DR81, 12 × 10⁻⁶ M RHB, with a composite desirability of 0.99. The predicted values for percentage removal of dyes from the mixture were in good agreement with the experimental values, with R² > 96% for both the dyes. CCD-superimposed RSM confirmed that the presence of different dyes in a solution created a competition for the adsorbent sites, and hence the interaction of dyes is one of the most important factors to be studied to simulate the real effluent. The adsorbent showed remarkable adsorption capacities for both the dyes in the mixture. Copyright © 2017 Elsevier Ltd. All rights reserved.
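
A central composite design in coded units consists of a two-level factorial core, axial points at ±alpha, and replicated center points. A generic constructor is sketched below; the factor count and the rotatable axial distance are illustrative choices, not taken from the study.

```python
# Sketch: construct a central composite design in coded units for k
# factors (factorial core + axial points + center replicates).
import itertools
import numpy as np

def central_composite(k, n_center=6):
    factorial = np.array(list(itertools.product([-1, 1], repeat=k)))
    alpha = (2 ** k) ** 0.25               # rotatable axial distance
    axial = np.vstack([v * alpha * np.eye(k)[i]
                       for i in range(k) for v in (-1, 1)])
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

design = central_composite(k=4)
print(design.shape)   # (16 + 8 + 6, 4) = (30, 4) runs
```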

  4. Cogging torque optimization in surface-mounted permanent-magnet motors by using design of experiment

    Energy Technology Data Exchange (ETDEWEB)

    Abbaszadeh, K., E-mail: Abbaszadeh@kntu.ac.ir [Department of Electrical Engineering, K.N. Toosi University of Technology, Tehran (Iran, Islamic Republic of); Rezaee Alam, F.; Saied, S.A. [Department of Electrical Engineering, K.N. Toosi University of Technology, Tehran (Iran, Islamic Republic of)

    2011-09-15

Graphical abstract: Magnet segment arrangement in cross section view of one pole for PM machine. Display Omitted Highlights: → Magnet segmentation is an effective method for the cogging torque reduction. → We have used the magnet segmentation method based on the design of experiment. → We have used the RSM design of the design of experiment method. → We have solved optimization via surrogate models like the polynomial regression. → A significant reduction of the cogging torque is obtained by using RSM. - Abstract: One of the important challenges in design of the PM electrical machines is to reduce the cogging torque. In this paper, in order to reduce the cogging torque, a new method for designing the motor magnets is introduced to optimize a six pole BLDC motor by using the design of experiment (DOE) method. In this method the machine magnets consist of several identical segments which are shifted to a definite angle from each other. Design of experiment (DOE) methodology is used for a screening of the design space and for the generation of approximation models using response surface techniques. In this paper, optimization is often solved via surrogate models, that is, through the construction of response surface models (RSM) like polynomial regression. The experiments were performed based on the response surface methodology (RSM), as a statistical design of experiment approach, in order to investigate the effect of parameters on the response variations. In this investigation, the optimal shifting angles (factors) were identified to minimize the cogging torque. A significant reduction of cogging torque can be achieved with this approach after only a few evaluations of the coupled FE model.

  5. Cogging torque optimization in surface-mounted permanent-magnet motors by using design of experiment

    International Nuclear Information System (INIS)

    Abbaszadeh, K.; Rezaee Alam, F.; Saied, S.A.

    2011-01-01

Graphical abstract: Magnet segment arrangement in cross section view of one pole for PM machine. Display Omitted Highlights: → Magnet segmentation is an effective method for the cogging torque reduction. → We have used the magnet segmentation method based on the design of experiment. → We have used the RSM design of the design of experiment method. → We have solved optimization via surrogate models like the polynomial regression. → A significant reduction of the cogging torque is obtained by using RSM. - Abstract: One of the important challenges in design of the PM electrical machines is to reduce the cogging torque. In this paper, in order to reduce the cogging torque, a new method for designing the motor magnets is introduced to optimize a six pole BLDC motor by using the design of experiment (DOE) method. In this method the machine magnets consist of several identical segments which are shifted to a definite angle from each other. Design of experiment (DOE) methodology is used for a screening of the design space and for the generation of approximation models using response surface techniques. In this paper, optimization is often solved via surrogate models, that is, through the construction of response surface models (RSM) like polynomial regression. The experiments were performed based on the response surface methodology (RSM), as a statistical design of experiment approach, in order to investigate the effect of parameters on the response variations. In this investigation, the optimal shifting angles (factors) were identified to minimize the cogging torque. A significant reduction of cogging torque can be achieved with this approach after only a few evaluations of the coupled FE model.

  6. A Novel High Performance Liquid Chromatographic Method for Determination of Nystatin in Pharmaceutical Formulations by Box-Behnken Statistical Experiment Design.

    Science.gov (United States)

    Shokraneh, Farnaz; Asgharian, Ramin; Abdollahpour, Assem; Ramin, Mehdi; Montaseri, Ali; Mahboubi, Arash

    2015-01-01

In this study a novel high performance liquid chromatography (HPLC) method for the assay of nystatin in oral and vaginal tablets was optimized and validated using a Box-Behnken experimental design. The method was performed in the isocratic mode on an RP-18 column (30 °C) using a mobile phase consisting of an ammonium acetate 0.05 M buffer/methanol mixture (30:70) and a flow-rate of 1.0 mL/min. The specificity, linearity, precision, accuracy, LOD and LOQ of the method were validated. The method was linear over the range of 5-500 µg/mL with an acceptable correlation coefficient (r² = 0.9996). The method's limit of detection (LOD) and quantification (LOQ) were 0.01 and 0.025 µg/mL respectively. The results indicate that this validated method can be used as an alternative method for assay of nystatin.
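
A three-factor Box-Behnken design places runs at the midpoints of the cube edges (all pairwise ±1 combinations with the remaining factor at 0) plus center points. A generic constructor is sketched below, with no claim about which chromatographic settings the study mapped to which factor.

```python
# Sketch: construct a k-factor Box-Behnken design in coded units
# (pairwise +/-1 combinations with other factors at 0, plus centers).
import itertools
import numpy as np

def box_behnken(k=3, n_center=3):
    runs = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product([-1, 1], repeat=2):
            row = np.zeros(k)
            row[i], row[j] = a, b
            runs.append(row)
    runs += [np.zeros(k)] * n_center
    return np.array(runs)

design = box_behnken()
print(design)   # 12 edge-midpoint runs + 3 center points for k = 3
```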

  7. D-Optimal mixture experimental design for stealth biodegradable crosslinked docetaxel-loaded poly-ε-caprolactone nanoparticles manufactured by dispersion polymerization.

    Science.gov (United States)

    Ogunwuyi, O; Adesina, S; Akala, E O

    2015-03-01

We report here our efforts on the development of stealth biodegradable crosslinked poly-ε-caprolactone nanoparticles by free radical dispersion polymerization suitable for the delivery of bioactive agents. The uniqueness of the dispersion polymerization technique is that it is surfactant free, thereby obviating the problems known to be associated with the use of surfactants in the fabrication of nanoparticles for biomedical applications. Aided by statistical software for experimental design and analysis, we used D-optimal mixture statistical experimental design to generate thirty batches of nanoparticles prepared by varying the proportion of the components (poly-ε-caprolactone macromonomer, crosslinker, initiators and stabilizer) in an acetone/water system. Morphology of the nanoparticles was examined using scanning electron microscopy (SEM). Particle size and zeta potential were measured by dynamic light scattering (DLS). Scheffe polynomial models were generated to predict particle size (nm) and particle surface zeta potential (mV) as functions of the proportion of the components. Solutions were returned from simultaneous optimization of the response variables for component combinations to (a) minimize nanoparticle size (small nanoparticles are internalized into diseased organs easily, avoid reticuloendothelial clearance and lung filtration) and (b) maximize the negative zeta potential values, as it is known that, following injection into the blood stream, nanoparticles with a positive zeta potential pose a threat of causing transient embolism and rapid clearance compared to negatively charged particles. In vitro availability isotherms show that the nanoparticles sustained the release of docetaxel for 72 to 120 hours depending on the formulation. The data show that nanotechnology platforms for controlled delivery of bioactive agents can be developed based on the nanoparticles.

  8. Experimenting with a design experiment

    Directory of Open Access Journals (Sweden)

    Bakker, Judith

    2012-12-01

Full Text Available The design experiment is an experimental research method that aims to help design and further develop new (policy) instruments. For the development of a set of guidelines for the facilitation of citizens' initiatives by local governments, we are experimenting with this method. It offers good opportunities for modeling interventions by testing their instrumental validity, that is, their usefulness for the intended practical purposes. At the same time, design experiments are also useful for evaluating the empirical validity of theoretical arguments and for further developing these arguments in the light of empirical evidence (by using, e.g., the technique of pattern matching). We describe how we have applied this methodology in two cases and discuss our research approach. We encountered some unexpected difficulties, especially in the cooperation with professionals and citizens. These difficulties complicate the valid attribution of causal effects to the use of the new instrument. However, our preliminary conclusion is that design experiments are useful in our field of study

  9. Statistical modelling coupled with LC-MS analysis to predict human upper intestinal absorption of phytochemical mixtures.

    Science.gov (United States)

    Selby-Pham, Sophie N B; Howell, Kate S; Dunshea, Frank R; Ludbey, Joel; Lutz, Adrian; Bennett, Louise

    2018-04-15

    A diet rich in phytochemicals confers benefits for health by reducing the risk of chronic diseases via regulation of oxidative stress and inflammation (OSI). For optimal protective bio-efficacy, the time required for phytochemicals and their metabolites to reach maximal plasma concentrations (Tmax) should be synchronised with the time of increased OSI. A statistical model has been reported to predict the Tmax of individual phytochemicals based on molecular mass and lipophilicity. We report the application of the model for predicting the absorption profile of an uncharacterised phytochemical mixture, herein referred to as the 'functional fingerprint'. First, chemical profiles of phytochemical extracts were acquired using liquid chromatography mass spectrometry (LC-MS); then the molecular features of the respective components were used to predict their plasma absorption maxima, based on molecular mass and lipophilicity. This method of 'functional fingerprinting' of plant extracts represents a novel tool for understanding and optimising the health efficacy of plant extracts.
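
    The record does not give the published model's form or coefficients, so the sketch below stands in with a generic linear regression fitted to invented calibration data; the molecular masses, logP values and Tmax values are all hypothetical.

```python
# Illustrative only: a generic linear model Tmax ~ mass + logP fitted to
# hypothetical calibration data, then applied to LC-MS features of an
# uncharacterised extract to produce a 'functional fingerprint'.
import numpy as np
from sklearn.linear_model import LinearRegression

X_cal = np.array([[170.1, 1.1],               # [molecular mass (Da), logP]
                  [290.3, 2.3],
                  [354.3, 1.4],
                  [610.5, 0.2]])
t_max = np.array([55.0, 95.0, 80.0, 140.0])   # hypothetical Tmax (min)

model = LinearRegression().fit(X_cal, t_max)

X_new = np.array([[302.2, 1.8], [448.4, 0.9]])  # features from an LC-MS run
print(model.predict(X_new))                     # predicted per-component Tmax
```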

  10. Statistical tests for the Gaussian nature of primordial fluctuations through CBR experiments

    International Nuclear Information System (INIS)

    Luo, X.

    1994-01-01

    Information about the physical processes that generate the primordial fluctuations in the early Universe can be gained by testing the Gaussian nature of the fluctuations through cosmic microwave background radiation (CBR) temperature anisotropy experiments. One of the crucial aspects of density perturbations produced by the standard inflation scenario is that they are Gaussian, whereas seeds produced by topological defects left over from an early cosmic phase transition tend to be non-Gaussian. To carry out this test, sophisticated statistical tools are required. In this paper, we discuss several such statistical tools, including multivariate skewness and kurtosis, Euler-Poincaré characteristics, the three-point temperature correlation function, and Hotelling's T² statistic defined through bispectral estimates of a one-dimensional data set. The effect of noise present in the current data is discussed in detail and the COBE 53 GHz data set is analyzed. Our analysis shows that, on the large angular scales to which COBE is sensitive, the statistics are probably Gaussian. On small angular scales, the importance of Hotelling's T² statistic is stressed, and the minimum sample size required to test Gaussianity is estimated. Although the current data set available from various experiments at half-degree scales is still too small, improvement of the data set by roughly a factor of 2 will be enough to test Gaussianity statistically. On the arcminute scale, we analyze the recent RING data through bispectral analysis, and the result indicates a possible deviation from Gaussianity. Effects of point sources are also discussed. It is pointed out that the Gaussianity problem can be resolved in the near future by ground-based or balloon-borne experiments.

  11. Optimal Cement Mixtures Containing Mineral Admixtures under Multiple and Conflicting Criteria

    Directory of Open Access Journals (Sweden)

    Nitza M. García

    2018-01-01

    Full Text Available In the modern construction industry, the fabrication of sustainable concrete has turned the decision-making process into a challenging endeavor. One alternative is using fly ash and nanostructured silica as cement replacements. In these modern mixtures, concrete bulk density, percentage of voids, and compressive strength normally cannot be optimized individually. Hereby, a decision-making strategy on the replacement of those components is presented that takes all three performance measures into account. The relationships among those components upon concrete fabrication required a mixture design of experiments to characterize those mineral admixtures. This approach integrates different objective functions that are in conflict and obtains the best compromise mixtures for the performance measures being considered. This optimization strategy made it possible to recommend the combined use of fly ash and nanosilica to improve the concrete properties at an early age.

  12. Supporting statistics in the workplace: experiences with two hospitals

    Directory of Open Access Journals (Sweden)

    M. Y. Mortlock

    2003-01-01

    Full Text Available This paper provides some reflections on the promotion of lifelong learning in statistics in the workplace. The initiative from which the reflections are drawn is a collaboration between a university and two public hospitals, one of whose stated aims is to develop statistical skills among the hospitals' researchers. This is realized in the provision of 'biostatistical clinics' in which workplace teaching and learning of statistics takes place in one-on-one or small-group situations. The central issue identified is the need to accommodate diversity: in the backgrounds, motivations and learning needs of workplace learners (in this case medical researchers), in the workplace environments themselves, and in the projects encountered. Operational issues for the statistician in providing such training are addressed. These considerations may reflect the experiences of the wider community of statisticians involved in service provision within a larger organization.

  13. The Determination of the Optimal Material Proportion in Natural Fiber-Cement Composites Using Design of Mixture Experiments

    OpenAIRE

    Aramphongphun Chuckaphun; Ungtawondee Kampanart; Chaysuwan Duangrudee

    2016-01-01

    This research aims to determine the optimal material proportion in a natural fiber-cement composite as an alternative to an asbestos fiber-cement composite, such that the materials cost is minimized while the properties still comply with the Thai Industrial Standard (TIS) for applications of profile sheet roof tiles. Two experimental sets were studied in this research. First, a three-component mixture of (i) virgin natural fiber, (ii) synthetic fiber and (iii) cement was studied while the proportion of c...

  14. HAMMLAB 1999 experimental control room: design - design rationale - experiences

    International Nuclear Information System (INIS)

    Foerdestroemmen, N. T.; Meyer, B. D.; Saarni, R.

    1999-01-01

    A presentation of the HAMMLAB 1999 experimental control room and the accumulated experiences gathered in the areas of design and design rationale, as well as user experiences. It is concluded that the HAMMLAB 1999 experimental control room is a realistic, compact and efficient control room, well suited as an advanced NPP control room. (ml)

  15. From classroom to online teaching: experiences in improving statistics education

    Directory of Open Access Journals (Sweden)

    Anne Porter

    2003-01-01

    Full Text Available This study used reflective practitioner methodology to investigate how to improve the quality of statistical education. During the study, this methodology, curricula, pedagogical practices, assessment and a framework for learning to learn statistics were all developed as means of improving the quality of statistical education. Also documented was the move from being a classroom teacher of statistics to a teacher who is developing learning resources for online delivery to students. For a classroom teacher, flexible delivery has meant drawing on the sights, sounds, movement, quiet and live shows. By contrast, the online teacher feels the constraints of translating activity based programs to technologically based programs. As more students have chosen to rely on online materials rather than classroom activities, the focus of improving quality has been extended to the enrichment of online resources, so that the learning experience is not second to that of the classroom.

  16. Statistical experimental design approach in coal briquetting

    Energy Technology Data Exchange (ETDEWEB)

    B. Salopek; S. Pfaff; R. Rajic

    2003-07-01

    The influence of pressure, temperature, humidity and granulation of the coal upon the resistance to pressure and the water absorption of the briquettes has been tested, with the aim of examining how each of the two dependent variables changes with the values assumed by any of the four independent variables, and which of the independent variables significantly influence the dependent ones. A full factorial design with 16 experiments and a central composite design with 27 experiments have been applied. The influence of the independent variables upon the dependent ones has been examined by applying analysis of variance. The influences of the individual factors and of their interactions upon the dependent variables have been stated, as well as the coefficients of the curvilinear equations. 2 refs., 2 figs., 5 tabs.

  17. Additive mixture effects of estrogenic chemicals in human cell-based assays can be influenced by inclusion of chemicals with differing effect profiles.

    Directory of Open Access Journals (Sweden)

    Richard Mark Evans

    Full Text Available A growing body of experimental evidence indicates that the in vitro effects of mixtures of estrogenic chemicals can be well predicted from the estrogenicity of their components by the concentration addition (CA) concept. However, some studies have observed small deviations from CA. Factors affecting the presence or observation of deviations could include: the type of chemical tested; number of mixture components; mixture design; and assay choice. We designed mixture experiments that address these factors, using mixtures with high numbers of components, chemicals from diverse chemical groups, assays with different in vitro endpoints and different mixture designs and ratios. Firstly, the effects of mixtures composed of up to 17 estrogenic chemicals were examined using estrogenicity assays with reporter-gene (ERLUX) and cell proliferation (ESCREEN) endpoints. Two mixture designs were used: 1) a 'balanced' design with components present in proportion to a common effect concentration (e.g. an EC10) and 2) a 'non-balanced' design with components in proportion to potential human tissue concentrations. Secondly, the individual and simultaneous ability of 16 potential modulator chemicals (each with minimal estrogenicity) to influence the assay outcome produced by a reference mixture of estrogenic chemicals was examined. Test chemicals included plasticizers, phthalates, metals, PCBs, phytoestrogens, PAHs, heterocyclic amines, antioxidants, UV filters, musks, PBDEs and parabens. In all the scenarios tested, the CA concept provided a good prediction of mixture effects. Modulation studies revealed that chemicals possessing minimal estrogenicity themselves could reduce (negatively modulate) the effect of a mixture of estrogenic chemicals. Whether the type of modulation we observed occurs in practice most likely depends on the chemical concentrations involved, and better information is required on likely human tissue concentrations of estrogens and of potential modulators.

  18. Additive mixture effects of estrogenic chemicals in human cell-based assays can be influenced by inclusion of chemicals with differing effect profiles.

    Science.gov (United States)

    Evans, Richard Mark; Scholze, Martin; Kortenkamp, Andreas

    2012-01-01

    A growing body of experimental evidence indicates that the in vitro effects of mixtures of estrogenic chemicals can be well predicted from the estrogenicity of their components by the concentration addition (CA) concept. However, some studies have observed small deviations from CA. Factors affecting the presence or observation of deviations could include: the type of chemical tested; number of mixture components; mixture design; and assay choice. We designed mixture experiments that address these factors, using mixtures with high numbers of components, chemicals from diverse chemical groups, assays with different in vitro endpoints and different mixture designs and ratios. Firstly, the effects of mixtures composed of up to 17 estrogenic chemicals were examined using estrogenicity assays with reporter-gene (ERLUX) and cell proliferation (ESCREEN) endpoints. Two mixture designs were used: 1) a 'balanced' design with components present in proportion to a common effect concentration (e.g. an EC(10)) and 2) a 'non-balanced' design with components in proportion to potential human tissue concentrations. Secondly, the individual and simultaneous ability of 16 potential modulator chemicals (each with minimal estrogenicity) to influence the assay outcome produced by a reference mixture of estrogenic chemicals was examined. Test chemicals included plasticizers, phthalates, metals, PCBs, phytoestrogens, PAHs, heterocyclic amines, antioxidants, UV filters, musks, PBDEs and parabens. In all the scenarios tested, the CA concept provided a good prediction of mixture effects. Modulation studies revealed that chemicals possessing minimal estrogenicity themselves could reduce (negatively modulate) the effect of a mixture of estrogenic chemicals. Whether the type of modulation we observed occurs in practice most likely depends on the chemical concentrations involved, and better information is required on likely human tissue concentrations of estrogens and of potential modulators.
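
    As a minimal sketch of the concentration addition (CA) arithmetic referred to in this record (the standard Loewe-additivity formula, not the authors' code), the ECx of a mixture follows from the components' individual ECx values and their fractions in the mixture; all numbers below are hypothetical.

```python
# Concentration addition: ECx_mix = 1 / sum_i(p_i / ECx_i), where p_i is
# the fraction of component i in the mixture; values are hypothetical.
import numpy as np

def ca_ecx_mix(fractions, ecx_components):
    """ECx of a mixture predicted by concentration addition."""
    p = np.asarray(fractions, dtype=float)
    p = p / p.sum()                        # fractions must sum to 1
    ecx = np.asarray(ecx_components, dtype=float)
    return 1.0 / np.sum(p / ecx)

# Three-component estrogenic mixture, individual EC10 values in nM
print(ca_ecx_mix([0.5, 0.3, 0.2], [12.0, 150.0, 40.0]))
```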

  19. Mixture distributions of wind speed in the UAE

    Science.gov (United States)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    For the sample wind data, the adjusted coefficient of determination, the Bayesian Information Criterion (BIC) and Chi-squared statistics were computed. Results indicate that MHML presents the best parameter-estimation performance for the mixture distributions used. At most of the seven stations employed, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present a kurtotic statistical character. In particular, applications of mixture distributions for these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
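
    A minimal sketch of fitting a two-component Weibull-Weibull mixture to wind speeds by direct likelihood maximisation (the record's MHML estimator is not reproduced here; the synthetic data and starting values are illustrative):

```python
# Fit a 2-component Weibull mixture by maximum likelihood (Nelder-Mead);
# the synthetic wind-speed sample and initial guesses are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
v = np.concatenate([weibull_min.rvs(2.0, scale=4.0, size=600, random_state=rng),
                    weibull_min.rvs(3.5, scale=9.0, size=400, random_state=rng)])

def neg_log_lik(theta):
    w = 1.0 / (1.0 + np.exp(-theta[0]))    # mixing weight, kept in (0, 1)
    k1, s1, k2, s2 = np.exp(theta[1:])     # shapes and scales, kept > 0
    pdf = (w * weibull_min.pdf(v, k1, scale=s1)
           + (1.0 - w) * weibull_min.pdf(v, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

res = minimize(neg_log_lik,
               x0=[0.0, np.log(2.0), np.log(3.0), np.log(2.0), np.log(8.0)],
               method="Nelder-Mead", options={"maxiter": 5000})
print(1.0 / (1.0 + np.exp(-res.x[0])), np.exp(res.x[1:]))  # w, k1, s1, k2, s2
```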

  20. Mixture Genotoxicity of 2,4-Dichlorophenoxyacetic Acid, Acrylamide, and Maleic Hydrazide on Human Caco-2 Cells Assessed with Comet Assay

    DEFF Research Database (Denmark)

    Syberg, Kristian; Binderup, Mona-Lise; Cedergreen, Nina

    2015-01-01

    Assessment of genotoxic properties of chemicals is mainly conducted only for single chemicals, without taking mixture genotoxic effects into consideration. The current study assessed mixture effects of the three known genotoxic chemicals, 2,4-dichlorophenoxyacetic acid (2,4-D), acrylamide (AA), and maleic hydrazide (MH), in an experiment with a fixed ratio design setup. The genotoxic effects were assessed with the single-cell gel electrophoresis assay (comet assay) for both the single chemicals and the ternary mixture. The concentration ranges used were 0-1.4, 0-20, and 0-37.7 mM for 2,4-D, AA, and MH, respectively. Mixture toxicity was tested with a fixed ratio design at a 10:23:77% ratio of 2,4-D:AA:MH. Results indicated that the three chemicals yielded a synergistic mixture effect. It is not clear which mechanisms are responsible for this interaction. A few possible interactions are discussed.
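
    A sketch of the fixed-ratio design arithmetic described in this record: each dilution keeps the component ratio constant while the total concentration varies. The stated 10:23:77 percentages sum to more than 100, so they are normalised below; the total-dose series is hypothetical.

```python
# Fixed-ratio mixture design: constant 2,4-D:AA:MH ratio across a series
# of total concentrations; the totals below are hypothetical.
import numpy as np

ratio = np.array([0.10, 0.23, 0.77])          # 2,4-D, AA, MH (as stated)
ratio = ratio / ratio.sum()                   # normalise to sum to 1
totals_mM = np.array([1.0, 5.0, 10.0, 25.0, 50.0])

for total in totals_mM:
    d, aa, mh = total * ratio
    print(f"total {total:5.1f} mM -> 2,4-D {d:.2f}, AA {aa:.2f}, MH {mh:.2f} mM")
```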

  1. Designing interactive technology for crowd experiences - beyond sanitization

    DEFF Research Database (Denmark)

    Veerasawmy, Rune

    2014-01-01

    This dissertation concerns the topic of designing interactive technology for crowd experiences. It takes its outset in the experience-oriented design approach within interaction design, exploring the research question: how can we conceptually understand and design interactive technology for crowd experiences? Through theoretical studies of sociological crowd theory and pragmatist perspectives on experience, combined with design experiments at sporting events, this dissertation establishes a conceptual understanding of crowd experience. The outcome of this work is furthermore synthesized in a conceptual model of social experiences that presents crowd experiences as a distinct type of social experience. This is different from what has previously been explored within experience-oriented design. This dissertation is composed of four research papers framed by an overview that summarizes them.

  2. Graphene/TiO2/ZSM-5 composites synthesized by mixture design were used for photocatalytic degradation of oxytetracycline under visible light: Mechanism and biotoxicity

    Science.gov (United States)

    Hu, Xin-Yan; Zhou, Kefu; Chen, Bor-Yann; Chang, Chang-Tang

    2016-01-01

    This first-attempt study used a mixture design of experiments to obtain the most promising composites of TiO2 loaded on zeolite and graphene for maximal photocatalytic degradation of oxytetracycline (OTC). The optimal weight ratio of graphene, titanium dioxide (TiO2) and zeolite was determined to be 1:8:1 via a simplex-lattice mixture experimental design. The composite material was characterized by XRD, UV-vis, TEM and EDS analysis. The findings showed that the composite material had higher stability and stronger absorption of visible light. In addition, it was uniformly dispersed with promising adsorption characteristics. OTC was used as a model toxicant to evaluate the photodegradation efficiency of the GTZ (1:8:1). At optimal operating conditions (i.e., pH 7 and 25 °C), complete degradation (ca. 100%) was achieved in 180 min. The biotoxicity of the degraded intermediates of OTC on cell growth of Escherichia coli DH5α was also assayed. After 180 min of photocatalytic treatment, OTC solution treated by GTZ (1:8:1) showed insignificant biotoxicity to receptor DH5α cells. Furthermore, EDTA (a hole scavenger) and t-BuOH (a radical scavenger) were used to detect the main active oxidative species in the system. The results showed that holes are the main oxidative species in the photocatalytic process.

  3. Increasing the statistical significance of entanglement detection in experiments.

    Science.gov (United States)

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, once the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequalities for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  4. Eye tracking in user experience design

    CERN Document Server

    Romano Bergstrom, Jennifer

    2014-01-01

    Eye Tracking for User Experience Design explores the many applications of eye tracking to better understand how users view and interact with technology. Ten leading experts in eye tracking discuss how they have taken advantage of this new technology to understand, design, and evaluate user experience. Real-world stories are included from these experts, who have used eye tracking during the design and development of products ranging from information websites to immersive games. They also explore recent advances in the technology which tracks how users interact with mobile devices, large-screen displays and video game consoles. Methods for combining eye tracking with other research techniques for a more holistic understanding of the user experience are discussed. This is an invaluable resource for those who want to learn how eye tracking can be used to better understand and design for their users. * Includes highly relevant examples and information for those who perform user research and design interactive experiences.

  5. A Modified Jonckheere Test Statistic for Ordered Alternatives in Repeated Measures Design

    Directory of Open Access Journals (Sweden)

    Hatice Tül Kübra AKDUR

    2016-09-01

    Full Text Available In this article, a new test based on the Jonckheere test [1] is presented for randomized blocks with dependent observations within blocks. A weighted sum of the block statistics is used, rather than the unweighted sum proposed by Jonckheere. For Jonckheere-type statistics, the main assumption is independence of observations within blocks; in repeated measures designs this assumption is violated. The weighted Jonckheere-type statistic is applied to dependent observations under different variance-covariance structures, with an ordered alternative hypothesis for each block in the design. The proposed statistic is also compared to the existing Jonckheere-based test in terms of type I error rates via Monte Carlo simulation. For strong correlations, the circular bootstrap version of the proposed Jonckheere test provides lower type I error rates.
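
    A minimal sketch of the combination idea described above: a Jonckheere-Terpstra statistic is computed per block and the blocks are combined with a weighted rather than unweighted sum. The weights and data below are illustrative, not the authors' scheme.

```python
# Per-block Jonckheere-Terpstra statistics combined with a weighted sum;
# block weights and the toy data are illustrative.
import numpy as np

def jonckheere(groups):
    """Jonckheere-Terpstra statistic for groups in hypothesised order."""
    j = 0.0
    for a in range(len(groups) - 1):
        for b in range(a + 1, len(groups)):
            x = np.asarray(groups[a])[:, None]
            y = np.asarray(groups[b])[None, :]
            j += np.sum(y > x) + 0.5 * np.sum(y == x)  # ties count half
    return j

blocks = [  # two blocks, three ordered treatments each (toy data)
    [[1.2, 0.9], [1.8, 2.1], [2.9, 3.3]],
    [[0.7, 1.1], [1.4, 1.6], [2.2, 2.8]],
]
weights = [0.6, 0.4]          # hypothetical block weights
print(sum(w * jonckheere(b) for w, b in zip(weights, blocks)))
```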

  6. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  7. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.

    Science.gov (United States)

    Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C

    2016-05-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.

  8. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and refine computational models. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  9. Deliberate ignition of hydrogen-air-steam mixtures in condensing steam environments

    International Nuclear Information System (INIS)

    Blanchat, T.K.; Stamps, D.W.

    1997-05-01

    Large scale experiments were performed to determine the effectiveness of thermal glow plug igniters to burn hydrogen in a condensing steam environment due to the presence of water sprays. The experiments were designed to determine if a detonation or accelerated flame could occur in a hydrogen-air-steam mixture which was initially nonflammable due to steam dilution but was rendered flammable by rapid steam condensation due to water sprays. Eleven Hydrogen Igniter Tests were conducted in the test vessel. The vessel was instrumented with pressure transducers, thermocouple rakes, gas grab sample bottles, hydrogen microsensors, and cameras. The vessel contained two prototypic engineered systems: (1) a deliberate hydrogen ignition system and (2) a water spray system. Experiments were conducted under conditions scaled to be nearly prototypic of those expected in Advanced Light Water Reactors (such as the Combustion Engineering (CE) System 80+), with prototypic spray drop diameter, spray mass flux, steam condensation rates, and hydrogen injection flow rates, and using the actual proposed plant igniters. The lack of any significant pressure increase during the majority of the burn and condensation events signified that localized, benign hydrogen deflagration(s) occurred with no significant pressure load on the containment vessel. Igniter location did not appear to be a factor in the open geometry. Initially stratified tests with a stoichiometric mixture in the top showed that the water spray effectively mixes the initially stratified atmosphere prior to the deflagration event. All tests demonstrated that thermal glow plugs ignite hydrogen-air-steam mixtures under conditions with water sprays near the flammability limits previously determined for hydrogen-air-steam mixtures under quiescent conditions. This report describes these experiments, gives experimental results, and provides interpretation of the results. 12 refs., 127 figs., 16 tabs.

  10. Optimization of glibenclamide tablet composition through the combined use of differential scanning calorimetry and D-optimal mixture experimental design.

    Science.gov (United States)

    Mura, P; Furlanetto, S; Cirri, M; Maestrelli, F; Marras, A M; Pinzauti, S

    2005-02-07

    A systematic analysis of the influence of different proportions of excipients on the stability of a solid dosage form was carried out. In particular, a D-optimal mixture experimental design was applied for the evaluation of glibenclamide compatibility in tablet formulations consisting of four classic excipients (Natrosol as binding agent, stearic acid as lubricant, sorbitol as diluent, and cross-linked polyvinylpyrrolidone as disintegrant). The goal was to find the mixture component proportions that correspond to the optimal drug melting parameters, i.e. its maximum stability, using differential scanning calorimetry (DSC) to quickly obtain information about possible interactions among the formulation components. Two indices of the degree of drug-excipient interaction were chosen: the absolute difference between the melting peak temperature of the pure drug endotherm and that in each analysed mixture, and the absolute difference between the melting enthalpy of pure glibenclamide and that of its melting peak in the different analysed mixtures.

  11. Mixture design of rice flour, maize starch and wheat starch for optimization of gluten free bread quality.

    Science.gov (United States)

    Mancebo, Camino M; Merino, Cristina; Martínez, Mario M; Gómez, Manuel

    2015-10-01

    Gluten-free bread production requires gluten-free flours or starches. Rice flour and maize starch are two of the most commonly used raw materials. In recent years, gluten-free wheat starch has also become available on the market. The aim of this research was to optimize mixtures of rice flour, maize starch and wheat starch using an experimental mixture design. For this purpose, dough rheology and its fermentation behaviour were studied. Bread quality parameters such as specific volume, texture, cell structure, colour and acceptability were also analysed. Generally, starch incorporation reduced G* and increased the bread specific volume and cell density, but the breads obtained were paler than the rice flour breads. Comparing the starches, wheat starch breads had better overall acceptability and greater volume than maize starch breads. The highest sensorial acceptability corresponded to the bread produced with a mixture of rice flour (59 g/100 g) and wheat starch (41 g/100 g).
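
    As an illustration of one common way to lay out such a three-component mixture experiment (a simplex-lattice candidate set; the record does not state which mixture design the authors used), the blends below enumerate all proportions in multiples of 1/m that sum to 1:

```python
# Candidate blends for a {3, m} simplex-lattice mixture design of
# rice flour, maize starch and wheat starch; m = 4 is illustrative.
from itertools import product

def simplex_lattice(q=3, m=4):
    """All q-component blends with proportions in multiples of 1/m."""
    return [tuple(c / m for c in combo)
            for combo in product(range(m + 1), repeat=q)
            if sum(combo) == m]

for rice, maize, wheat in simplex_lattice():
    print(f"rice {rice:.2f}  maize {maize:.2f}  wheat {wheat:.2f}")
```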

  12. A study of finite mixture model: Bayesian approach on financial time series data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model is a mixture of distributions used to model a statistical distribution, while the Bayesian method is a statistical method used to fit the mixture model. The Bayesian method is widely used because it has asymptotic properties which provide remarkable results; in addition, it shows consistency, meaning that the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is chosen using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to invalid results. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. The results showed that there is a negative relationship between rubber prices and stock market prices for all selected countries.
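
    A minimal sketch of BIC-based selection of the number of mixture components k (here with scikit-learn's EM-fitted Gaussian mixture on synthetic data, standing in for the paper's Bayesian fit):

```python
# Choose k for a finite (Gaussian) mixture by minimising BIC;
# the synthetic returns-like data are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
x = np.concatenate([rng.normal(-0.02, 0.01, 400),
                    rng.normal(0.01, 0.03, 600)]).reshape(-1, 1)

bics = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(x)
    bics[k] = gm.bic(x)

print(bics, "-> selected k =", min(bics, key=bics.get))
```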

  13. Factorial Design Approach in Proportioning Prestressed Self-Compacting Concrete.

    Science.gov (United States)

    Long, Wu-Jian; Khayat, Kamal Henri; Lemieux, Guillaume; Xing, Feng; Wang, Wei-Lun

    2015-03-13

    In order to model the effect of mixture parameters and material properties on the hardened properties of prestressed self-compacting concrete (SCC), and also to investigate the extensions of the statistical models, a factorial design was employed to identify the relative significance of these primary parameters and their interactions in terms of the mechanical and visco-elastic properties of SCC. In addition to the 16 fractional factorial mixtures evaluated in the modeled region of -1 to +1, eight axial mixtures were prepared at extreme values of -2 and +2, with the other variables maintained at the central points. Four replicate central mixtures were also evaluated. The effects of five mixture parameters, including binder type, binder content, dosage of viscosity-modifying admixture (VMA), water-cementitious materials ratio (w/cm), and sand-to-total aggregate ratio (S/A), on compressive strength, modulus of elasticity, and autogenous and drying shrinkage are discussed. The application of the models to better understand trade-offs between mixture parameters and to carry out comparisons among various responses is also highlighted. A logical design approach would be to use the existing model to predict the optimal design, and then run selected tests to quantify the influence of a new binder on the model.
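
    A sketch of the design matrix described above in coded units: a 2^(5-1) fractional factorial (16 runs; the generator E = ABCD is an assumption, as the record does not state it), axial runs at ±2, and four replicated centre points. The study used eight axial mixtures; the sketch generates all ten for illustration.

```python
# Coded design: 2^(5-1) fractional factorial + axial points + centre runs.
import numpy as np
from itertools import product

# 16 fractional factorial runs: full 2^4 in A..D, E aliased with ABCD
frac = [(a, b, c, d, a * b * c * d) for a, b, c, d in
        product((-1, 1), repeat=4)]

axial = []
for i in range(5):            # one factor at -2/+2, the rest at the centre
    for level in (-2, 2):
        pt = [0] * 5
        pt[i] = level
        axial.append(tuple(pt))

center = [(0, 0, 0, 0, 0)] * 4        # replicated centre mixtures

design = np.array(frac + axial + center)
print(design.shape)                   # (30, 5) coded runs
```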

  14. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    Directory of Open Access Journals (Sweden)

    Chaeyoung Lee

    2012-11-01

    Full Text Available Epistasis, which may explain a large portion of the phenotypic variation for complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size, in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.

  15. Real Life Experiences with Experience Design

    DEFF Research Database (Denmark)

    Dalsgård, Peter; Halskov, Kim

    2006-01-01

    technologies for knowledge dissemination and marketing, in cooperation with public institutions and businesses. We argue that collaborative formulation of core design intentions and values is a valuable instrument in guiding experience design processes, and present three cases from this project, two of which resulted in interactive installations. The case installations range from walk-up-and-use consoles to immersive, responsive environments based on bodily interaction. We compare the installations and discuss the interrelations between the resulting interfaces and the intentions for creating them.

  16. Effects of the cellulose, xylan and lignin constituents on biomass pyrolysis characteristics and bio-oil composition using the Simplex Lattice Mixture Design method

    International Nuclear Information System (INIS)

    Fan, Yongsheng; Cai, Yixi; Li, Xiaohua; Jiao, Lihua; Xia, Jisheng; Deng, Xiuli

    2017-01-01

    Highlights: • The Simplex Lattice Mixture Design was first applied to study the biomass pyrolysis process. • Interactions between the constituents affect biomass pyrolysis behavior. • Biomass pyrolysis behavior can be predicted from the ratios of the three constituents. • Bio-oil composition was affected by the constituents and their pyrolysis products. - Abstract: In order to clarify the relationships between the biomass pyrolysis mechanism and the main constituents, the effects of the main constituents on biomass pyrolysis characteristics were first determined by thermo-gravimetric analysis based on the Simplex Lattice Mixture Design, to investigate whether the pyrolysis behavior of a given lignocellulosic biomass can be predicted when its main constituent contents are known. The results showed that there are constituent interactions in the pyrolysis process, which are intuitively reflected in the change of the kinetics parameters. Mathematical models for calculating kinetics values were established, and the models were shown to be valid for predicting lignocellulosic biomass pyrolysis behavior. In addition, the effects of biomass constituents on bio-oil composition were explored by subsequent vacuum pyrolysis experiments. Xylan pyrolysis had a certain inhibitory effect on the pyrolysis of cellulose, and the pyrolysis products of lignin might promote the further decomposition of sugars from cellulose pyrolysis, while the interaction between xylan and lignin had little effect on the bio-oil composition.

  17. A simple approach to polymer mixture miscibility.

    Science.gov (United States)

    Higgins, Julia S; Lipson, Jane E G; White, Ronald P

    2010-03-13

    Polymeric mixtures are important materials, but the control and understanding of mixing behaviour poses problems. The original Flory-Huggins theoretical approach, using a lattice model to compute the statistical thermodynamics, provides the basic understanding of the thermodynamic processes involved but is deficient in describing most real systems, and has little or no predictive capability. We have developed an approach using a lattice integral equation theory, and in this paper we demonstrate that this not only describes well the literature data on polymer mixtures but allows new insights into the behaviour of polymers and their mixtures. The characteristic parameters obtained by fitting the data have been successfully shown to be transferable from one dataset to another, to be able to correctly predict behaviour outside the experimental range of the original data and to allow meaningful comparisons to be made between different polymer mixtures.

  18. Engineering design of the FRX-C experiment

    International Nuclear Information System (INIS)

    Kewish, R.W. Jr.; Bartsch, R.R.; Siemon, R.E.

    1981-01-01

    Research on Compact Toroid (CT) configurations has been greatly accelerated in the last few years because of their potential for providing a practical and economical fusion system. Los Alamos research is being concentrated on two types of configurations: (1) magnetized-gun-produced Spheromaks (configurations that contain a mixture of toroidal and poloidal fields); and (2) field-reversed configurations (FRCs) that contain purely poloidal magnetic field. This paper describes the design of FRX-C, a field-reversed theta pinch used to form FRCs

  19. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    Science.gov (United States)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
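
    A minimal sketch of the sampling step described in this record: 50 boundary-condition sets drawn by Latin Hypercube Sampling over the significant design and flow factors. The factor names and ranges below are illustrative; the actual factors are those retained by the paper's screening study.

```python
# 50 LHS-sampled boundary-condition sets over illustrative factor ranges.
import numpy as np
from scipy.stats import qmc

bounds = {                                # hypothetical factors and ranges
    "surface_overflow_rate_m_h": (0.5, 2.5),
    "feed_solids_g_L": (2.0, 6.0),
    "tank_depth_m": (3.0, 5.0),
}
lo = np.array([b[0] for b in bounds.values()])
hi = np.array([b[1] for b in bounds.values()])

sampler = qmc.LatinHypercube(d=len(bounds), seed=1)
sets_50 = qmc.scale(sampler.random(n=50), lo, hi)  # one row per CFD run
print(sets_50[:3])
```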

  20. Model-based experimental design for assessing effects of mixtures of chemicals

    NARCIS (Netherlands)

    Baas, J.; Stefanowicz, A.M.; Klimek, B.; Laskowski, R.; Kooijman, S.A.L.M.

    2010-01-01

    We exposed flour beetles (Tribolium castaneum) to a mixture of four poly aromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for

  1. Phase behavior of binary polybutadiene copolymer mixtures as an example of weakly interacting polymers

    CERN Document Server

    Schwahn, D

    2002-01-01

    Binary blends of statistical polybutadiene copolymers of different vinyl content and molar volume were explored by small-angle neutron scattering. These samples represent the simplest class of statistical copolymer mixtures. In spite of this simplicity, changes in vinyl content, molar volume, and deuterium and hydrogen content of the chains give rise to strong effects; phase separation occurs from −230 °C to more than +200 °C and can even reverse from an enthalpically driven one at low temperatures to an entropically driven one at high temperatures. The entropic and enthalpic terms of the Flory-Huggins parameter as determined from the experiment are in excellent agreement with lattice cluster theory calculations. (orig.)

  2. Application of mixture experimental design in the formulation and optimization of matrix tablets containing carbomer and hydroxy-propylmethylcellulose.

    Science.gov (United States)

    Petrovic, Aleksandra; Cvetkovic, Nebojsa; Ibric, Svetlana; Trajkovic, Svetlana; Djuric, Zorica; Popadic, Dragica; Popovic, Radmila

    2009-12-01

    Using a mixture experimental design, the effect of combining carbomer (Carbopol® 971P NF) and hydroxypropylmethylcellulose (Methocel® K100M or Methocel® K4M) on the release profile and on the mechanism of drug liberation from matrix tablets was investigated. A numerical optimization procedure was also applied to establish and obtain a formulation with the desired drug release. The amount of TP released, the release rate and the mechanism varied with the carbomer ratio in the total matrix and with the HPMC viscosity. Increasing carbomer fractions led to a decrease in drug release. Anomalous diffusion was found in all matrices containing carbomer, while Case-II transport was predominant for tablets based on HPMC only. The predicted and obtained profiles for the optimized formulations showed similarity. These results indicate that a simplex-lattice mixture experimental design and numerical optimization procedure can be applied during development to obtain a sustained-release matrix formulation with the desired release profile.

  3. Quasi-Chemical PC-SAFT: An Extended Perturbed Chain-Statistical Associating Fluid Theory for Lattice-Fluid Mixtures.

    Science.gov (United States)

    Parvaneh, Khalil; Shariati, Alireza

    2017-09-07

    In this study, a new modification of the perturbed chain-statistical associating fluid theory (PC-SAFT) has been proposed by incorporating the lattice fluid theory of Guggenheim as an additional term to the original PC-SAFT terms. As the proposed model has one more term than PC-SAFT, a new mixing rule has been developed especially for the additional term, while for the conventional terms of PC-SAFT the one-fluid mixing rule is used. In order to evaluate the proposed model, vapor-liquid equilibria were estimated for binary CO2 mixtures with 16 different ionic liquids (ILs) of the 1-alkyl-3-methylimidazolium family with various anions consisting of bis(trifluoromethylsulfonyl)imide, hexafluorophosphate, tetrafluoroborate, and trifluoromethanesulfonate. For a comprehensive comparison, three different modes (different adjustable parameters) of the proposed model were compared with the conventional PC-SAFT. Results indicate that the proposed modification of the PC-SAFT EoS is generally more reliable than the conventional PC-SAFT in all three proposed modes of vapor-liquid equilibria, giving good agreement with literature data.

  4. Practical Statistics for Environmental and Biological Scientists

    CERN Document Server

    Townend, John

    2012-01-01

    All students and researchers in environmental and biological sciences require statistical methods at some stage of their work. Many have a preconception that statistics are difficult and unpleasant, and find that the textbooks available are difficult to understand. Practical Statistics for Environmental and Biological Scientists provides a concise, user-friendly, non-technical introduction to statistics. The book covers planning and designing an experiment, how to analyse and present data, and the limitations and assumptions of each statistical method. The text does not refer to a specific computer package.

  5. A Blended Learning Experience in Statistics for Psychology Students Using the Evaluation as a Learning Tool

    Directory of Open Access Journals (Sweden)

    Alberto VALENTÍN CENTENO

    2016-05-01

    Full Text Available The teaching of the Applied Psychology statistics course was based on different teaching models that incorporate active teaching methodologies. In this experience, approaches that prioritize the use of ICT were combined with others in which evaluation becomes an element of learning. This involved the use of virtual platforms to support teaching, which facilitate learning, and activities combining face-to-face and non-face-to-face work. The design of the components of the course is inspired by the dimensions proposed by the Carless (2003) model, which uses evaluation as a learning element. The development of this experience has shown that the didactic proposal was interpreted positively by students. Students recognized that they had to learn and deeply understand the basic concepts of the subject, so that they could teach and assess their peers.

  6. Linear kinetic theory and particle transport in stochastic mixtures

    International Nuclear Information System (INIS)

    Pomraning, G.C.

    1994-03-01

    The primary goal of this research is to develop a comprehensive theory of linear transport/kinetic theory in a stochastic mixture of solids and immiscible fluids. The statistics considered correspond to N-state discrete random variables for the interaction coefficients and sources, with N denoting the number of components of the mixture. The mixing statistics studied are Markovian as well as more general statistics, such as renewal processes. A further goal of this work is to demonstrate the applicability of the formalism to real-world engineering problems. This three-year program was initiated June 15, 1993, and has been underway nine months. Many significant results have been obtained, both in the formalism development and in representative applications. These results are summarized by listing the archival publications resulting from this grant, including the abstracts taken directly from the papers.

  7. Designing experiments on thermal interactions by secondary-school students in a simulated laboratory environment

    Science.gov (United States)

    Lefkos, Ioannis; Psillos, Dimitris; Hatzikraniotis, Euripides

    2011-07-01

    Background and purpose: The aim of this study was to explore the effect of investigative activities with manipulations in a virtual laboratory on students' ability to design experiments. Sample: Fourteen students in a lower secondary school in Greece attended a teaching sequence on thermal phenomena based on the use of information and communication technology, and specifically of the simulated virtual laboratory 'ThermoLab'. Design and methods: A pre-post comparison was applied. Students' design of experiments was rated in eight dimensions, namely hypothesis forming and verification, selection of variables, initial conditions, device settings, materials and devices used, and process and phenomena description. A three-level ranking scheme was employed for the evaluation of students' answers in each dimension. Results: A Wilcoxon signed-rank test revealed a statistically significant difference between the students' pre- and post-test scores. Additional analysis comparing the pre- and post-test scores using the Hake gain showed high gains in all but one dimension, which suggests that this improvement was almost inclusive. Conclusions: We consider that our findings support the statement that there was an improvement in students' ability to design experiments.

  8. Experimental design of a waste glass study

    International Nuclear Information System (INIS)

    Piepel, G.F.; Redgate, P.E.; Hrma, P.

    1995-04-01

    A Composition Variation Study (CVS) is being performed to support a future high-level waste glass plant at Hanford. A total of 147 glasses, covering a broad region of compositions melting at approximately 1150 degrees C, were tested in five statistically designed experimental phases. This paper focuses on the goals, strategies, and techniques used in designing the five phases. The overall strategy was to investigate glass compositions on the boundary and interior of an experimental region defined by single-component, multiple-component, and property constraints. Statistical optimal experimental design techniques were used to cover various subregions of the experimental region in each phase. Empirical mixture models for glass properties (as functions of glass composition) from previous phases were used in designing subsequent CVS phases.

  9. Statistical literacy for clinical practitioners

    CERN Document Server

    Holmes, William H

    2014-01-01

    This textbook on statistics is written for students in medicine, epidemiology, and public health. It builds on the important role evidence-based medicine now plays in the clinical practice of physicians, physician assistants and allied health practitioners. By bringing research design and statistics to the fore, this book can integrate these skills into the curricula of professional programs. Students, particularly practitioners-in-training, will learn statistical skills that are required of today's clinicians. Practice problems at the end of each chapter and downloadable data sets provided by the authors ensure readers get practical experience that they can then apply to their own work. Topics covered include: Functions of Statistics in Clinical Research; Common Study Designs; Describing Distributions of Categorical and Quantitative Variables; Confidence Intervals and Hypothesis Testing; Documenting Relationships in Categorical and Quantitative Data; Assessing Screening and Diagnostic Tests; Comparing Mean...

  10. Minimizing hospital losses in x-ray films using design of experiments

    International Nuclear Information System (INIS)

    Aljohani, M.S.; Moreb, A.A.; Naser Al Qasabi; Bandar Sobahi

    2004-01-01

    Exposure of patients to excessive radiation carries an unnecessary increase in cancer risk. An X-ray film is considered a reject, and is to be repeated, if the contrast between the image of the organ and the film's background is low. Applying a two-stage approach of Design of Experiments (DOE), this paper considers the factors affecting the contrast of the X-ray film, sorts out the most influential factors, and identifies the optimum settings for each factor. Furthermore, the contribution of each factor to the optimal contrast is statistically identified. (Author)

  11. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. Firstly, the Latin Hypercube Sampling Method, instead of the conventional Experimental Design Technique, is utilized as the input sampling method for a regression analysis in order to evaluate its sampling efficiency. Secondly, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method', which uses the mean values of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence of the input variables is verified through a 'Correlation Coefficient Test' for the statistical treatment of their uncertainties, and the distribution type of the DNBR response is determined through a 'Goodness-of-Fit Test'. Finally, the limit DNBR with one-sided 95% probability and 95% confidence level, DNBR95/95, is estimated. The advantage of this methodology over the conventional statistical method using the Response Surface and Monte Carlo simulation technique lies in the simplicity of the analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the determination of the limit DNBR, where the DNBR margin is the difference between the nominal DNBR and the limit DNBR. The second case is the determination of the nominal DNBR, where the DNBR margin is the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives good agreement in the DNBR results.
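
    A minimal sketch of the 'Modified Latin Hypercube Sampling' idea described above: each equal-probability stratum of every input contributes a central value (here the stratum midpoint in probability space, standing in for the interval mean) instead of a random draw, so tail extremes are avoided while the pairing across variables stays random. The input distributions below are illustrative.

```python
# Midpoint-per-stratum Latin Hypercube: deterministic stratum centres,
# random pairing across variables; input distributions are illustrative.
import numpy as np
from scipy.stats import norm

def midpoint_lhs(n_samples, n_vars, rng):
    u = (np.arange(n_samples) + 0.5) / n_samples  # stratum centres in (0,1)
    return np.column_stack([rng.permutation(u) for _ in range(n_vars)])

rng = np.random.default_rng(7)
u = midpoint_lhs(16, 3, rng)
# Map probability-space strata to normally distributed input variables
x = norm.ppf(u, loc=[1.0, 0.0, 5.0], scale=[0.1, 1.0, 0.5])
print(x[:4])
```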

  12. Exposure to Pb, Cd, and As mixtures potentiates the production of oxidative stress precursors: 30-day, 90-day, and 180-day drinking water studies in rats.

    Science.gov (United States)

    Whittaker, Margaret H; Wang, Gensheng; Chen, Xue-Qing; Lipsky, Michael; Smith, Donald; Gwiazda, Roberto; Fowler, Bruce A

    2011-07-15

    Exposure to chemical mixtures is a common and important determinant of toxicity and is of particular concern due to the appearance of such mixtures in sources of drinking water. Despite this, few in vivo mixture studies have been conducted to date to understand the health impact of chemical mixtures compared to single chemicals. Interactive effects of lead (Pb), cadmium (Cd) and arsenic (As) were evaluated in 30-, 90-, and 180-day factorial design drinking water studies in rats, designed to test the hypothesis that ingestion of such mixtures at individual component Lowest-Observed-Effect Levels (LOELs) results in increased levels of the pro-oxidant delta-aminolevulinic acid (ALA), iron, and copper. LOEL levels of Pb, Cd, and As mixtures resulted in the increased presence of mediators of oxidative stress such as ALA, copper, and iron. ALA increases were followed by statistically significant increases in kidney copper in the 90- and 180-day studies. Statistical evidence of interaction was identified for six biologically relevant variables: blood delta-aminolevulinic acid dehydratase (ALAD), kidney ALAD, urinary ALA, urinary iron, kidney iron, and kidney copper. The current investigations underscore the importance of considering interactive effects that common toxic agents such as Pb, Cd, and As may have upon one another at low dose levels. The interactions between known toxic trace elements at biologically relevant concentrations shown here demonstrate a clear need to rigorously review the methods by which national/international agencies assess the health risks of chemicals, since exposures may commonly occur as complex mixtures.

  13. Permanence of diced cartilage, bone dust and diced cartilage/bone dust mixture in experimental design in twelve weeks.

    Science.gov (United States)

    Islamoglu, Kemal; Dikici, Mustafa Bahadir; Ozgentas, Halil Ege

    2006-09-01

    Bone dust and diced cartilage are used for contour restoration because of their minimal donor site morbidity. The purpose of this study was to investigate the permanence of bone dust, diced cartilage, and a bone dust/diced cartilage mixture in rabbits over 12 weeks. New Zealand white rabbits were used for this study. There were three groups: Group I, 1 mL bone dust; Group II, 1 mL diced cartilage; Group III, 0.5 mL bone dust + 0.5 mL diced cartilage mixture. The grafts were placed into the subcutaneous tissue of the rabbits and removed 12 weeks later. The mean volumes were 0.23 +/- 0.08 mL in group I, 0.60 +/- 0.12 mL in group II, and 0.36 +/- 0.10 mL in group III. The differences between groups were found to be statistically significant. In conclusion, diced cartilage was found to be more reliable than bone dust in terms of preserving its volume over a long period.

  14. Quantitative Characterization of the Toxicities of Cd-Ni and Cd-Cr Binary Mixtures Using Combination Index Method

    Directory of Open Access Journals (Sweden)

    Lingyun Mo

    2016-01-01

    Full Text Available Direct equipartition ray design was used to construct Cd-Ni and Cd-Cr binary mixtures. Microplate toxicity analysis was used to evaluate the toxicity of the individual substances and of the Cd-Ni and Cd-Cr mixtures on Chlorella pyrenoidosa and Selenastrum capricornutum. The interactive toxicity of the mixtures was analyzed with the concentration addition (CA) model. In addition, the combination index (CI) method was proposed and used to quantitatively characterize the toxicity of the binary mixtures of Cd-Ni and Cd-Cr observed in experiment and to find the degree of deviation from the outcome predicted by the CA model, that is, the intensity of the interactive toxicity. Results indicate that most of the 20 binary mixtures exhibit enhancing and synergistic effects, and only the Cd-Cr-R4 and Cd-Cr-R5 mixtures have relatively high antagonistic effects against C. pyrenoidosa. Based on confidence intervals, CI can compare the intensities of interaction of the mixtures under varying levels of effect. The characterization methods are applicable for analyzing binary mixtures with complex interactions.
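    For readers unfamiliar with the concentration addition (CA) reference model used above, a minimal sketch follows; the EC50 values are hypothetical, not the paper's data.

        import numpy as np

        def ca_ecx_mix(p, ecx):
            # Concentration-addition prediction of the mixture ECx:
            #   ECx_mix = ( sum_i p_i / ECx_i )**-1
            # where p_i is the concentration fraction of component i in the
            # mixture ray and ECx_i its single-substance effect concentration.
            p, ecx = np.asarray(p, float), np.asarray(ecx, float)
            return 1.0 / np.sum(p / ecx)

        # Hypothetical EC50 values (mg/L) for a 1:1 Cd-Ni ray
        print(ca_ecx_mix([0.5, 0.5], [1.2, 3.4]))

    Observed mixture toxicity stronger than this prediction indicates synergism; weaker indicates antagonism, which is what the combination index quantifies.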

  15. User Experience Design (UX Design) in a Website Development : Website redesign

    OpenAIRE

    Orlova, Mariia

    2016-01-01

    The purpose of the study was to implement a user experience approach for a website design. I concentrated mostly on revealing and understanding the concepts of UX design, which include usability, visual design, and the human factors affecting the user experience. Another aim of the study was to investigate people's behaviour related to web design. The thesis is based on a project. The project was to redesign an existing web design for a company called Positive Communications. They provide differe...

  16. Resourcing of Experience in Co-Design

    DEFF Research Database (Denmark)

    Ylirisku, Salu; Revsbæk, Line; Buur, Jacob

    2017-01-01

    ... knowledge to benefit its cultivation is expected to be highly valuable in contemporary multi-cultural design work. This paper approaches the study of the involvement of various stakeholders in design projects through a lens of resourcing experience. Building from G. H. Mead's pragmatist theory, we devise ... and Scandinavia. By identifying ways in which experience is resourced in specific design interactions, the paper illustrates resourcing to be responsive, conceptual and habitual. The paper concludes by pinpointing strategic means that design teams may use in order to enable rich involvement and resourcing ...

  17. Generation of two-dimensional binary mixtures in complex plasmas

    Science.gov (United States)

    Wieben, Frank; Block, Dietmar

    2016-10-01

    Complex plasmas are an excellent model system for strong coupling phenomena. Under certain conditions the dust particles immersed in the plasma form crystals which can be analyzed in terms of structure and dynamics. Previous experiments focussed mostly on monodisperse particle systems, whereas dusty plasmas in nature and technology are polydisperse. Thus, a first and important step towards experiments in polydisperse systems is the study of binary mixtures. Recent experiments on binary mixtures under microgravity conditions observed a phase separation of particle species with different radii, even for small size disparities. This contradicts several numerical studies of 2D binary mixtures. Therefore, dedicated experiments are required to gain more insight into the physics of polydisperse systems. In this contribution, first ground-based experiments on two-dimensional binary mixtures are presented. Particular attention is paid to the requirements for the generation of such systems, which involve consideration of the temporal evolution of the particle properties. Furthermore, the structure of these two-component crystals is analyzed and compared to simulations. This work was supported by the Deutsche Forschungsgemeinschaft DFG in the framework of the SFB TR24 Greifswald Kiel, Project A3b.

  18. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    Science.gov (United States)

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the models of noise often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch in tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmoothes real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising of real image sensor data is indeed improved by this new technique.
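    As a rough illustration of the proposed noise family (not the paper's fitted model), the sketch below draws per-pixel counts from a two-component Poisson mixture; the gains and weights are made-up values.

        import numpy as np

        def poisson_mixture_noise(signal, gains=(0.8, 1.6), weights=(0.9, 0.1), seed=None):
            # Each pixel first picks a mixture component, then draws a Poisson
            # count with mean gain * signal. The minor high-gain component
            # lengthens the tails relative to a single Poisson model.
            rng = np.random.default_rng(seed)
            comp = rng.choice(len(weights), p=weights, size=signal.shape)
            lam = np.take(gains, comp) * signal
            return rng.poisson(lam)

        clean = np.full((64, 64), 50.0)        # flat test patch
        noisy = poisson_mixture_noise(clean, seed=0)
        print(noisy.mean(), noisy.var())       # variance exceeds the mean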

  19. Scientific, statistical, practical, and regulatory considerations in design space development.

    Science.gov (United States)

    Debevec, Veronika; Srčič, Stanko; Horvat, Matej

    2018-03-01

    The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.

  20. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    Energy Technology Data Exchange (ETDEWEB)

    Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-28

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm, and can be achieved by using conventional statistical classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.

  1. Experimental Design for Hanford Low-Activity Waste Glasses with High Waste Loading

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crum, Jarrod V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-24

    This report discusses the development of an experimental design for the initial phase of the Hanford low-activity waste (LAW) enhanced glass study. This report is based on a manuscript written for an applied statistics journal. Appendices A, B, and E include additional information relevant to the LAW enhanced glass experimental design that is not included in the journal manuscript. The glass composition experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC involving 15 LAW glass components. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this report. One of the glass components, SO3, has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture model expressed in the relative proportions of the 14 other components. The partial quadratic mixture model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This report describes how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study. A layered design consists of points on an outer layer, an inner layer, and a center point. There were 18 outer-layer glasses chosen using optimal experimental design software to augment 147 existing glass compositions that were within the LAW glass composition experimental region. Then 13 inner-layer glasses were chosen with the software to augment the existing and outer-layer glasses.
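    To make the three constraint types concrete, here is a minimal feasibility check of the kind such a design generator must apply to every candidate glass; the component bounds, linear constraint, and SO3 solubility function below are hypothetical stand-ins, not the report's fitted model.

        import numpy as np

        def feasible(x, lower, upper, A, b, so3_limit, so3_idx):
            # x: component mass fractions, summing to one.
            x = np.asarray(x, float)
            if not np.isclose(x.sum(), 1.0):
                return False
            if np.any(x < lower) or np.any(x > upper):
                return False                    # single-component constraints
            if np.any(A @ x > b):
                return False                    # linear multiple-component constraints
            return x[so3_idx] <= so3_limit(x)   # nonlinear MCC: SO3 solubility

        # Toy 3-component example (SiO2, Na2O, SO3) with made-up numbers
        lower = np.array([0.30, 0.00, 0.00])
        upper = np.array([0.70, 0.40, 0.03])
        A, b = np.array([[0.0, 1.0, 1.0]]), np.array([0.42])   # Na2O + SO3 <= 0.42
        so3_limit = lambda x: 0.01 + 0.02 * x[0]               # rises with SiO2
        print(feasible([0.60, 0.38, 0.02], lower, upper, A, b, so3_limit, so3_idx=2))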

  2. Concrete mixture characterization. Cementitious barriers partnership

    Energy Technology Data Exchange (ETDEWEB)

    Langton, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Protiere, Yannick [SIMCO Technologies, Inc., Quebec (Canada)

    2014-12-01

    This report summarizes the characterization study performed on two concrete mixtures used for radioactive waste storage. Both mixtures were prepared with approximately 425 kg of binder. The testing protocol mostly focused on determining the transport properties of the mixtures; the volume of permeable voids (porosity), diffusion coefficients, and water permeability were evaluated. Tests were performed after different curing durations. In order to obtain data on the statistical distribution of the transport properties, the measurements after 2 years of curing were performed on 10+ samples. Overall, both mixtures exhibited very low tortuosities and permeabilities, a direct consequence of their low water-to-binder ratio and the use of supplementary cementitious materials. The data generated on 2-year-old samples showed that porosity, tortuosity, and permeability follow a normal distribution. Chloride ponding tests were also performed on test samples. They showed limited chloride ingress, in line with the measured transport properties. These test results also showed that the two materials react differently with chloride, a consequence of the differences in the binder chemical compositions.

  3. Characterize Behaviour of Emerging Pollutants in Artificial Recharge: Column Experiments - Experiment Design and Results of Preliminary Tests

    Science.gov (United States)

    Wang, H.; Carrera, J.; Ayora, C.; Licha, T.

    2012-04-01

    Emerging pollutants (EPs) have been detected in water resources as a result of human activities in recent years. They include pharmaceuticals, personal care products, dioxins, flame retardants, etc. They are a source of concern because many of them are resistant to conventional water treatment and they are harmful to human health, even in low concentrations. Generally, this study aims to characterize the behaviour of emerging pollutants in reclaimed water in column experiments which simulate artificial recharge. One column set includes three parts: influent, a reactive layer column (RLC), and an aquifer column (AC). The main influent is the secondary effluent (SE) of El Prat Wastewater Treatment Plant, Barcelona. The flow rate of the column experiment is 0.9-1.5 mL/min. The residence time is designed to be about 1 day for the RLC and 30-40 days for the AC. Both columns are made of stainless steel. The reactive layer column (DI 10 cm * L 55 cm) is named after its filling material, which is a mixture of organic substrate, clay, and goethite. One purpose of the mixture is to increase dissolved organic carbon (DOC). Leaching tests in batches and columns have been done to select a proper organic substrate. As a result, compost was selected due to its long-lasting release of organic matter (OM). The other purpose of the mixture is to enhance adsorption of EPs. Partition coefficients (Kow) of EPs indicate their ability to adsorb to OM. EPs with log Kow > 2, like ibuprofen, bezafibrate, and diclofenac, could be adsorbed to OM. Moreover, some EPs are charged in solution at pH = 7, according to their acid dissociation constants (Ka). Positively charged EPs, for example atenolol, could adsorb to clay. Conversely, negatively charged EPs, for example gemfibrozil, could adsorb to goethite. The aquifer column (DI 35 cm * L 1.5 m) simulates the processes taking place in the aquifer during artificial recharge. The filling of the AC has two parts: silica sand and

  4. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. A study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension of the study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and the Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. These methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in the light of the information found in some publications. Use of adaptive designs with interim analyses is increasing.
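    The standard survival tools named here are available off the shelf; a minimal sketch with synthetic data follows, assuming the lifelines package (the column names and data are illustrative).

        import numpy as np
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        rng = np.random.default_rng(2)
        df = pd.DataFrame({
            'time': rng.exponential(10.0, 200),     # synthetic follow-up times
            'event': rng.integers(0, 2, 200),       # 1 = event, 0 = censored
            'treatment': rng.integers(0, 2, 200),   # binary covariate
        })

        km = KaplanMeierFitter().fit(df['time'], df['event'])
        print(km.median_survival_time_)

        cox = CoxPHFitter().fit(df, duration_col='time', event_col='event')
        cox.print_summary()                          # hazard ratio for 'treatment'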

  5. Excess Properties of Aqueous Mixtures of Methanol: Simple Models Versus Experiment

    Czech Academy of Sciences Publication Activity Database

    Vlček, Lukáš; Nezbeda, Ivo

    roč. 131-132, - (2007), s. 158-162 ISSN 0167-7322. [International Conference on Solution Chemistry /29./. Portorož, 21.08.2005-25.08.2005] R&D Projects: GA AV ČR(CZ) IAA4072303; GA AV ČR(CZ) 1ET400720409 Institutional research plan: CEZ:AV0Z40720504 Keywords : aqueous mixtures * primitive models * water-alcohol mixtures Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 0.982, year: 2007

  6. Taguchi Method for Development of Mass Flow Rate Correlation using Hydrocarbon Refrigerant Mixture in Capillary Tube

    Directory of Open Access Journals (Sweden)

    Shodiya Sulaimon

    2014-07-01

    Full Text Available The capillary tube is an important control device used in small vapor compression refrigeration systems such as window air-conditioners, household refrigerators and freezers. This paper develops a non-dimensional correlation based on the test results of the adiabatic capillary tube for the mass flow rate through the tube using a hydrocarbon refrigerant mixture of 89.3% propane and 10.7% butane (HCM. The Taguchi method, a statistical experimental design approach, was employed. This approach explores the economic benefit that lies in studies of this nature, where only a small number of experiments are required and yet valid results are obtained. Considering the effects of the capillary tube geometry and the inlet condition of the tube, dimensionless parameters were chosen. The new correlation was also based on the Buckingham Pi theorem. This correlation predicts 86.67% of the present experimental data within a relative deviation of -10% to +10%. The predictions by this correlation were also compared with results in published literature.
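    Correlations of this kind typically take the power-law form pi_0 = a * pi_1^b1 * pi_2^b2 ..., which becomes linear after taking logarithms. A minimal sketch with synthetic dimensionless groups (not the paper's data or exponents):

        import numpy as np

        def fit_power_law(pi_out, pi_in):
            # Least-squares fit of pi_out = a * prod_j pi_in[:, j]**b_j on logs.
            X = np.column_stack([np.ones(len(pi_out)), np.log(pi_in)])
            coef, *_ = np.linalg.lstsq(X, np.log(pi_out), rcond=None)
            return np.exp(coef[0]), coef[1:]      # prefactor a, exponents b_j

        rng = np.random.default_rng(0)
        pi_in = rng.uniform(1.0, 10.0, size=(50, 2))
        pi_out = 0.7 * pi_in[:, 0]**1.5 * pi_in[:, 1]**-0.3 \
                 * rng.lognormal(0.0, 0.05, 50)   # 5% scatter
        a, b = fit_power_law(pi_out, pi_in)
        print(a, b)                               # recovers ~0.7 and ~[1.5, -0.3]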

  7. Designing Technology for Active Spectator Experiences at Sporting Events

    DEFF Research Database (Denmark)

    Veerasawmy, Rune; Ludvigsen, Martin

    2010-01-01

    This paper explores the active spectator experience at sporting events, by presenting and reflecting upon a design experiment carried out at a number of football events. The initial hypothesis of the design process, leading to the design experiment, has been that the spectator experience is not merely an experience of receiving and consuming entertainment. It is also heavily reliant on the active participation of the spectator in creating the atmosphere of the entire event. The BannerBattle experiment provides interactive technology in sport arenas with a form of interaction based on existing...

  8. A primer of statistical methods for correlating parameters and properties of electrospun poly(l -lactide) scaffolds for tissue engineering-PART 1: Design of experiments

    KAUST Repository

    Seyedmahmoud, Rasoul

    2014-03-20

    Tissue engineering scaffolds produced by electrospinning are of enormous interest, but a true understanding of the fundamental connection between the outstanding functional properties, the architecture, the mechanical properties, and the process parameters is still lacking. Fragmentary results from several parametric studies only render some partial insights that are hard to compare and generally miss the role of parameter interactions. To bridge this gap, this article (Part 1 of 2) features a case study on poly-l-lactide scaffolds to demonstrate how statistical methods such as design of experiments can quantitatively identify the correlations existing between key scaffold properties and control parameters, in a systematic, consistent, and comprehensive manner, disentangling main effects from interactions. The morphological properties (i.e., fiber distribution and porosity) and mechanical properties (Young's modulus) are "charted" as a function of molecular weight (MW) and other electrospinning process parameters (the Xs), considering single effects as well as interactions between the Xs. For the first time, the major role of the MW emerges clearly in controlling all scaffold properties. The correlation between mechanical and morphological properties is also addressed.

  9. Design of experiments in production engineering

    CERN Document Server

    2016-01-01

    This book covers design of experiments (DoE) applied in production engineering as a combination of manufacturing technology with applied management science. It presents recent research advances and applications of designed experiments in production engineering, and the chapters cover metal cutting tools, soft computing for modelling and optimization of machining, and waterjet machining of high performance ceramics, among others.

  10. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  11. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  12. Thermal signature measurements for ammonium nitrate/fuel mixtures by laser heating

    International Nuclear Information System (INIS)

    Nazarian, Ashot; Presser, Cary

    2016-01-01

    Highlights: • LDTR is a useful diagnostic for characterizing AN/fuel mixture thermochemical behavior. • Each AN/fuel mixture thermal signature was different. • AN/fuel mixture signature features were defined by the individual constituents. • Baseline signatures changed after an experiment. - Abstract: Measurements were carried out to obtain thermal signatures of several ammonium nitrate/fuel (ANF) mixtures, using a laser-heating technique referred to as the laser-driven thermal reactor (LDTR). The mixtures were ammonium nitrate (AN)/kerosene, AN/ethylene glycol, AN/paraffin wax, AN/petroleum jelly, AN/confectioner's sugar, AN/cellulose (tissue paper), nitromethane/cellulose, nitrobenzene/cellulose, AN/cellulose/nitromethane, and AN/cellulose/nitrobenzene. These mixtures were also compared with AN/nitromethane and AN/diesel fuel oil, obtained from an earlier investigation. Thermograms for the mixtures, as well as the individual constituents, were compared to better understand how a sample's thermal signature changes with mixture composition. This is the first step in the development of a thermal-signature database, to be used along with other signature databases, to improve identification of energetic substances of unknown composition. The results indicated that each individual thermal signature was associated unambiguously with a particular mixture composition. The signature features of a particular mixture were shaped by the individual constituent signatures. It was also uncovered that the baseline signature was modified after an experiment due to coating of unreacted residue on the substrate surface and a change in the reactor sphere oxide layer. Thus, care was required to pre-oxidize the sphere prior to an experiment. A minimum sample mass (which was dependent on composition) was required to detect the signature characteristics. Increased laser power served to magnify signal strength while preserving the signature features. For the mixtures examined, the thermal

  13. Determining a Robust D-Optimal Design for Testing for Departure from Additivity in a Mixture of Four Perfluoroalkyl Acids.

    Science.gov (United States)

    Our objective is to determine an optimal experimental design for a mixture of perfluoroalkyl acids (PFAAs) that is robust to the assumption of additivity. PFAAs are widely used in consumer products and industrial applications. The presence and persistence of PFAAs, especially in ...

  14. Design of Experiments : An Overview

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2008-01-01

    Design Of Experiments (DOE) is needed for experiments with real-life systems, and with either deterministic or random simulation models. This contribution discusses the different types of DOE for these three domains, but focusses on random simulation. DOE may have two goals: sensitivity analysis

  15. Statistic analyses of the color experience according to the age of the observer.

    Science.gov (United States)

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

    The psychological experience of color is a real state of communication between the environment and color, and it depends on the light source, the viewing angle, and in particular on the observer and his health condition. Hering's theory, or the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to the three chromatic domains (red, green, and purple-blue) separately, but produce signals based on the principle of opposed pairs of colors. Support for this theory comes from the fact that certain disorders of color vision, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents a demonstration of the experience of blue and yellow tones according to the age of the observer. For testing the statistical significance of differences in color experience according to the color of the background, we use the following statistical tests: the Mann-Whitney U test, Kruskal-Wallis ANOVA, and the median test. It was shown that the differences are statistically significant for older observers (older than 35 years).
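    The rank-based tests named above are standard; a minimal sketch with made-up color-difference scores for three age groups (assuming SciPy):

        import numpy as np
        from scipy.stats import mannwhitneyu, kruskal, median_test

        rng = np.random.default_rng(1)
        # Hypothetical scores; the older group is shifted upward
        young, middle, older = (rng.normal(m, 1.0, 30) for m in (5.0, 5.2, 6.1))

        print(mannwhitneyu(young, older))        # pairwise two-sample rank test
        print(kruskal(young, middle, older))     # Kruskal-Wallis ANOVA by ranks
        stat, p, med, tbl = median_test(young, middle, older)
        print(p)                                 # Mood's median test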

  16. Kinetic-sound propagation in dilute gas mixtures

    International Nuclear Information System (INIS)

    Campa, A.; Cohen, E.G.D.

    1989-01-01

    Kinetic sound is predicted in dilute disparate-mass binary gas mixtures, propagating exclusively in the light component and much faster than ordinary sound. It should be detectable by light-scattering experiments, as an extended shoulder in the scattering cross section at large frequencies. As an example, H2-Ar mixtures are discussed.

  17. High-temperature hydrogen-air-steam detonation experiments in the BNL small-scale development apparatus

    International Nuclear Information System (INIS)

    Ciccarelli, G.; Ginsberg, T.; Boccio, J.; Economos, C.; Finfrock, C.; Gerlach, L.; Sato, K.

    1994-01-01

    The Small-Scale Development Apparatus (SSDA) was constructed to provide a preliminary set of experimental data to characterize the effect of temperature on the ability of hydrogen-air-steam mixtures to undergo detonations and, equally important, to support the design of the larger-scale High-Temperature Combustion Facility (HTCF) by providing a test bed for the solution of a number of high-temperature design and operational problems. The SSDA, the central element of which is a 10-cm inside-diameter, 6.1-m-long tubular test vessel designed to permit detonation experiments at temperatures up to 700 K, was employed to study self-sustained detonations in gaseous mixtures of hydrogen, air, and steam at temperatures between 300 K and 650 K at a fixed pressure of 0.1 MPa. Detonation cell size measurements provide clear evidence that the effect of hydrogen-air gas mixture temperature, in the range 300 K to 650 K, is to decrease cell size and, hence, to increase the sensitivity of the mixture to undergo detonations. The effect of steam content, at any given temperature, is to increase the cell size and, thereby, to decrease the sensitivity of stoichiometric hydrogen-air mixtures. The one-dimensional ZND model does a very good job of predicting the overall trends in the cell size data over the range of hydrogen-air-steam mixture compositions and temperatures studied in the experiments. Experiments were conducted to measure the rate of hydrogen oxidation in the absence of ignition sources at temperatures of 500 K and 650 K, for hydrogen-air mixtures of 15% and 50% hydrogen, and for a mixture of equimolar hydrogen-air and 30% steam at 650 K. The rate of hydrogen oxidation was found to be significant at 650 K. Reductions of hydrogen concentration by chemical reaction from 50 to 44% hydrogen, and from 15 to 11% hydrogen, were observed on a time frame of minutes. The DeSoete rate equation predicts the 50% experiment very well, but greatly underestimates the reaction rate of the lean mixtures.

  18. The Quantitative Resolution of a Mixture of Group II Metal Ions by Thermometric Titration with EDTA. An Analytical Chemistry Experiment.

    Science.gov (United States)

    Smith, Robert L.; Popham, Ronald E.

    1983-01-01

    Presents an experiment in thermometric titration used in an analytic chemistry-chemical instrumentation course, consisting of two titrations, one a mixture of calcium and magnesium, the other of calcium, magnesium, and barium ions. Provides equipment and solutions list/specifications, graphs, and discussion of results. (JM)

  19. Statistical evaluation of design-error related nuclear reactor accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1981-01-01

    In this paper, a general methodology for the statistical evaluation of design-error-related accidents is proposed that can be applied to a variety of systems that evolve during the development of large-scale technologies. The evaluation aims at an estimate of the combined 'residual' frequency of yet unknown types of accidents 'lurking' in a certain technological system. A special categorization into incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of U.S. nuclear power reactor technology, considering serious accidents (category 2 events) that involved, in the accident progression, a particular design inadequacy. 9 refs

  20. Bioconversions of Palm Kernel Cake and Rice Bran Mixtures by Trichoderma viride Toward Nutritional Contents

    OpenAIRE

    Yana Sukaryana; Umi Atmomarsono; Vitus D. Yunianto; Ejeng Supriyatna

    2010-01-01

    The objective of the research is to examine mixtures of palm kernel cake and rice bran fermented by Trichoderma viride. A completely randomized design in a 4 x 4 factorial pattern was used in this experiment. Factor I is the dose of inoculum: D1 = 0%, D2 = 0.1%, D3 = 0.2%, D4 = 0.3%; factor II is the mixture of palm kernel cake and rice bran: T1 = 20:80%, T2 = 40:60%, T3 = 60:40%, T4 = 80:20%. Each treatment had three replicates. Fermentation was conduc...

  1. Designing the user experience of game development tools

    CERN Document Server

    Lightbown, David

    2015-01-01

    Contents: The Big Green Button; My Story; Who Should Read this Book?; Companion Website and Twitter Account; Before we Begin; Welcome to Designing the User Experience of Game Development Tools; What Will We Learn in This Chapter?; What Is This Book About?; Defining User Experience; The Value of Improving the User Experience of Our Tools; Parallels Between User Experience and Game Design; How Do People Benefit From an Improved User Experience?; Finding the Right Balance; Wrapping Up; The User-Centered Design Process; What Will We...

  2. Experiment Design and Analysis Guide - Neutronics & Physics

    Energy Technology Data Exchange (ETDEWEB)

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  3. Statistical quality management using miniTAB 14

    International Nuclear Information System (INIS)

    An, Seong Jin

    2007-01-01

    This book explains statistical quality management, giving descriptions of the definition of quality, quality management, quality cost, basic methods of quality management, principles of control charts, control charts for variables, control charts for attributes, capability analysis, other issues of statistical process control, acceptance sampling, sampling for variable acceptance, design and analysis of experiments, Taguchi quality engineering, response surface methodology, and reliability analysis.

  4. The use of D-optimal mixture design in optimising okara soap formulation for stratum corneum application.

    Science.gov (United States)

    Borhan, Farrah Payyadhah; Abd Gani, Siti Salwa; Shamsuddin, Rosnah

    2014-01-01

    Okara, soybean waste from tofu and soymilk production, was utilised as a natural antioxidant in a soap formulation for stratum corneum application. A D-optimal mixture design was employed to investigate the influence of the main compositions of okara soap, containing different fatty acids and oils (virgin coconut oil A (24-28% w/w), olive oil B (15-20% w/w), palm oil C (6-10% w/w), castor oil D (15-20% w/w), cocoa butter E (6-10% w/w), and okara F (2-7% w/w)), prepared by a saponification process, on the hardness response of the soap. The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for okara soap hardness in terms of the six design factors considered in this study. Results revealed that the best mixture was the formulation that included 26.537% A, 19.999% B, 9.998% C, 16.241% D, 7.633% E, and 7.000% F. The results proved that differences in the levels of fatty acids and oils in the formulation significantly affect the hardness of the soap. Depending on the desirable level of those six variables, creation of okara-based soap with desirable properties better than those of commercial ones is possible.
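    A polynomial regression for a mixture response is usually fit in the Scheffé form, which drops the intercept because the proportions sum to one. A minimal sketch with simulated proportions and hardness values (not the study's data):

        import numpy as np
        from itertools import combinations

        def scheffe_terms(X):
            # Quadratic Scheffe mixture model terms: x_i and x_i * x_j (i < j).
            pairs = [X[:, [i]] * X[:, [j]]
                     for i, j in combinations(range(X.shape[1]), 2)]
            return np.hstack([X] + pairs)

        rng = np.random.default_rng(7)
        X = rng.dirichlet(np.ones(6), size=30)          # six proportions summing to 1
        y = X @ np.array([40, 55, 30, 45, 35, 25]) + rng.normal(0, 1, 30)
        beta, *_ = np.linalg.lstsq(scheffe_terms(X), y, rcond=None)
        print(beta[:6])                                 # blending coefficients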

  5. Structure properties of the 3He-4He mixture at T = 0 K

    International Nuclear Information System (INIS)

    Boronat, J.; Polls, A.; Fabrocini, A.

    1993-01-01

    The spatial structure properties of 3He-4He mixtures at T = 0 K are investigated using the hypernetted-chain formalism. The variational wave function used to describe the ground state of the mixture is a simple generalization of the trial wave functions for the pure phases and contains two- and three-body correlations. The elementary diagrams are taken into account by means of an extension of the scaling approximation to the mixtures. The two-body distribution functions g^(α,β)(r) and the structure functions S^(α,β)(k), together with the different spin-spin distribution functions for the 3He component in the mixture, are analyzed for several concentrations of 3He. Two sum rules, for the direct and the exchange part of g^(3,3)(r), are used to ascertain the importance of the full treatment of Fermi statistics in the calculation. The statistical correlations are found responsible for the main differences between the several components of the distribution function. Due to its low concentration, 3He behaves as a quasi-free Fermi gas, as far as the statistical correlations are concerned, although it is strongly correlated with the 4He atoms through the interatomic potential.

  6. DESIGN OF EXPERIMENTS IN TRUCK COMPANY

    Directory of Open Access Journals (Sweden)

    Bibiana Kaselyova

    2015-07-01

    Full Text Available Purpose: Design of experiments (DOE) represents a very powerful tool for process improvement, strongly supported by Six Sigma methodology. This approach is mostly used by large, manufacturing-oriented companies. The presented research is focused on the use of DOE in a truck company, which is medium-sized and service-oriented. Such a study has several purposes. Firstly, the detailed description of an improvement effort based on DOE can be used as a methodological framework for companies similar to the researched one. Secondly, it provides an example of a successfully implemented low-cost design of experiments practice. Moreover, the performed experiment identifies the key factors which influence the lifetime of truck tyres. Design/methodology: The research in this paper is based on an experiment conducted in a Slovakian truck company. It provides a detailed case study of the whole improvement effort, together with problem formulation, design creation and analysis, as well as the interpretation of results. The company wants to improve the lifetime of its truck tyres. Next to fuel consumption, the consumption of tyres and their replacement represents, according to the company, one of its most costly processes. The improvement effort was made through the use of the PDCA cycle. It starts with an analysis of the current state of tyre consumption. The variability of tyre consumption by year and type was investigated. Then the causes of tyre replacement were identified and a screening DOE was conducted. After the screening design, a full factorial design of experiments was used to identify the main drivers of tyre deterioration and breakdowns. Based on the results of the DOE, corrective actions were proposed and implemented. Findings: Based on the performed experiment, our research describes the process of tyre use and replacement. It defines the main reasons for tyre breakdown and identifies the main drivers which influence truck tyre lifetime. Moreover, it formulates corrective actions to prolong tyre lifetime. Originality: The study represents full

  7. Effect of microemulsions on transdermal delivery of citalopram: optimization studies using mixture design and response surface methodology

    Directory of Open Access Journals (Sweden)

    Huang CT

    2013-06-01

    Full Text Available Chi-Te Huang,1 Ming-Jun Tsai,2,3 Yu-Hsuan Lin,1 Yaw-Sya Fu,4 Yaw-Bin Huang,5 Yi-Hung Tsai,5 Pao-Chu Wu1 1School of Pharmacy, Kaohsiung Medical University, Kaohsiung City, 2Department of Neurology, China Medical University Hospital, Taichung, 3School of Medicine, Medical College, China Medical University, Taichung, 4Faculty of Biomedical Science and Environmental Biology, 5Graduate Institute of Clinical Pharmacy, Kaohsiung Medical University, Kaohsiung City, Taiwan, Republic of China Abstract: The aim of this study was to evaluate the potential of microemulsions as a drug vehicle for transdermal delivery of citalopram. A computerized statistical technique of response surface methodology with mixture design was used to investigate and optimize the influence of the formulation compositions, including a mixture of Brij 30/Brij 35 surfactants (at a ratio of 4:1; 20%–30%), isopropyl alcohol (20%–30%), and distilled water (40%–50%), on the properties of the drug-loaded microemulsions, including permeation rate (flux) and lag time. When microemulsions were used as a vehicle, the drug permeation rate increased significantly and the lag time shortened significantly when compared with the aqueous control of 40% isopropyl alcohol solution containing 3% citalopram, demonstrating that microemulsions are a promising vehicle for transdermal application. With regard to the pharmacokinetic parameters of citalopram, the flux required for the transdermal delivery system was about 1280 µg per hour. The microemulsions loaded with 3% and 10% citalopram showed respective flux rates of 179.6 µg/cm2 and 513.8 µg/cm2 per hour, indicating that the study formulation could provide effective therapeutic concentrations over a practical application area. The animal study showed that the optimized formulation (F15) containing 3% citalopram with an application area of 3.46 cm2 is able to reach a minimum effective therapeutic concentration with no erythematous reaction

  8. Estimating the number of sources in a noisy convolutive mixture using BIC

    DEFF Research Database (Denmark)

    Olsson, Rasmus Kongsgaard; Hansen, Lars Kai

    2004-01-01

    The number of source signals in a noisy convolutive mixture is determined based on the exact log-likelihoods of the candidate models. In (Olsson and Hansen, 2004), a novel probabilistic blind source separator was introduced that is based solely on the time-varying second-order statistics of the sources...
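    The model-order selection itself reduces to comparing penalized log-likelihoods. A minimal sketch of the BIC comparison, with made-up likelihood values and parameter counts:

        import numpy as np

        def bic(log_likelihood, n_params, n_obs):
            # Bayesian information criterion; lower is better.
            return -2.0 * log_likelihood + n_params * np.log(n_obs)

        loglik = {1: -5120.3, 2: -4890.7, 3: -4885.9, 4: -4884.2}  # hypothetical
        params = {k: 10 * k for k in loglik}    # illustrative parameter counts
        n_obs = 20000
        scores = {k: bic(loglik[k], params[k], n_obs) for k in loglik}
        print(min(scores, key=scores.get))      # estimated number of sources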

  9. Taguchi Method for Development of Mass Flow Rate Correlation Using Hydrocarbon Refrigerant Mixture in Capillary Tube

    OpenAIRE

    Sulaimon, Shodiya; Nasution, Henry; Aziz, Azhar Abdul; Abdul-Rahman, Abdul-Halim; Darus, Amer N

    2014-01-01

    The capillary tube is an important control device used in small vapor compression refrigeration systems such as window air-conditioners, household refrigerators and freezers. This paper develops a non-dimensional correlation based on the test results of the adiabatic capillary tube for the mass flow rate through the tube using a hydrocarbon refrigerant mixture of 89.3% propane and 10.7% butane (HCM). The Taguchi method, a statistical experimental design approach, was employed. This approach e...

  10. Phase Equilibrium Calculations for Multi-Component Polar Fluid Mixtures with tPC-PSAFT

    DEFF Research Database (Denmark)

    Karakatsani, Eirini; Economou, Ioannis

    2007-01-01

    The truncated Perturbed-Chain Polar Statistical Associating Fluid Theory (tPC-PSAFT) is applied to a number of different mixtures, including binary, ternary and quaternary mixtures of components that differ substantially in terms of intermolecular interactions and molecular size. In contrast to m...

  11. Mixtures of skewed Kalman filters

    KAUST Repository

    Kim, Hyoungmoon

    2014-01-01

    Normal state-space models are prevalent, but to increase the applicability of the Kalman filter, we propose mixtures of skewed, and extended skewed, Kalman filters. To do so, the closed skew-normal distribution is extended to a scale mixture class of closed skew-normal distributions. Some basic properties are derived and a class of closed skew-t distributions is obtained. Our suggested family of distributions is skewed and has heavy tails too, so it is appropriate for robust analysis. Our proposed special sequential Monte Carlo methods use a random mixture of the closed skew-normal distributions to approximate a target distribution. Hence it is possible to handle skewed and heavy-tailed data simultaneously. These methods are illustrated with numerical experiments. © 2013 Elsevier Inc.

  12. Chemometrics as a tool to analyse complex chemical mixtures

    DEFF Research Database (Denmark)

    Christensen, J. H.

    Chemical characterisation of contaminant mixtures is important for environmental forensics and risk assessment. The great challenge in future research lies in developing suitable, rapid, reliable and objective methods for analysis of the composition of complex chemical mixtures. This thesis describes the development of such methods for assessing the identity (chemical fingerprinting) and fate (e.g. biodegradation) of petroleum hydrocarbon mixtures. The methods comply with the general concept that suitable methods must be rapid and inexpensive, objective with limited human intervention, and at the same time must consider a substantial fraction of compounds in the complex mixture. A combination of a) limited sample preparation, b) rapid chemical screening analysis, c) fast and semi-automatic pre-processing, d) comprehensive multivariate statistical data analysis and e) objective data evaluation...

  13. Obtaining mathematical models for assessing efficiency of dust collectors using integrated system of analysis and data management STATISTICA Design of Experiments

    Science.gov (United States)

    Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.

    2018-05-01

    The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated system of analysis and data management STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing the data is illustrated by laboratory studies on a mounted installation containing a dust collector in counter-swirling flows (CSF), using gypsum dust of various fractions. Experimental studies were planned in order to reduce the number of experiments and the cost of experimental research. A second-order Box-Behnken design was used, which reduced the number of trials from 81 to 27. The statistical analysis of the Box-Behnken design data using the standard tools of the integrated system for analysis and data management STATISTICA Design of Experiments is described. Results of the statistical data processing, with estimation of the significance of the coefficients and of the adequacy of the mathematical models, are presented.
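    The trial count quoted above is consistent with a four-factor Box-Behnken design (24 edge-midpoint runs plus 3 center replicates). A minimal construction in coded units, independent of any DOE package:

        import numpy as np
        from itertools import combinations, product

        def box_behnken(k, n_center=3):
            # For every pair of factors, run all +/-1 combinations with the
            # remaining factors held at 0; then append center-point replicates.
            runs = []
            for i, j in combinations(range(k), 2):
                for a, b in product((-1.0, 1.0), repeat=2):
                    row = np.zeros(k)
                    row[i], row[j] = a, b
                    runs.append(row)
            runs += [np.zeros(k)] * n_center
            return np.array(runs)

        design = box_behnken(4)     # 6 pairs * 4 runs + 3 centers = 27 trials
        print(design.shape)         # (27, 4), versus 81 for a full 3**4 factorial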

  14. Determination of thermal conductivity in foundry mould mixtures

    Directory of Open Access Journals (Sweden)

    G. Solenički

    2010-01-01

    Full Text Available For a thorough understanding of the behaviour of foundry mould mixtures, a good knowledge of the thermal properties of mould materials is needed. Laboratory determination of the thermal conductivity of mould mixtures enables better control over scabbing defects, which are a major problem in green sand mould mixtures. A special instrument has been designed for that purpose and is described in this work.

  15. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
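    In practice, the a priori calculation the author describes is a one-liner; a sketch for a two-sample t-test, assuming the statsmodels package and conventional inputs:

        from statsmodels.stats.power import TTestIndPower

        # Medium standardized effect (d = 0.5), two-sided alpha = 0.05,
        # target power = 0.80 -> required sample size per group
        n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                                  power=0.8,
                                                  alternative='two-sided')
        print(round(n_per_group))   # roughly 64 per group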

  16. Self-sealing barriers of sand/bentonite-mixtures in a clay repository. SB-experiment in the Mont Terri Rock Laboratory. Final report

    International Nuclear Information System (INIS)

    Rothfuchs, Tilmann; Czaikowski, Oliver; Hartwig, Lothar; Hellwald, Karsten; Komischke, Michael; Miehe, Ruediger; Zhang, Chun-Liang

    2012-10-01

    Several years ago, GRS performed laboratory investigations on the suitability of clay/mineral mixtures as optimized sealing materials in underground repositories for radioactive wastes /JOC 00/ /MIE 03/. The investigations yielded promising results, so plans were developed for testing the sealing properties of these materials under representative in-situ conditions in the Mont Terri Rock Laboratory (MTRL). The project was proposed to the 'Projekttraeger Wassertechnologie und Entsorgung (PtWT+E)' and finally launched in January 2003 under the name SB project ('Self-sealing Barriers of Clay/Mineral Mixtures in a Clay Repository'). The project was divided into two parts: a pre-project running from January 2003 until June 2004 under contract No. 02E9713 /ROT 04/, and the main project running from January 2004 until June 2012 under contract No. 02E9894, originally with PtWT+E, later renamed PTKA-WTE. In the course of the pre-project it was decided to incorporate the SB main project, as a cost-shared action of PtWT+E and the European Commission (contract No. FI6W-CT-2004-508851), into the EC Integrated Project ESDRED (Engineering Studies and Demonstrations of Repository Designs), performed by 11 European project partners within the 6th European framework programme. The ESDRED project was terminated prior to the termination of the SB project. Interim results were reported by mid-2009 in two ESDRED reports /DEB 09/ /SEI 09/. This report presents the results achieved in the whole SB project, comprising the preceding laboratory investigations for the final selection of suitable material mixtures, the conduct of mock-up tests in the geotechnical laboratory of GRS in Braunschweig, and the execution of in-situ experiments at the MTRL.

  17. Viscosities of corium-concrete mixtures

    International Nuclear Information System (INIS)

    Seiler, J.M.; Ganzhorn, J.

    1997-01-01

    Severe accidents in nuclear reactors involve many situations such as pools of molten core material, melt spreading, melt/concrete interactions, etc. The word 'corium' designates mixtures of materials issued from the molten core at high temperature; these mixtures involve mainly UO2, ZrO2, Zr and, in small amounts, Ni, Cr, Ag, In, Cd. These materials, when flowing out of the reactor vessel, may interact with the concrete of the reactor building, thus introducing decomposition products of concrete into the original mixture. These decomposition products are mainly SiO2, FeO, MgO, CaO and Al2O3, in different amounts depending on the nature of the concrete being considered. Siliceous concrete is rich in SiO2; limestone concrete contains both SiO2 and CaO. Liquidus temperatures of such mixtures are generally above 2300 K, whereas solidus temperatures are about 1400 K. (orig.)

  18. A RANS knock model to predict the statistical occurrence of engine knock

    International Nuclear Information System (INIS)

    D'Adamo, Alessandro; Breda, Sebastiano; Fontanesi, Stefano; Irimescu, Adrian; Merola, Simona Silvia; Tornatore, Cinzia

    2017-01-01

    Highlights: • Development of a new RANS model for SI engine knock probability. • Turbulence-derived transport equations for the variances of mixture fraction and enthalpy. • Gasoline autoignition delay times calculated from detailed chemical kinetics. • Knock probability validated against experiments on an optically accessible GDI unit. • PDF-based knock model accounting for the random nature of SI engine knock in RANS simulations. - Abstract: In the recent past, engine knock has emerged as one of the main limiting factors in the achievement of higher efficiency targets in modern spark-ignition (SI) engines. To attain these targets, engine operating points must be moved as close as possible to the onset of abnormal combustion, although the turbulent nature of the flow field and of SI combustion leads to possibly ample fluctuations between consecutive engine cycles. This forces engine designers to back the target condition away from its theoretical optimum in order to prevent abnormal combustion, since a few individual heavy-knocking cycles can damage engine components. A statistically based RANS knock model is presented in this study; its aim is the prediction not only of the ensemble-average knock occurrence, which is of limited meaning for such a stochastic event, but also of a knock probability. The model is based on look-up tables of autoignition times from detailed chemistry, coupled with transport equations for the variances of mixture fraction and enthalpy. The transported perturbations around the ensemble-average value are based on variable gradients and on a local turbulent time scale. A multi-variate cell-based Gaussian-PDF model is proposed for the unburnt mixture, resulting in a statistical distribution for the in-cell reaction rate. An average knock precursor and its variance are independently calculated and transported; this results in the prediction of an earliest knock probability preceding the ensemble-average knock onset, as confirmed by

  19. A newly designed multichannel scaling system: Validated by Feynman-α experiment in EHWZPR

    Energy Technology Data Exchange (ETDEWEB)

    Arkani, Mohammad, E-mail: markani@aeoi.org.ir; Mataji-Kojouri, Naimeddin

    2016-08-15

    Highlights: • An embedded measuring system with enhanced operational capabilities is introduced. • The design is low-cost and reprogrammable. • The system design is dedicated to multi-detector experiments with very large data collections. • A Feynman-α experiment free of count-loss effects is performed in EHWZPR. • The results are compared with an endogenous/inherent pulsed neutron source experiment. - Abstract: In this work, an embedded multi-input, multi-million-channel MCS of a new design is constructed for multi-detector experimental research applications. Important characteristics of the system can be tuned to the experimental case at hand, exploiting the reprogrammable nature of the silicon. By differentiating the integrated counts registered in memory, the system acts as a measuring tool with zero channel-advance time, ideal for experiments on time-correlated random processes. Using this equipment, a Feynman-α experiment was performed in the Esfahan Heavy Water Zero Power Reactor (EHWZPR) utilizing three different in-core neutron detectors. One million channels of data were collected by the system with a 5 ms gate time from each neutron detector simultaneously. As heavy water moderated reactors are significantly slow systems, a huge number of data channels must be collected. Then, by making use of the bunching method, the data were analyzed and the prompt neutron decay constant of the system was estimated for each neutron detector positioned in the core. The results were compared with the information provided by an endogenous pulsed neutron source experiment, and good agreement is seen within the statistical uncertainties of the results. This equipment makes further in-depth research possible in a range of stochastic experiments in nuclear physics, such as cross-correlation analysis of multi-detector experiments.
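    The bunching analysis mentioned above can be sketched in a few lines: gates are merged into progressively larger synthetic gate widths and the Feynman Y statistic (variance-to-mean ratio minus one) is computed for each width. The demonstration below uses uncorrelated Poisson counts, for which Y should stay near zero; fitting the alpha value to a reactor's Y(T) curve is a separate step not shown.

        import numpy as np

        def feynman_y(counts, max_bunch):
            # counts: detector counts in consecutive equal gates (e.g. 5 ms).
            y = []
            for m in range(1, max_bunch + 1):
                n = (len(counts) // m) * m
                bunched = counts[:n].reshape(-1, m).sum(axis=1)
                y.append(bunched.var(ddof=1) / bunched.mean() - 1.0)
            return np.array(y)

        rng = np.random.default_rng(3)
        counts = rng.poisson(12.0, size=1_000_000)   # uncorrelated source
        print(feynman_y(counts, 5))                  # all entries near zero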

  20. A Study on The Mixture of Exponentiated-Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Adel Tawfik Elshahat

    2016-12-01

    Full Text Available Mixtures of measures or distributions occur frequently in the theory and applications of probability and statistics. In the simplest case it may, for example, be reasonable to assume that one is dealing with the mixture, in given proportions, of a finite number of normal populations with different means or variances. The mixture parameter may also be denumerably infinite, as in the theory of sums of a random number of random variables, or continuous, as in the compound Poisson distribution. The use of finite mixture distributions, to control for unobserved heterogeneity, has become increasingly popular among those estimating dynamic discrete choice models. One of the barriers to using mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. In this thesis, the maximum likelihood estimators have been obtained for the parameters of the mixture of exponentiated Weibull distributions when the sample is available from a censoring scheme. The asymptotic variance-covariance matrix of the estimators has also been obtained. A numerical illustration of these new results is given.
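    A joint maximum-likelihood fit of such a mixture can be sketched directly with SciPy's exponentiated-Weibull density; this toy version assumes unit scales and a complete (uncensored) sample, whereas the thesis treats censored data:

        import numpy as np
        from scipy.stats import exponweib
        from scipy.optimize import minimize

        def neg_loglik(theta, x):
            # theta = (logit of weight, log a1, log c1, log a2, log c2)
            w = 1.0 / (1.0 + np.exp(-theta[0]))
            a1, c1, a2, c2 = np.exp(theta[1:])
            pdf = w * exponweib.pdf(x, a1, c1) + (1 - w) * exponweib.pdf(x, a2, c2)
            return -np.sum(np.log(pdf + 1e-300))

        rng = np.random.default_rng(5)
        x = np.concatenate([exponweib.rvs(2.0, 1.5, size=300, random_state=rng),
                            exponweib.rvs(0.8, 3.0, size=200, random_state=rng)])
        res = minimize(neg_loglik, x0=np.zeros(5), args=(x,), method='Nelder-Mead')
        print(res.x)    # fitted weight (logit) and log shape parameters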

  1. MECHANICAL BEHAVIOR OF COLD BITUMINOUS MIXTURE UNDER EFFECTS OF STATIC AND REPEATED LOADS

    OpenAIRE

    Tamyres Karla da Silva; Carlos Alexandre Braz de Carvalho; Geraldo Luciano de Oliveira Marques; Dario Cardoso de Lima; Taciano Oliveira da Silva; Carlos Cardoso Machado

    2017-01-01

    Abstract This paper presents the results of an experimental research aimed at analyzing the mechanical behavior of a cold bituminous mixture under effects of static and repeated loads. Initially, a Marshall mixture design was performed to determine the mixture design contents according to standard DNER (1994a). After obtaining the mixture design contents, nine bituminous specimens were molded and subjected to the following tests: resilient modulus, tensile strength by diametral compression, a...

  2. Combustible gas production (methane) and biodegradation of solid and liquid mixtures of meat industry wastes

    Energy Technology Data Exchange (ETDEWEB)

    Marcos, A.; Al-Kassir, A.; Cuadros, F.; Lopez-Rodriguez, F. [School of Engineering, University of Extremadura, Avda. De Elva, s/n, 06071, Badajoz (Spain); Mohamad, A.A. [Department of Mechanical and Manufacturing Engineering, University of Calgary, 2500 University Dr. N.W., Calgary, Alberta (Canada)

    2010-05-15

    This work is devoted to determining the optimal operational conditions for methane production and biodegradation in the anaerobic co-digestion of solid (fat, intestines, rumen, bowels, whiskers, etc.) and liquid (blood, washing water, manure, etc.) wastes of the meat industry, particularly those arising from the municipal slaughterhouse of Badajoz (Spain). The experiments were performed using a 2 l capacity discontinuous digester at 38 °C. The loading rates were 0.5, 1, 2, 3, and 4.5 g COD for wastewater (washing water and blood; Mixture 1), and 0.5, 1, 2, 3, and 4 g COD for the co-digestion of a mixture of 97% liquid effluent and 3% solid wastes v/v (Mixture 2), which represents the annual mean composition of the waste generated by the slaughterhouse. The maximal biodegradation rates obtained were 56.9% for Mixture 1 at a COD load of 1 g, and 19.1% for Mixture 2 at a COD load of 2 g. For both mixtures, the greatest methane production occurred at the maximum COD load (4.5 g for Mixture 1, and 4 g for Mixture 2), at which the amounts of methane obtained during and at the end of the co-digestion were practically indistinguishable between the two mixtures. The results will be used to design, construct, and establish the optimal operating conditions of a continuous complete-mixture biodigester. (author)

  3. Modeling the [NTf2] pyridinium ionic liquids family and their mixtures with the soft statistical associating fluid theory equation of state.

    Science.gov (United States)

    Oliveira, M B; Llovell, F; Coutinho, J A P; Vega, L F

    2012-08-02

    In this work, the soft statistical associating fluid theory (soft-SAFT) equation of state (EoS) has been used to provide an accurate thermodynamic characterization of the pyridinium-based family of ionic liquids (ILs) with the bis(trifluoromethylsulfonyl)imide anion [NTf2]−. On the basis of recent molecular simulation studies for this family, a simple molecular model was proposed within the soft-SAFT EoS framework. The chain length value was transferred from the equivalent imidazolium-based ILs family, while the dispersive energy and the molecular parameters describing the cation-anion interactions were set to constant values for all of the compounds. With these assumptions, an appropriate set of molecular parameters was found for each compound by fitting to experimental temperature-density data at atmospheric pressure. Correlations of the nonconstant parameters (describing the volume of the IL) with the molecular weight were established, allowing the prediction of the parameters for other pyridiniums not included in the fitting. Then, the suitability of the proposed model and its optimized parameters was tested by predicting high-pressure densities and second-order thermodynamic derivative properties, such as isothermal compressibilities, of selected [NTf2] pyridinium ILs over a large range of thermodynamic conditions. The surface tension was also provided using the density gradient theory coupled to the soft-SAFT equation. Finally, the soft-SAFT EoS was applied to describe the phase behavior of several binary mixtures of [NTf2] pyridinium ILs with carbon dioxide, sulfur dioxide, and water. In all cases, a temperature-independent binary parameter was enough to reach quantitative agreement with the experimental data. The description of the solubility of CO2 in these ILs also allowed identification of a relation between the binary parameter and the molecular weight of the ionic liquid, allowing the prediction of the CO2 + C12py[NTf2] mixture. The good

  4. Characterizing oxidative flow reactor SOA production and OH radical exposure from laboratory experiments of complex mixtures (engine exhaust) and simple precursors (monoterpenes)

    Science.gov (United States)

    Michael Link, M. L.; Friedman, B.; Ortega, J. V.; Son, J.; Kim, J.; Park, G.; Park, T.; Kim, K.; Lee, T.; Farmer, D.

    2016-12-01

    Recent commercialization of the Oxidative Flow Reactor (OFR, occasionally described in the literature as a "Potential Aerosol Mass" reactor) has created the opportunity for many researchers to explore the mechanisms behind OH-driven aerosol formation on a wide range of oxidative timescales (hours to weeks) in both laboratory and field measurements, spanning simple (i.e., single-component) and complex (multi-component) precursors. Standard practices for performing OFR experiments, and for interpreting the resulting data, are still being developed. Measuring the gas- and particle-phase chemistry of oxidation products generated in the OFR in laboratory studies of single precursors, and measuring SOA from vehicle emissions on short atmospheric timescales, represent two very different experiments in which careful experimental design is essential for exploring reaction mechanisms and SOA yields. Two parameters essential to experimental design are (1) the role of seed aerosol in controlling gas-particle partitioning and SOA yields, and (2) the accurate determination of OH exposure during any one experiment. We investigated the role of seed aerosol surface area in controlling the observed SOA yields and gas/particle composition from the OH-initiated oxidation of four monoterpenes using an aerosol chemical ionization time-of-flight mass spectrometer and a scanning mobility particle sizer. While the OH exposure during laboratory experiments is simple to constrain, complex mixtures such as diesel exhaust have high estimated OH reactivity values and thus require careful consideration. We developed methods for constraining OH radical exposure in the OFR during vehicle exhaust oxidation experiments. We observe changes in O/C ratios and highly functionalized species over the temperature gradient employed in the aerosol-CIMS measurement. We relate this observed, speciated chemistry to the
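
    A common way to report OFR conditions is the integrated OH exposure and its equivalent atmospheric aging time. The arithmetic below is a hedged back-of-envelope sketch; the reactor OH concentration, residence time, and the assumed ambient mean OH of 1.5e6 molecules cm-3 are illustrative values, not those of this study.

```python
# Hypothetical numbers illustrating the OH-exposure bookkeeping for an OFR.
AVG_AMBIENT_OH = 1.5e6        # molecules cm^-3, a commonly assumed mean
oh_in_reactor = 5.0e9         # molecules cm^-3, example OFR level
residence_time_s = 120.0      # seconds spent in the reactor

oh_exposure = oh_in_reactor * residence_time_s         # molecules s cm^-3
equiv_days = oh_exposure / (AVG_AMBIENT_OH * 86400.0)  # equivalent aging
print(f"OH exposure {oh_exposure:.2e} -> ~{equiv_days:.1f} equivalent days")
# High external OH reactivity (e.g. diesel exhaust) suppresses in-reactor
# OH, so the true exposure must be corrected or measured with a tracer.
```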

  5. Biochemically Investigation of the Effects of Nettle Seed Herbal Mixture on Alcohol Damaged Liver

    Directory of Open Access Journals (Sweden)

    A. ÇELİK

    2014-06-01

    Full Text Available This study experimentally investigated how protective the Nettle Seed Herbal Mixture is against ethanol, which causes oxidative stress in rats and toxic effects in the liver with chronic use. Twenty 4-month-old Wistar rats were used in the study. All rats were fed normal pellet rodent chow during the experiment. The rats were divided into four equal groups and the treatments were administered orally, in drinking form, for 10 weeks. The first group was the control group. The second group was the alcohol group and was given 30% ethanol in order to induce chronic alcoholism. The third group was the alcohol + Nettle Seed Herbal Mixture group, and the rats in this group were given a liquid consisting of 30% ethanol plus the Nettle Seed Herbal Mixture extract. The fourth group was the Nettle Seed Herbal Mixture extract group, and the rats in this group were given the Nettle Seed Herbal Mixture extract. At the end of the ten weeks, within the first 24 hours, blood samples were obtained from the animals under anesthesia using appropriate techniques. Serum ALT and AST values of the blood samples were determined by enzymatic methods on a Roche Cobas 6000 analyzer. ALT and AST enzyme values were evaluated and statistical analysis was performed with the SPSS program. No significant difference was found between the four groups at the end of the analysis, as the p value was greater than 0.005.

  6. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes; they are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. In the present paper, maximum likelihood estimation is therefore used to fit a finite mixture model in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood in order to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between the rubber price and the stock market price for Malaysia, Thailand, the Philippines and Indonesia.
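
    A two-component normal mixture of the kind described above is usually fitted with the EM algorithm. The sketch below is a minimal, self-contained illustration on synthetic data (the economic series themselves are not reproduced here).

```python
# Minimal EM fit of a two-component univariate normal mixture (synthetic
# data standing in for the economic series analyzed in the paper).
import numpy as np

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_two_normal(x, n_iter=200):
    w, mu, var = 0.5, np.array([x.min(), x.max()]), np.array([x.var()] * 2)
    for _ in range(n_iter):
        # E-step: responsibility of component 0 for each observation
        d0 = w * normal_pdf(x, mu[0], var[0])
        d1 = (1 - w) * normal_pdf(x, mu[1], var[1])
        r = d0 / (d0 + d1)
        # M-step: weighted updates of weight, means and variances
        w = r.mean()
        mu = np.array([np.average(x, weights=r),
                       np.average(x, weights=1 - r)])
        var = np.array([np.average((x - mu[0]) ** 2, weights=r),
                        np.average((x - mu[1]) ** 2, weights=1 - r)])
    return w, mu, var

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-1.0, 0.5, 400), rng.normal(1.5, 0.8, 600)])
print(em_two_normal(x))
```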

  7. Evolutionary experience design – the case of Otopia

    DEFF Research Database (Denmark)

    Hansen, Kenneth

    The design of experiences is a complicated challenge. It might not even be possible to design such a “thing”, but only to design for it. If this is the case, an evolutionary approach could seem appropriate. This paper introduces such an approach to the design of new public-oriented experiences with the case of “Otopia”. “Otopia” is a large-scale new media experiment which combines the areas of computer games, sports and performance into a spectator-oriented concept; it was premiered in a dome tent at the Roskilde Festival in Denmark in the summer of 2005. This paper presents and discusses ... used as a means of specifying the basic immaterial design form. This discussion leads to the suggestion of a rule-based evolutionary model for the design of situations as a practical option for designers of new spectator-oriented experiences in the future. The project of Otopia was supported

  8. Design of experiments and data analysis challenges in calibration for forensics applications

    International Nuclear Information System (INIS)

    Anderson-Cook, Christine M.; Burr, Thomas L.; Hamada, Michael S.; Ruggiero, Christy E.; Thomas, Edward V.

    2015-01-01

    Forensic science aims to infer characteristics of source terms using measured observables. Our focus is on statistical design of experiments and the data analysis challenges arising in nuclear forensics. More specifically, we focus on inferring aspects of experimental conditions (of a process to produce product Pu oxide powder), such as temperature, nitric acid concentration, and Pu concentration, using measured features of the product Pu oxide powder. The measured features, Y, include trace chemical concentrations and particle morphology, such as the particle size and shape of the produced Pu oxide powder particles. Making inferences about the nature of the inputs X that were used to create nuclear materials having particular characteristics, Y, is an inverse problem. Therefore, statistical analysis can be used to identify the best set (or sets) of Xs for a new set of observed responses Y. One can fit a model (or models) such as Y = f(X) + error for each of the responses, based on a calibration experiment, and "invert" to solve for the best set of Xs for a new set of Ys. This perspectives paper uses archived experimental data to consider aspects of data collection and experiment design for the calibration data in order to maximize the quality of the predicted Ys in the forward models; that is, we assume that well-estimated forward models are effective in the inverse problem. In addition, we consider how to identify a best solution for the inferred X, and we evaluate the quality of the result and its robustness to a variety of initial assumptions and different correlation structures between the responses. We also briefly review recent advances in metrology issues related to characterizing the particle morphology measurements used in the response vector, Y.
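
    The forward-then-invert workflow described above can be illustrated in a few lines: fit linear forward models Y ≈ f(X) on calibration data, then recover the X best matching a new response vector by least squares. Everything below (dimensions, ranges, noise level) is hypothetical.

```python
# Hedged sketch of calibration followed by inversion (hypothetical data).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
# Calibration inputs X: e.g. temperature, HNO3 conc., Pu conc. (40 runs)
X = rng.uniform([200.0, 2.0, 10.0], [400.0, 8.0, 60.0], size=(40, 3))
B_true = rng.normal(size=(4, 4))                   # stand-in "true" physics
design = np.hstack([np.ones((40, 1)), X])
Y = design @ B_true + 0.05 * rng.normal(size=(40, 4))

B, *_ = np.linalg.lstsq(design, Y, rcond=None)     # forward models Y ~ [1,X]B

def misfit(x, y_new):
    return np.sum((np.concatenate([[1.0], x]) @ B - y_new) ** 2)

y_new = np.concatenate([[1.0], [300.0, 5.0, 30.0]]) @ B_true  # new sample
best = minimize(misfit, x0=[350.0, 4.0, 40.0], args=(y_new,))
print(best.x)   # inferred process conditions for the new observation
```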

  9. Content, Affective, and Behavioral Challenges to Learning: Students' Experiences Learning Statistics

    Science.gov (United States)

    McGrath, April L.

    2014-01-01

    This study examined the experiences of and challenges faced by students when completing a statistics course. As part of the requirement for this course, students completed a learning check-in, which consisted of an individual meeting with the instructor to discuss questions and the completion of a learning reflection and study plan. Forty…

  10. Statistical distribution of quantum particles

    Indian Academy of Sciences (India)

    S B Khasare

    2018-02-08

    In this work, the statistical distribution functions for bosons, fermions and their mixtures have been ... index is greater than unity, then it is easy in the present approach to ... ability W. Section 3 gives the derivation and graphical.

  11. Storytelling tools in support of user experience design

    NARCIS (Netherlands)

    Peng, Qiong

    2017-01-01

    Storytelling has been proposed as an intuitive way to support communication in user experience design. With story-based thinking, designers can gain a better understanding of the potential user experience, developing and discussing design ideas within an (imagined) context. This proposal introduces

  12. Experiment design for identification of structured linear systems

    NARCIS (Netherlands)

    Potters, M.G.

    2016-01-01

    Experiment Design for system identification involves the design of an optimal input signal with the purpose of accurately estimating unknown parameters in a system. Specifically, in the Least-Costly Experiment Design (LCED) framework, the optimal input signal results from an optimisation problem in

  13. Design aspects of low activation fusion ignition experiments

    International Nuclear Information System (INIS)

    Cheng, E.T.; Creedon, R.L.; Hopkins, G.R.; Trester, P.W.; Wong, C.P.C.; Schultz, K.R.

    1986-01-01

    Preliminary design studies have been done exploring (1) materials selection, (2) shutdown biological dose rates, (3) mechanical design and (4) thermal design of a fusion ignition experiment made of low activation materials. From the results of these preliminary design studies it appears that an ignition experiment could be built of low activation materials, and that this design would allow hands-on access for maintenance

  14. Toxicity of binary mixtures of metals and pyrethroid insecticides to Daphnia magna Straus. Implications for multi-substance risks assessment

    Energy Technology Data Exchange (ETDEWEB)

    Barata, Carlos [Laboratory of Environmental Toxicology, Universitat Politècnica de Catalunya, CN 150 Km 14.5, Terrassa 08220 (Spain)]. E-mail: barata@intexter.upc.edu; Baird, D.J. [National Water Research Institute (Environment Canada) at Canadian Rivers Institute, 10 Bailey Drive, PO Box 45111, University of New Brunswick, Fredericton E3B 6E1, New Brunswick (Canada); Nogueira, A.J.A. [Departamento de Biologia, Universidade de Aveiro, 3810-193 Aveiro (Portugal); Soares, A.M.V.M. [Departamento de Biologia, Universidade de Aveiro, 3810-193 Aveiro (Portugal); Riva, M.C. [Laboratory of Environmental Toxicology, Universitat Politècnica de Catalunya, CN 150 Km 14.5, Terrassa 08220 (Spain)

    2006-06-10

    Two different concepts, termed concentration addition (CA) and independent action (IA), describe general relationships between the effects of single substances and their corresponding mixtures, allowing calculation of an expected mixture toxicity on the basis of the known toxicities of the mixture components. Both concepts are limited to cases in which all substances in a mixture influence the same experimental endpoint, and they are usually tested against a 'fixed ratio design' in which the mixture ratio is kept constant throughout the studies and the overall concentration of the mixture is systematically varied. With this design, interaction among toxic components across different mixture ratios and endpoints (i.e. lethal versus sublethal) is not assessed. In this study, lethal and sublethal (feeding) responses of Daphnia magna individuals to single and binary combinations of similarly and dissimilarly acting chemicals, including the metals (cadmium, copper) and the pyrethroid insecticides (λ-cyhalothrin and deltamethrin), were assayed using a composite experimental design to test for interactions among toxic components across mixture effect levels, mixture ratios, and lethal and sublethal toxic effects. To account for inter-experiment response variability, in each binary mixture toxicity assay the toxicity of the individual mixture constituents was also assessed. Model adequacy was then evaluated by comparing the slopes and elevations of predicted versus observed mixture toxicity curves with those estimated for the individual components. Model predictive abilities changed across endpoints. The IA concept was able to accurately predict mixture toxicities of dissimilarly acting chemicals for lethal responses, whereas the CA concept did so in three out of four pairings for the feeding response, irrespective of the chemical mode of action. Interaction effects across mixture effect levels, evidenced by crossing slopes, were only observed for the binary mixture Cd and Cu for lethal effects
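
    The two reference models can be written in a few lines. Under independent action the fractional effects combine multiplicatively, while concentration addition sums toxic units relative to each component's EC50. The sketch below uses hypothetical log-logistic parameters, not the Daphnia magna estimates of this study.

```python
# Hedged sketch of IA and CA mixture predictions (hypothetical parameters).
import numpy as np

def effect(c, ec50, slope):
    """Two-parameter log-logistic concentration-response, effect in [0,1]."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

ec50 = np.array([10.0, 0.5])    # single-substance EC50s (e.g. metal, pyrethroid)
slope = np.array([2.0, 3.0])
ratio = np.array([0.8, 0.2])    # fixed mixture ratio, fractions sum to 1

c_total = 5.0                   # total mixture concentration
c_i = ratio * c_total

# Independent action: probabilities of response combine multiplicatively.
e_ia = 1.0 - np.prod(1.0 - effect(c_i, ec50, slope))

# Concentration addition: toxic units; TU = 1 marks the predicted EC50
# of the mixture (exact effect level needs a common reference curve).
tu = np.sum(c_i / ec50)
print(f"IA effect {e_ia:.2f}; CA toxic units {tu:.2f}")
```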

  15. Toxicity of binary mixtures of metals and pyrethroid insecticides to Daphnia magna Straus. Implications for multi-substance risks assessment

    International Nuclear Information System (INIS)

    Barata, Carlos; Baird, D.J.; Nogueira, A.J.A.; Soares, A.M.V.M.; Riva, M.C.

    2006-01-01

    Two different concepts, termed concentration addition (CA) and independent action (IA), describe general relationships between the effects of single substances and their corresponding mixtures, allowing calculation of an expected mixture toxicity on the basis of the known toxicities of the mixture components. Both concepts are limited to cases in which all substances in a mixture influence the same experimental endpoint, and they are usually tested against a 'fixed ratio design' in which the mixture ratio is kept constant throughout the studies and the overall concentration of the mixture is systematically varied. With this design, interaction among toxic components across different mixture ratios and endpoints (i.e. lethal versus sublethal) is not assessed. In this study, lethal and sublethal (feeding) responses of Daphnia magna individuals to single and binary combinations of similarly and dissimilarly acting chemicals, including the metals (cadmium, copper) and the pyrethroid insecticides (λ-cyhalothrin and deltamethrin), were assayed using a composite experimental design to test for interactions among toxic components across mixture effect levels, mixture ratios, and lethal and sublethal toxic effects. To account for inter-experiment response variability, in each binary mixture toxicity assay the toxicity of the individual mixture constituents was also assessed. Model adequacy was then evaluated by comparing the slopes and elevations of predicted versus observed mixture toxicity curves with those estimated for the individual components. Model predictive abilities changed across endpoints. The IA concept was able to accurately predict mixture toxicities of dissimilarly acting chemicals for lethal responses, whereas the CA concept did so in three out of four pairings for the feeding response, irrespective of the chemical mode of action. Interaction effects across mixture effect levels, evidenced by crossing slopes, were only observed for the binary mixture Cd and Cu for lethal effects

  16. Deformation Properties and Fatigue of Bituminous Mixtures

    Directory of Open Access Journals (Sweden)

    Frantisek Schlosser

    2013-01-01

    Full Text Available Deformation properties and fatigue performance are important characteristics of asphalt-bound materials used in the construction of pavement layers. Viscoelastic asphalt mixtures are better characterized via dynamic tests. This type of test allows us to compare materials with regard to axle vibrations, which usually lie in the range of 6 Hz–25 Hz under standard conditions. The heat sensitivity of modified asphalt over the range from −20°C to +60°C has a significant impact on the overall characteristics of the mixture. Deformation properties are used as inputs for empirical mixture design, and the fatigue performance of asphalt mixtures reflects the parameters of functional tests. Master curves convey the properties of asphalt mixtures under various conditions and allow us to evaluate them without time-expensive testing.
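
    Master curves are built by shifting data measured at different temperatures onto a reference temperature with a time-temperature superposition factor. The sketch below assumes the WLF form with the so-called universal constants; a real asphalt mixture would use constants fitted to its own data.

```python
# Hedged sketch: WLF shift factors for building a master curve
# ("universal" constants assumed; fit them for a real mixture).
import numpy as np

def wlf_log_shift(T, T_ref, c1=17.44, c2=51.6):
    """log10 a_T from the WLF equation."""
    return -c1 * (T - T_ref) / (c2 + (T - T_ref))

freqs = np.array([6.0, 10.0, 25.0])      # Hz, the axle-vibration range
for T in (0.0, 20.0, 60.0):              # test temperatures, deg C
    reduced = freqs * 10.0 ** wlf_log_shift(T, T_ref=20.0)
    print(T, reduced)                    # reduced frequencies at T_ref
```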

  17. Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design

    Energy Technology Data Exchange (ETDEWEB)

    Das, Sanjoy Kumar, E-mail: sanjoydasju@gmail.com; Khanam, Jasmina; Nanda, Arunabha

    2016-12-01

    In the present investigation, a simplex lattice mixture design was applied to the formulation development and optimization of a controlled-release dosage form of ketoprofen microspheres consisting of polymers such as ethylcellulose and Eudragit® RL 100, formed by an oil-in-oil emulsion solvent evaporation method. The investigation was carried out to observe the effects of polymer amount, stirring speed and emulsifier concentration (% w/w) on percentage yield, average particle size, drug entrapment efficiency and in vitro drug release over 8 h from the microspheres. Analysis of variance (ANOVA) was used to estimate the significance of the models. Numerical optimization was carried out based on the desirability function approach. The optimized formulation (KTF-O) showed a close match between actual and predicted responses, with a desirability factor of 0.811. No adverse reaction between drug and polymers was observed on the basis of Fourier transform infrared (FTIR) spectroscopy and differential scanning calorimetry (DSC) analysis. Scanning electron microscopy (SEM) was carried out to show the discreteness of the microspheres (149.2 ± 1.25 μm) and their surface conditions during pre- and post-dissolution operations. The drug release pattern from KTF-O was best explained by the Korsmeyer-Peppas and Higuchi models. The batch of optimized microspheres was found to have maximum entrapment (~ 90%), minimum loss (~ 10%) and prolonged drug release over 8 h (91.25%), which may be considered favourable criteria for a controlled-release dosage form. - Graphical abstract: Optimization of the preparation method for ketoprofen-loaded microspheres consisting of polymeric blends using simplex lattice mixture design. - Highlights: • Simplex lattice design was used to optimize ketoprofen-loaded microspheres. • A polymeric blend (ethylcellulose and Eudragit® RL 100) was used. • Microspheres were prepared by an oil-in-oil emulsion solvent evaporation method. • Optimized formulation depicted favourable
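
    The numerical optimization via the desirability function mentioned above maps each response onto [0, 1] and combines them geometrically (the Derringer-Suich approach). The target ranges below are hypothetical, chosen only to mirror the yield/entrapment/release responses of the study.

```python
# Hedged sketch of a Derringer-Suich desirability calculation
# (hypothetical specification limits and responses).
import numpy as np

def d_larger(y, low, target, w=1.0):
    """Larger-is-better desirability in [0, 1]."""
    return np.clip((y - low) / (target - low), 0.0, 1.0) ** w

def d_smaller(y, target, high, w=1.0):
    """Smaller-is-better desirability in [0, 1]."""
    return np.clip((high - y) / (high - target), 0.0, 1.0) ** w

d1 = d_larger(y=88.0, low=60.0, target=95.0)   # entrapment efficiency, %
d2 = d_smaller(y=12.0, target=5.0, high=30.0)  # drug loss, %
d3 = d_larger(y=89.0, low=70.0, target=95.0)   # 8 h release, %
D = (d1 * d2 * d3) ** (1.0 / 3.0)              # overall desirability
print(round(float(D), 3))
```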

  18. A primer of statistical methods for correlating parameters and properties of electrospun poly(L-lactide) scaffolds for tissue engineering--PART 1: design of experiments.

    Science.gov (United States)

    Seyedmahmoud, Rasoul; Rainer, Alberto; Mozetic, Pamela; Maria Giannitelli, Sara; Trombetta, Marcella; Traversa, Enrico; Licoccia, Silvia; Rinaldi, Antonio

    2015-01-01

    Tissue engineering scaffolds produced by electrospinning are of enormous interest, but still lack a true understanding of the fundamental connections between the outstanding functional properties, the architecture, the mechanical properties, and the process parameters. Fragmentary results from several parametric studies only render partial insights that are hard to compare and generally miss the role of parameter interactions. To bridge this gap, this article (Part 1 of 2) features a case study on poly-L-lactide scaffolds to demonstrate how statistical methods such as design of experiments can quantitatively identify the correlations existing between key scaffold properties and control parameters in a systematic, consistent, and comprehensive manner, disentangling main effects from interactions. The morphological properties (i.e., fiber distribution and porosity) and mechanical properties (Young's modulus) are "charted" as a function of molecular weight (MW) and other electrospinning process parameters (the Xs), considering single effects as well as interactions between the Xs. For the first time, the major role of the MW emerges clearly in controlling all scaffold properties. The correlation between mechanical and morphological properties is also addressed. © 2014 Wiley Periodicals, Inc.
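
    Disentangling main effects from interactions, as the article emphasizes, reduces to simple contrasts in a coded factorial design. The sketch below uses a hypothetical 2×2 design in molecular weight (A) and one process parameter (B), with made-up fiber-diameter responses.

```python
# Hedged sketch: main effects and interaction from a coded 2^2 factorial
# (hypothetical responses, one replicate per run).
import numpy as np

A = np.array([-1, 1, -1, 1])                 # e.g. molecular weight, low/high
B = np.array([-1, -1, 1, 1])                 # e.g. flow rate, low/high
y = np.array([310.0, 520.0, 330.0, 690.0])   # fiber diameter, nm

main_A = 2 * np.mean(y * A)      # average change as A goes low -> high
main_B = 2 * np.mean(y * B)
inter_AB = 2 * np.mean(y * A * B)  # nonzero => effect of A depends on B
print(main_A, main_B, inter_AB)    # 285.0 95.0 75.0
```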

  19. The optimization of concrete mixtures for use in highway applications

    Science.gov (United States)

    Moini, Mohamadreza

    Conducted research enabled a further reduction of cement content to 250 kg/m3 (420 lb/yd3), as required for the design of sustainable concrete pavements. This research demonstrated that aggregate packing can be used in multiple ways as a tool to optimize aggregate assemblies and achieve the optimal particle size distribution of aggregate blends. The SCMs and air-entraining admixtures were selected to comply with existing WisDOT performance requirements, and chemical admixtures were selected in a separate optimization study excluded from this thesis. The performance of the different concrete mixtures was evaluated for fresh properties, strength development, and compressive and flexural strength from 1 to 360 days. The methods and tools discussed in this research are applicable, but not limited, to concrete pavement applications. Current concrete proportioning standards such as ACI 211 and current WisDOT roadway standard specifications (Part 5: Structures, Section 501: Concrete) have limited or no recommendations, methods or guidelines on aggregate optimization, the use of ternary aggregate blends (such as those used in the asphalt industry), the optimization of SCMs (e.g., class F and C fly ash, slag, metakaolin, silica fume), modern superplasticizers (such as polycarboxylate ether, PCE) or air-entraining admixtures. This research demonstrated that the optimization of concrete mixture proportions can be achieved by the proper selection of optimal aggregate blends, resulting in a 12% to 35% reduction of cement content and a more than 50% enhancement of performance. To prove the proposed concrete proportioning method, the following steps were performed: • The experimental aggregate packing was investigated using northern and southern sources of aggregates from Wisconsin; • The theoretical aggregate packing models were utilized and the results were compared with experiments; • Multiple aggregate optimization methods (e.g., optimal
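
    Aggregate blend optimization of the kind described can be posed as constrained least squares: choose stockpile fractions so the combined gradation tracks a target power curve (a Fuller-Thompson curve with exponent 0.45 is assumed here). Sieve sizes and stockpile gradations below are hypothetical.

```python
# Hedged sketch: fit a ternary aggregate blend to a Fuller-Thompson
# target gradation (hypothetical sieves and stockpile gradations).
import numpy as np
from scipy.optimize import minimize

sieves = np.array([0.15, 0.3, 0.6, 1.18, 2.36, 4.75, 9.5, 19.0])  # mm
target = (sieves / sieves[-1]) ** 0.45 * 100.0  # percent passing, q = 0.45

passing = np.array([                            # percent passing per stockpile
    [95, 99, 100, 100, 100, 100, 100, 100],     # fine sand
    [10, 25, 48, 70, 92, 99, 100, 100],         # intermediate aggregate
    [2, 3, 4, 5, 10, 35, 75, 100],              # coarse aggregate
], dtype=float)

def misfit(w):
    return np.sum((w @ passing - target) ** 2)

res = minimize(misfit, x0=[1 / 3] * 3, method="SLSQP",
               bounds=[(0.0, 1.0)] * 3,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
print(np.round(res.x, 3))   # blend fractions of the three stockpiles
```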

  20. Determination of variation parameters as a crucial step in designing TMT-based clinical proteomics experiments.

    Directory of Open Access Journals (Sweden)

    Evelyne Maes

    Full Text Available In quantitative shotgun proteomic analyses by liquid chromatography and mass spectrometry, a rigorous study design is necessary in order to obtain statistically relevant results. Hypothesis testing, sample size calculation and power estimation are fundamental concepts that require consideration when designing an experiment. For this reason, the reproducibility and variability of the proteomic platform need to be assessed. In this study, we evaluate the technical (sample preparation), labeling (isobaric labels), and total (biological + technical + labeling + experimental) variability and reproducibility of a workflow that employs a shotgun LC-MS/MS approach in combination with TMT peptide labeling for the quantification of the peripheral blood mononuclear cell (PBMC) proteome. We illustrate that the variability induced by TMT labeling is small compared to the technical variation. The latter is also responsible for a substantial part of the total variation. Prior knowledge about the experimental variability allows for a correct design, a prerequisite for the detection of biologically significant disease-specific differential proteins in clinical proteomics experiments.
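
    Once the total coefficient of variation is known, the sample size needed to detect a given fold change follows from the standard two-sample formula on the log scale. The sketch below is a z-approximation under assumed normality; the CV and fold change are illustrative, not the PBMC estimates.

```python
# Hedged sketch: per-group sample size from a total CV and a target fold
# change (two-sided z-approximation on the log-intensity scale).
import math
from scipy.stats import norm

def n_per_group(cv_total, fold_change, alpha=0.05, power=0.80):
    sigma = math.sqrt(math.log(1.0 + cv_total ** 2))  # SD of log intensity
    delta = math.log(fold_change)                     # effect on log scale
    z = norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)
    return math.ceil(2.0 * (z * sigma / delta) ** 2)

print(n_per_group(cv_total=0.20, fold_change=1.5))  # e.g. 4 per group
```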

  1. Experiments Planning, Analysis, and Optimization

    CERN Document Server

    Wu, C F Jeff

    2011-01-01

    Praise for the First Edition: "If you . . . want an up-to-date, definitive reference written by authors who have contributed much to this field, then this book is an essential addition to your library."-Journal of the American Statistical Association Fully updated to reflect the major progress in the use of statistically designed experiments for product and process improvement, Experiments, Second Edition introduces some of the newest discoveries-and sheds further light on existing ones-on the design and analysis of experiments and their applications in system optimization, robustness, and tre

  2. Experience in Design and Learning Approaches – Enhancing the Framework for Experience

    Directory of Open Access Journals (Sweden)

    Merja L.M. Bauters

    2017-06-01

    Full Text Available In design and learning studies, an increasing amount of attention has been paid to experience. Many design approaches relate experience to embodiment and phenomenology. The growth in the number of applications that use the Internet of Things (IoT has shifted human interactions from mobile devices and computers to tangible, material things. In education, the pressure to learn and update skills and knowledge, especially in work environments, has underlined the challenge of understanding how workers learn from reflection while working. These directions have been fuelled by research findings in the neurosciences, embodied cognition, the extended phenomenological–cognitive system and the role of emotions in decision-making and meaning making. The perspective on experience in different disciplines varies, and the aim is often to categorise experience. These approaches provide a worthwhile view of the importance of experience in learning and design, such as the recent emphasis on conceptual and epistemological knowledge creation. In pragmatism, experience plays a considerable role in research, art, communication and reflection. Therefore, I rely on Peirce’s communicative theory of signs and Dewey’s philosophy of experience to examine how experience is connected to reflection and therefore how it is necessarily tangible.

  3. Preliminary results of Resistive Plate Chambers operated with eco-friendly gas mixtures for application in the CMS experiment

    CERN Document Server

    Abbrescia, Marcello; Benussi, Luigi; Bianco, Stefano; Cauwenbergh, Simon Marc D; Ferrini, Mauro; Muhammad, Saleh; Passamonti, L; Pierluigi, Daniele; Piccolo, Davide; Primavera, Federica; Russo, Alessandro; Saviano, G; Tytgat, Michael

    2016-01-01

    The operation of Resistive Plate Chambers in LHC experiments requires F-based gases for optimal performance. Recent regulations demand that the use of environmentally unfriendly F-based gases be limited or banned. In view of the CMS experiment upgrade, several tests are ongoing to measure the performance of the detector in terms of efficiency, streamer probability, induced charge and time resolution. Prototype chambers with readout pads and with the standard CMS electronics setup are under test. In this talk, preliminary results on the performance of RPCs operated with a potential eco-friendly gas candidate, 1,3,3,3-tetrafluoropropene (commercially known as HFO-1234ze), and with CO2-based gas mixtures are presented and discussed for possible application in the CMS experiment.

  4. Nutritional and sensory characteristics of gluten-free quinoa (Chenopodium quinoa Willd)-based cookies development using an experimental mixture design.

    Science.gov (United States)

    Brito, Isabelle L; de Souza, Evandro Leite; Felex, Suênia Samara Santos; Madruga, Marta Suely; Yamashita, Fábio; Magnani, Marciane

    2015-09-01

    The aim of this study was to develop a gluten-free formulation of quinoa (Chenopodium quinoa Willd.)-based cookies using an experimental mixture design to optimize a ternary mixture of quinoa flour, quinoa flakes and corn starch for parameters of colour, specific volume and hardness. Nutritional and sensory aspects of the optimized formulation were also assessed. Corn starch had a positive effect on the lightness of the cookies, but increased amounts of quinoa flour and quinoa flakes in the mixture resulted in a darker product. Quinoa flour had a negative effect on the specific volume, producing less bulky cookies, while quinoa flour and quinoa flakes had a positive synergistic effect on the hardness of the cookies. According to the results, and considering the desirability profile for colour, hardness and specific volume in gluten-free cookies, the optimized formulation contains 30 % quinoa flour, 25 % quinoa flakes and 45 % corn starch. The quinoa-based cookie obtained was characterized as a product rich in dietary fibre, a good source of essential amino acids, linolenic acid and minerals, with good sensory acceptability. These findings report for the first time the application of quinoa processed as flour and flakes, in a mixture with corn starch, as an alternative ingredient for formulations of gluten-free cookie-type biscuits.
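
    Ternary mixture designs like this one are typically analyzed with a Scheffé canonical polynomial, which has no intercept and blending terms for each pair of components. The sketch below fits a quadratic Scheffé model to hypothetical hardness values at {3,2} simplex-lattice points and predicts the response at the reported optimum blend.

```python
# Hedged sketch: quadratic Scheffe mixture model for three components
# (hypothetical hardness responses at simplex-lattice points).
import numpy as np

def scheffe_quadratic(X):
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# {3,2} simplex lattice: pure blends and binary midpoints
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
y = np.array([41.2, 38.5, 55.0, 44.1, 52.3, 49.8])  # hypothetical hardness

beta, *_ = np.linalg.lstsq(scheffe_quadratic(X), y, rcond=None)
optimum = np.array([[0.30, 0.25, 0.45]])   # flour/flakes/starch, from paper
print(scheffe_quadratic(optimum) @ beta)   # predicted response there
```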

  5. Design of experiments

    International Nuclear Information System (INIS)

    Drijard, D.

    1978-01-01

    The paper is mostly devoted to simulation problems. The part concerning detector optimization was essentially treated during the School in a seminar about the Split-Field Magnet (SFM) detector installed at the CERN Intersecting Storage Rings (ISR). It is not given in the written notes, since very little of general use, beyond the trivial, can be said about this subject. The author describes in detail the tools which allow such studies to be made. The notes start with a summary of statistical terms. The main emphasis is then put on Monte Carlo methods and the generation of random variables. The last section treats the utilization of detector acceptance, which will be one of the most important parts to optimize when designing a detector. (Auth.)
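
    The generation of random variables mentioned above is most simply illustrated by inverse-transform sampling: draw u uniform on (0, 1) and apply the inverse CDF. A minimal sketch for an exponential variate:

```python
# Inverse-transform sampling of an exponential distribution:
# F(x) = 1 - exp(-rate*x)  =>  F^{-1}(u) = -ln(1 - u) / rate.
import numpy as np

def sample_exponential(n, rate, rng):
    u = rng.uniform(size=n)
    return -np.log1p(-u) / rate

rng = np.random.default_rng(42)
x = sample_exponential(100_000, rate=2.0, rng=rng)
print(x.mean())   # ~0.5 = 1/rate
```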

  6. Fission product release from core-concrete mixtures

    International Nuclear Information System (INIS)

    Roche, M.F.; Settle, J.; Leibowitz, L.; Johnson, C.E.; Ritzman, R.L.

    1988-01-01

    The objective of this research is to measure the amounts of strontium, barium, and lanthanum that are vaporized from core-concrete mixtures. The measurements are being done using a transpiration method. Mixtures of limestone-aggregate concrete, urania doped with small amounts of La, Sr, Ba, and Zr oxides, and stainless steel were vaporized at 2150 K from a zirconia crucible into flowing He-6% H2-0.06% H2O (a partial molar free energy of oxygen of -420 kJ). The amounts vaporized were determined by weight change and by chemical analyses of condensates. The major phases present in the mixture were inferred from electron probe microanalysis (EPM). They were: (1) urania containing calcia and zirconia, (2) calcium zirconate, (3) a calcium magnesium silicate, and (4) magnesia. About 10% of the zirconia crucible was dissolved by the concrete-urania mixture during the experiment, which accounts for the presence of zirconia-containing major phases. To circumvent the problem of zirconia dissolution, we repeated the experiments using mixtures of the limestone-aggregate concrete and the doped urania in molybdenum crucibles. These studies show that thermodynamic calculations of the release of refractory fission products will yield release fractions that are a factor of sixteen too high if the effects of zirconate formation are ignored

  7. THE INTEGRATED SHORT-TERM STATISTICAL SURVEYS: EXPERIENCE OF NBS IN MOLDOVA

    Directory of Open Access Journals (Sweden)

    Oleg CARA

    2012-07-01

    Full Text Available The users' rising need for relevant, reliable, coherent and timely data for the early diagnosis of economic vulnerability and of turning points in business cycles, especially during a financial and economic crisis, calls for a prompt answer, coordinated by statistical institutions. High-quality short-term statistics are of special interest for emerging market economies, such as the Moldovan one, which are extremely vulnerable when facing economic recession. Answering the challenge of producing a coherent and adequate image of economic activity, by using the system of indicators and definitions efficiently applied at the level of the European Union, the National Bureau of Statistics (NBS) of the Republic of Moldova launched the development of an integrated system of short-term statistics (STS) based on advanced international experience. Thus, in 2011, the NBS implemented the integrated statistical survey on STS based on consistent concepts harmonized with EU standards. The integration of production processes, which were previously separate, is based on a common technical infrastructure and on standardized procedures and techniques for data production. This complex survey with a holistic approach has allowed the consolidation of statistical data quality, comparable at the European level, and a significant reduction of the information burden on business units, especially those of small size. The reform of STS based on the integrated survey has been possible thanks to the consistent methodological and practical support given to the NBS by the National Institute of Statistics (INS) of Romania, for which we thank our Romanian colleagues.

  8. Graphene/TiO2/ZSM-5 composites synthesized by mixture design were used for photocatalytic degradation of oxytetracycline under visible light: Mechanism and biotoxicity

    International Nuclear Information System (INIS)

    Hu, Xin-Yan; Zhou, Kefu; Chen, Bor-Yann; Chang, Chang-Tang

    2016-01-01

    Graphical abstract: The mechanism of OTC degradation can be described as follows. First, the OTC molecule is adsorbed onto the surface of the GTZ material. Conduction-band electrons (e−) and valence-band holes (h+) are generated when the aqueous GTZ suspension is irradiated with visible light. The generation of (e−/h+) pairs leads to the formation of reactive oxygen species. The ·OH and ·O2− radicals can oxidize OTC molecules, resulting in the degradation and mineralization of the organics. - Highlights: • Determine the optimal composite of graphene, TiO2, and zeolite for maximal photodegradation efficiency via a triangular mixture design. • Unravel the most promising composites for high stability and absorptive capability in photocatalytic degradation. • Disclose time-series profiles of the toxicity of advanced oxidation process (AOP) treatment of wastewater. • Propose plausible routes for the mechanism of photocatalytic degradation of OTC. - Abstract: This first-attempt study used a mixture design of experiments to obtain the most promising composites of TiO2 loaded on zeolite and graphene for maximal photocatalytic degradation of oxytetracycline (OTC). The optimal weight ratio of graphene, titanium dioxide (TiO2), and zeolite was 1:8:1, determined via a simplex lattice mixture experimental design. The composite material was characterized by XRD, UV-vis, TEM and EDS analyses. The findings showed that the composite material had higher stability and stronger absorption of visible light. In addition, it was uniformly dispersed, with promising adsorption characteristics. OTC was used as a model toxicant to evaluate the photodegradation efficiency of GTZ (1:8:1). At optimal operating conditions (i.e., pH 7 and 25 °C), complete degradation (ca. 100%) was achieved in 180 min. The biotoxicity of the degraded intermediates of OTC toward the cell growth of Escherichia coli DH5α was also assayed. After 180 min of photocatalytic treatment, the OTC solution treated

  9. International Conference on Trends and Perspectives in Linear Statistical Inference

    CERN Document Server

    Rosen, Dietrich

    2018-01-01

    This volume features selected contributions on a variety of topics related to linear statistical inference. The peer-reviewed papers from the International Conference on Trends and Perspectives in Linear Statistical Inference (LinStat 2016) held in Istanbul, Turkey, 22-25 August 2016, cover topics in both theoretical and applied statistics, such as linear models, high-dimensional statistics, computational statistics, the design of experiments, and multivariate analysis. The book is intended for statisticians, Ph.D. students, and professionals who are interested in statistical inference.

  10. STORYPLY : designing for user experiences using storycraft

    NARCIS (Netherlands)

    Atasoy, B.; Martens, J.B.O.S.; Markopoulos, P.; Martens, J.B.; Malins, J.; Coninx, K.; Liapis, A.

    2016-01-01

    The role of design shifts from designing objects towards designing for experiences. The design profession has to follow this trend, but the current skill-set of designers focuses mainly on objects: their form, function, manufacturing and interaction. However, contemporary methods and tools that

  11. The Use of D-Optimal Mixture Design in Optimising Okara Soap Formulation for Stratum Corneum Application

    Science.gov (United States)

    Borhan, Farrah Payyadhah; Abd Gani, Siti Salwa; Shamsuddin, Rosnah

    2014-01-01

    Okara, soybean waste from tofu and soymilk production, was utilised as a natural antioxidant in soap formulation for stratum corneum application. D-optimal mixture design was employed to investigate the influence of the main compositions of okara soap containing different fatty acid and oils (virgin coconut oil A (24–28% w/w), olive oil B (15–20% w/w), palm oil C (6–10% w/w), castor oil D (15–20% w/w), cocoa butter E (6–10% w/w), and okara F (2–7% w/w)) by saponification process on the response hardness of the soap. The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for okara soap hardness in terms of the six design factors considered in this study. Results revealed that the best mixture was the formulation that included 26.537% A, 19.999% B, 9.998% C, 16.241% D, 7.633% E, and 7.000% F. The results proved that the difference in the level of fatty acid and oils in the formulation significantly affects the hardness of soap. Depending on the desirable level of those six variables, creation of okara based soap with desirable properties better than those of commercial ones is possible. PMID:25548777
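
    D-optimal designs such as the one used here are typically found by a point-exchange search that maximizes det(X'X) of the model matrix over a candidate set. The sketch below runs a greedy exchange for a three-component Scheffé quadratic on a constrained simplex grid; the component bounds are hypothetical stand-ins for the six-component soap system.

```python
# Hedged sketch: greedy point-exchange search for a D-optimal mixture
# design (3 components with hypothetical bounds; Scheffe quadratic model).
import numpy as np

def model_matrix(X):
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

cand = []                                     # candidate simplex grid
for a in np.arange(0.20, 0.451, 0.05):
    for b in np.arange(0.10, 0.501, 0.05):
        c = 1.0 - a - b
        if 0.05 <= c <= 0.60:
            cand.append([a, b, c])
cand = np.asarray(cand)

def log_det(rows):
    F = model_matrix(cand[list(rows)])
    sign, ld = np.linalg.slogdet(F.T @ F)
    return ld if sign > 0 else -np.inf

rng = np.random.default_rng(7)
design = list(rng.choice(len(cand), size=8, replace=False))
improved = True
while improved:                               # Fedorov-style exchanges
    improved = False
    for i in range(len(design)):
        for j in range(len(cand)):
            trial = design[:i] + [j] + design[i + 1:]
            if log_det(trial) > log_det(design) + 1e-9:
                design, improved = trial, True
print(cand[design])                           # selected design points
```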

  12. The Use of D-Optimal Mixture Design in Optimising Okara Soap Formulation for Stratum Corneum Application

    Directory of Open Access Journals (Sweden)

    Farrah Payyadhah Borhan

    2014-01-01

    Full Text Available Okara, soybean waste from tofu and soymilk production, was utilised as a natural antioxidant in soap formulation for stratum corneum application. D-optimal mixture design was employed to investigate the influence of the main compositions of okara soap containing different fatty acid and oils (virgin coconut oil A (24–28% w/w), olive oil B (15–20% w/w), palm oil C (6–10% w/w), castor oil D (15–20% w/w), cocoa butter E (6–10% w/w), and okara F (2–7% w/w)) by saponification process on the response hardness of the soap. The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for okara soap hardness in terms of the six design factors considered in this study. Results revealed that the best mixture was the formulation that included 26.537% A, 19.999% B, 9.998% C, 16.241% D, 7.633% E, and 7.000% F. The results proved that the difference in the level of fatty acid and oils in the formulation significantly affects the hardness of soap. Depending on the desirable level of those six variables, creation of okara based soap with desirable properties better than those of commercial ones is possible.

  13. An Exercise on Calibration: DRIFTS Study of Binary Mixtures of Calcite and Dolomite with Partially Overlapping Spectral Features

    Science.gov (United States)

    De Lorenzi Pezzolo, Alessandra

    2013-01-01

    Unlike most spectroscopic calibrations that are based on the study of well-separated features ascribable to the different components, this laboratory experience is especially designed to exploit spectral features that are nearly overlapping. The investigated system consists of a binary mixture of two commonly occurring minerals, calcite and…
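
    With overlapping bands, calibration can still proceed by classical least squares: regress the mixture spectrum on the pure-component spectra measured on the same wavenumber grid. The sketch below uses synthetic Gaussian bands standing in for the overlapping calcite and dolomite features.

```python
# Hedged sketch: classical least-squares unmixing of two overlapping
# bands (synthetic spectra, not the actual DRIFTS data).
import numpy as np

wn = np.linspace(1400.0, 1500.0, 200)            # wavenumber axis, cm^-1
band = lambda center, width: np.exp(-0.5 * ((wn - center) / width) ** 2)
S = np.column_stack([band(1435.0, 12.0),         # "calcite" reference
                     band(1445.0, 12.0)])        # "dolomite" reference

rng = np.random.default_rng(8)
true_frac = np.array([0.7, 0.3])
mix = S @ true_frac + 0.01 * rng.normal(size=wn.size)

frac, *_ = np.linalg.lstsq(S, mix, rcond=None)   # solve mix ~ S @ frac
print(frac / frac.sum())                         # ~[0.7, 0.3]
```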

  14. Protective effect of Hongxue tea mixture against radiation injury in mice

    International Nuclear Information System (INIS)

    Zhao Chun; Zhang Xuehui; Wang Qi

    2005-01-01

    Objective: To develop an anti-radiation health food from biological sources in Yunnan. Methods: A screening test of health foods from biological sources against radiation injury in mice was performed, which indicated that Hong-Xue Tea Mixture has a protective effect against radiation injury; a dose-effect experiment on Hong-Xue Tea Mixture was then conducted. Micronucleus rates in bone marrow polychromatophilic erythrocytes were examined in each dose group of mice, and leucocyte counts and the 30-day survival rate of mice following whole-body 5.0 Gy γ irradiation were also determined. Results: The research showed that Hong-Xue Tea Mixture and Spirulina Platensis Mixture, among the biological sources tested, have a protective effect against radiation injury in mice. The dose-effect experiment showed that low, medium and high doses of Hong-Xue Tea Mixture can significantly decrease the bone marrow PECMN rate of mice and increase leucocyte counts and the 30-day survival rate. Conclusion: Hong-Xue Tea Mixture has potent protective effects against radiation injury in mice. (authors)

  15. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given of how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors can be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
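
    The essence of the NDF construction can be mimicked with an empirical quantile: collect calculated-to-experimental (C/E) ratios for a response across integral experiments and take the quantile matching the desired confidence. The C/E values below are invented for illustration.

```python
# Hedged sketch: a multiplicative safety factor from an empirical
# distribution of C/E ratios (invented values, one response).
import numpy as np

ce = np.array([0.92, 0.95, 0.97, 1.01, 0.99, 0.94, 1.03, 0.96, 0.90, 0.98])

conf = 0.95
# If C/E < 1 the calculation underpredicts; scale the calculation so that,
# with the requested confidence, the design value covers the true value.
sf = 1.0 / np.quantile(ce, 1.0 - conf)
print(f"safety factor {sf:.3f} at {conf:.0%} confidence")
# A parametric variant would fit the NDF (e.g. a normal) to the prediction
# uncertainties and use its fitted quantile instead.
```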

  16. Statistical methods for evaluating the attainment of cleanup standards

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.
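
    A typical attainment test of the kind these volumes standardize is a one-sided, one-sample t-test of the site mean against the standard. The concentrations below are fabricated solely to show the mechanics.

```python
# Hedged sketch: one-sided attainment test against a cleanup standard
# (fabricated sample data).
import numpy as np
from scipy import stats

conc = np.array([3.1, 4.7, 2.8, 5.2, 3.9, 4.1, 3.3, 4.8])  # remediated soil
standard = 5.0                                             # cleanup standard

# H0: true mean >= standard; attainment is shown if H0 is rejected.
t, p_two = stats.ttest_1samp(conc, popmean=standard)
p_one = p_two / 2.0 if t < 0 else 1.0 - p_two / 2.0
print(f"t = {t:.2f}, one-sided p = {p_one:.4f}")
```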

  17. Functionality of disintegrants and their mixtures in enabling fast disintegration of tablets by a quality by design approach.

    Science.gov (United States)

    Desai, Parind Mahendrakumar; Er, Patrick Xuan Hua; Liew, Celine Valeria; Heng, Paul Wan Sia

    2014-10-01

    Investigation of the effect of disintegrants on the disintegration time and hardness of rapidly disintegrating tablets (RDTs) was carried out using a quality by design (QbD) paradigm. Ascorbic acid, aspirin, and ibuprofen, which have different water solubilities, were chosen as the model drugs. Disintegration time and hardness of RDTs were determined and modeled by executing a combined optimal design. The generated models were validated and used for further analysis. Sodium starch glycolate, croscarmellose sodium, and crospovidone were found to lengthen disintegration time when utilized at high concentrations. Sodium starch glycolate and crospovidone worked synergistically in aspirin RDTs to decrease disintegration time. Sodium starch glycolate-crospovidone mixtures, as well as croscarmellose sodium-crospovidone mixtures, also decreased disintegration time in ibuprofen RDTs at high compression pressures as compared to the disintegrants used alone. The use of sodium starch glycolate in RDTs with highly water-soluble active ingredients like ascorbic acid slowed disintegration, while microcrystalline cellulose and crospovidone drew water into the tablet rapidly and quickened disintegration. Graphical optimization analysis demonstrated that RDTs with the desired disintegration times and hardness can be formulated over a larger area of design space by combining disintegrants at different compression pressures. QbD was an efficient and effective paradigm for understanding formulation and process parameters and building quality into RDT formulated systems.

  18. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Science.gov (United States)

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards, enabling the separation of random noise from systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts for evaluating proteomic experiments is illustrated in two case studies.
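
    The control charts at the core of this workflow reduce to Shewhart limits derived from user-defined QC runs. A minimal sketch (not the SProCoP R code) for one metric, retention time:

```python
# Hedged sketch of Shewhart control-chart limits for a QC metric
# (hypothetical retention times of a QC peptide, in minutes).
import numpy as np

def shewhart_limits(qc_values):
    mu = np.mean(qc_values)
    sd = np.std(qc_values, ddof=1)
    return mu - 3.0 * sd, mu + 3.0 * sd      # empirical 3-sigma limits

history = np.array([22.1, 22.3, 22.0, 22.2, 22.4, 22.1, 22.2, 22.3])
lo, hi = shewhart_limits(history)
new_run = 23.1
print("out of control" if not lo <= new_run <= hi else "in control")
```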

  19. Study on mechanical properties of fly ash impregnated glass fiber reinforced polymer composites using mixture design analysis

    International Nuclear Information System (INIS)

    Satheesh Raja, R.; Manisekar, K.; Manikandan, V.

    2014-01-01

    Highlights: • FRP composites with and without fly ash filler were prepared. • Mechanical properties of the composites were analyzed. • The Mixture Design Method was used to model the system. • Experimental and mathematical model results were compared. - Abstract: This paper describes the mechanical behavior of a fly ash impregnated E-glass fiber reinforced polymer composite (GFRP). Initially, the proportions of fiber and resin were optimized from analysis of the mechanical properties of the GFRP. It was observed that 30 wt% of E-glass in the GFRP without filler material yields better results. Then, based on the optimized value of resin content, varying percentages of E-glass and fly ash were added to fabricate the hybrid composites. The results obtained in this study were mathematically evaluated using the Mixture Design Method. Predictions show that a 10 wt% addition of fly ash with the fiber improves the mechanical properties of the composites. The fly ash impregnated GFRP yields a significant improvement in mechanical strength compared to the GFRP without filler material. The surface morphologies of the fractured specimens were characterized using scanning electron microscopy (SEM). The chemical composition and surface morphology of the fly ash were analyzed using energy dispersive spectroscopy (EDS) and SEM

  20. Gaussian Process-Mixture Conditional Heteroscedasticity.

    Science.gov (United States)

    Platanios, Emmanouil A; Chatzis, Sotirios P

    2014-05-01

    Generalized autoregressive conditional heteroscedasticity (GARCH) models have long been considered as one of the most successful families of approaches for volatility modeling in financial return series. In this paper, we propose an alternative approach based on methodologies widely used in the field of statistical machine learning. Specifically, we propose a novel nonparametric Bayesian mixture of Gaussian process regression models, each component of which models the noise variance process that contaminates the observed data as a separate latent Gaussian process driven by the observed data. This way, we essentially obtain a Gaussian process-mixture conditional heteroscedasticity (GPMCH) model for volatility modeling in financial return series. We impose a nonparametric prior with power-law nature over the distribution of the model mixture components, namely the Pitman-Yor process prior, to allow for better capturing modeled data distributions with heavy tails and skewness. Finally, we provide a copula-based approach for obtaining a predictive posterior for the covariances over the asset returns modeled by means of a postulated GPMCH model. We evaluate the efficacy of our approach in a number of benchmark scenarios, and compare its performance to state-of-the-art methodologies.
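
    The Pitman-Yor prior mentioned above is easiest to see through its stick-breaking construction, which generalizes the Dirichlet process with a discount parameter that thickens the tail of small mixture weights. A truncated sketch:

```python
# Hedged sketch: truncated stick-breaking draw from a Pitman-Yor process
# (discount and strength values are illustrative).
import numpy as np

def pitman_yor_weights(n, discount, strength, rng):
    betas = rng.beta(1.0 - discount,
                     strength + discount * np.arange(1, n + 1))
    sticks = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * sticks

rng = np.random.default_rng(9)
w = pitman_yor_weights(50, discount=0.3, strength=1.0, rng=rng)
print(w[:5], w.sum())   # discount = 0 recovers the Dirichlet process
```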

  1. The ultrasound-assisted oxidative scission of monoenic fatty acids by ruthenium tetroxide catalysis: influence of the mixture of solvents.

    Science.gov (United States)

    Rup, Sandrine; Zimmermann, François; Meux, Eric; Schneider, Michel; Sindt, Michele; Oget, Nicolas

    2009-02-01

    Carboxylic acids and diacids were synthesized from monoenic fatty acids by using RuO4 catalysis, under ultrasonic irradiation, in various mixtures of solvents. Ultrasound combined with Aliquat 336 promoted, in water, the quantitative oxidative cleavage of the CH=CH bond of oleic acid. A design of experiments (DOE) shows that the optimal mixture of solvents (H2O/MeCN, ratio 1/1, 2.2% RuCl3/4.1 eq. NaIO4) gives 81% azelaic acid and 97% pelargonic acid. With the binary heterogeneous mixture H2O/AcOEt, the oxidation of oleic acid leads to a third product, the alpha-dione 9,10-dioxostearic acid.

  2. The Mathematics of Symmetrical Factorial Designs

    Indian Academy of Sciences (India)

    The Mathematics of Symmetrical Factorial Designs. Mausumi Bose (nee Sen) obtained her MSc degree in Statistics from the Calcutta University and her PhD degree from the Indian Statistical Institute. She is on the faculty of the Indian Statistical Institute. Her main field of research interest is design and analysis of experiments.

  3. Impact of LMFBR operating experience on PFBR design

    International Nuclear Information System (INIS)

    Bhoje, S.B.; Chetal, S.C.; Chellapandi, P.; Govindarajan, S.; Lee, S.M.; Kameswara Rao, A.S.L.; Prabhakar, R.; Raghupathy, S.; Sodhi, B.S.; Sundaramoorthy, T.R.; Vaidyanathan, G.

    2000-01-01

    PFBR is a 500 MWe, sodium cooled, pool type, fast breeder reactor currently under detailed design. It is essential to reduce the capital cost of PFBR in order to make it competitive with thermal reactors. Operating experience of LMFBRs provides a vital input towards simplification of the design, improving its reliability, enhancing safety and achieving overall cost reduction. This paper includes a summary of LMFBR operating experience and details the design features of PFBR as influenced by operating experience of LMFBRs. (author)

  4. Fear and loathing: undergraduate nursing students' experiences of a mandatory course in applied statistics.

    Science.gov (United States)

    Hagen, Brad; Awosoga, Oluwagbohunmi A; Kellett, Peter; Damgaard, Marie

    2013-04-23

    This article describes the results of a qualitative research study evaluating nursing students' experiences of a mandatory course in applied statistics, and the perceived effectiveness of teaching methods implemented during the course. Fifteen nursing students in the third year of a four-year baccalaureate program in nursing participated in focus groups before and after taking the mandatory course in statistics. The interviews were transcribed and analyzed using content analysis to reveal four major themes: (i) "one of those courses you throw out?," (ii) "numbers and terrifying equations," (iii) "first aid for statistics casualties," and (iv) "re-thinking curriculum." Overall, the data revealed that although nursing students initially enter statistics courses with considerable skepticism, fear, and anxiety, there are a number of concrete actions statistics instructors can take to reduce student fear and increase the perceived relevance of courses in statistics.

  5. Experimental mixture design as a tool to optimize the growth of various Ganoderma species cultivated on media with different sugars

    Directory of Open Access Journals (Sweden)

    Yit Kheng Goh

    2016-01-01

    The influence of different medium components (glucose, sucrose, and fructose) on the growth of different Ganoderma isolates and species was investigated using mixture design. Ten sugar combinations based on the three simple sugars were generated at two different concentrations, namely 3.3% and 16.7%, which represented low and high sugar levels, respectively. The media were adjusted to either pH 5 or 8. Ganoderma isolates (two G. boninense from oil palm, one Ganoderma species from coconut palm, G. lingzhi, and G. australe from tower tree) grew faster at pH 8. Ganoderma lingzhi proliferated at the slowest rate of all the tested Ganoderma species in all the media studied, whereas the G. boninense isolates grew the fastest. Different Ganoderma species were found to have different sugar preferences. This study illustrated that mixture design can be used to determine the optimal combinations of sugars or other nutrient/chemical components of media for fungal growth.

  6. Evaluation of Accuracy of Calculational Prediction of Criticality Based on ICSBEP Handbook Experiments

    International Nuclear Information System (INIS)

    Golovko, Yury; Rozhikhin, Yevgeniy; Tsibulya, Anatoly; Koscheev, Vladimir

    2008-01-01

    Experiments with plutonium, low-enriched uranium and uranium-233 from the ICSBEP Handbook are considered in this paper. Among these experiments, only those that seem most relevant to evaluating the uncertainty of the critical mass of mixtures of plutonium, low-enriched uranium or uranium-233 with light water were selected. All selected experiments were examined; covariance matrices of criticality uncertainties were developed, and some uncertainties were revised. A statistical analysis of these experiments was performed, and some contradictions were discovered and eliminated. The accuracy of calculational prediction of criticality was evaluated using the internally consistent set of experiments with plutonium, low-enriched uranium and uranium-233 remaining after the statistical analysis. The application objects for the evaluation were water-reflected spherical systems of homogeneous aqueous mixtures of plutonium, low-enriched uranium or uranium-233 at different concentrations, which are simplified models of apparatus of the external fuel cycle. It is shown that the procedure considerably reduces the uncertainty in k_eff caused by the uncertainties in neutron cross-sections. It is also shown that the results are practically independent of the initial covariance matrices of nuclear data uncertainties. (authors)

  7. A Laboratory Experiment, Based on the Maillard Reaction, Conducted as a Project in Introductory Statistics

    Science.gov (United States)

    Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh

    2005-01-01

    A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…

  8. Modeling abundance using N-mixture models: the importance of considering ecological mechanisms.

    Science.gov (United States)

    Joseph, Liana N; Elkin, Ché; Martin, Tara G; Possingham, Hugh P

    2009-04-01

    Predicting abundance across a species' distribution is useful for studies of ecology and biodiversity management. Modeling of survey data in relation to environmental variables can be a powerful method for extrapolating abundances across a species' distribution and, consequently, calculating total abundances and ultimately trends. Research in this area has demonstrated that models of abundance are often unstable and produce spurious estimates, and until recently our ability to remove detection error limited the development of accurate models. The N-mixture model accounts for detection and abundance simultaneously and has been a significant advance in abundance modeling. Case studies that have tested these new models have demonstrated success for some species, but doubt remains over the appropriateness of standard N-mixture models for many species. Here we develop the N-mixture model to accommodate zero-inflated data, a common occurrence in ecology, by employing zero-inflated count models. To our knowledge, this is the first application of this method to modeling count data. We use four variants of the N-mixture model (Poisson, zero-inflated Poisson, negative binomial, and zero-inflated negative binomial) to model abundance, occupancy (zero-inflated models only) and detection probability of six birds in South Australia. We assess models by their statistical fit and the ecological realism of the parameter estimates. Specifically, we assess the statistical fit with AIC and assess the ecological realism by comparing the parameter estimates with expected values derived from literature, ecological theory, and expert opinion. We demonstrate that, despite being frequently ranked the "best model" according to AIC, the negative binomial variants of the N-mixture often produce ecologically unrealistic parameter estimates. The zero-inflated Poisson variant is preferable to the negative binomial variants of the N-mixture, as it models an ecological mechanism rather than a
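    A full N-mixture model estimates abundance and detection probability jointly; the Python sketch below is a simplified illustration rather than the authors' model, showing only the zero-inflation ingredient by fitting a zero-inflated Poisson to simulated counts via maximum likelihood (all values synthetic).

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import poisson

      def zip_nll(params, y):
          """Negative log-likelihood of a zero-inflated Poisson."""
          psi = 1.0 / (1.0 + np.exp(-params[0]))   # zero-inflation prob (logit scale)
          lam = np.exp(params[1])                  # Poisson mean (log scale)
          p0 = psi + (1 - psi) * poisson.pmf(0, lam)
          pk = (1 - psi) * poisson.pmf(y, lam)
          ll = np.where(y == 0, np.log(p0), np.log(np.maximum(pk, 1e-300)))
          return -ll.sum()

      rng = np.random.default_rng(1)
      y = np.where(rng.random(500) < 0.4, 0, rng.poisson(3.0, 500))  # 40% excess zeros
      fit = minimize(zip_nll, x0=[0.0, 0.0], args=(y,))
      print(1 / (1 + np.exp(-fit.x[0])), np.exp(fit.x[1]))  # roughly 0.4 and 3.0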

  9. Composition dependence of the synergistic effect of nucleating agent and plasticizer in poly(lactic acid: A Mixture Design study

    Directory of Open Access Journals (Sweden)

    M. K. Fehri

    2016-04-01

    Blends consisting of commercial poly(lactic acid) (PLA), a poly(lactic acid) oligomer (OLA8) as plasticizer, and a sulfonic salt of a phthalic ester and poly(D-lactic acid) as nucleating agents were prepared by melt extrusion, following a Mixture Design approach, in order to systematically study mechanical and thermal properties as a function of composition. The full investigation was carried out by differential scanning calorimetry (DSC), dynamic mechanical thermal analysis (DMTA) and tensile tests. The crystallization half-time was also studied at 105 °C as a function of the blend composition. A range of compositions in which the plasticizer and the nucleating agent minimized the crystallization half-time in a synergistic way was clearly identified thanks to the application of the Mixture Design approach. The results also allowed the identification of a composition range that maximizes the crystallinity developed during rapid cooling below the glass transition temperature in injection moulding, thus allowing easier processing of PLA-based materials. Moreover, the mechanical properties were discussed by correlating them to the chemical structural features and thermal behaviour of the blends.

  10. Systematic study of RPC performances in polluted or varying gas mixtures compositions: an online monitor system for the RPC gas mixture at LHC

    CERN Document Server

    Capeans, M; Mandelli, B

    2012-01-01

    The correct gas mixture is fundamental to the correct and safe operation of the Resistive Plate Chamber (RPC) detector systems. A small change in the percentages of the gas mixture components can alter RPC performance, and this would affect the data quality in the ALICE, ATLAS and CMS experiments at CERN. Constant monitoring of the gas mixture injected into the RPCs would avoid such problems. A systematic study has been performed to understand RPC performance with several gas mixture compositions and in the presence of common gas impurities. The systematic analysis of several RPC performance parameters in different gas mixtures allows the rapid identification of any variation in the RPC gas mixture. A set-up for the online monitoring of the RPC gas mixture in the LHC gas systems is also proposed.

  11. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which, together with results from simulated data, highlights the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  12. Design -|+ Negative emotions for positive experiences

    NARCIS (Netherlands)

    Fokkinga, S.F.

    2015-01-01

    Experience-driven design considers all aspects of a product – its appearance, cultural meaning, functionality, interaction, usability, technology, and indirect consequences of use – with the aim to optimize and orchestrate all these aspects and create the best possible user experience. Since the

  13. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
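    Under the null hypothesis, the rank of a gene is uniform on {1, ..., n} independently in each of k replicates, so for small n and k the exact distribution of the rank product can be obtained by direct convolution. The Python sketch below is a brute-force illustration of that fact, not the paper's number-theoretic derivation.

      from collections import defaultdict
      from fractions import Fraction

      def rank_product_pmf(n, k):
          """Exact null pmf of the rank product of one gene across k replicates,
          with ranks independent and uniform on 1..n (n genes per replicate)."""
          pmf = {1: Fraction(1)}
          for _ in range(k):
              nxt = defaultdict(Fraction)
              for prod, p in pmf.items():
                  for r in range(1, n + 1):
                      nxt[prod * r] += p / n
              pmf = nxt
          return dict(pmf)

      pmf = rank_product_pmf(n=10, k=3)
      tail = sum(p for rp, p in pmf.items() if rp <= 8)   # exact P(RP <= 8)
      print(float(tail))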

  14. DEM Calibration Approach: design of experiment

    Science.gov (United States)

    Boikov, A. V.; Savelev, R. V.; Payor, V. A.

    2018-05-01

    The problem of calibrating DEM models is considered in this article. It is proposed to divide the models' input parameters into those that require iterative calibration and those that can be measured directly. A new method for model calibration, based on a design of experiments for the iteratively calibrated parameters, is proposed. The experiment is conducted using a specially designed stand. The results are processed with machine vision algorithms. Approximating functions are obtained, and the error of the implemented software and hardware complex is estimated. The prospects of the obtained results are discussed.

  15. Using Bayesian statistics for modeling PTSD through Latent Growth Mixture Modeling : implementation and discussion

    NARCIS (Netherlands)

    Depaoli, Sarah; van de Schoot, Rens; van Loey, Nancy; Sijbrandij, Marit

    2015-01-01

    BACKGROUND: After traumatic events, such as disaster, war trauma, and injuries including burns (which is the focus here), the risk to develop posttraumatic stress disorder (PTSD) is approximately 10% (Breslau & Davis, 1992). Latent Growth Mixture Modeling can be used to classify individuals into

  16. Design Optimization of a Micro-Combustor for Lean, Premixed Fuel-Air Mixtures

    Science.gov (United States)

    Powell, Leigh Theresa

    Present technology has been shifting towards miniaturization of devices for energy production for portable electronics. Micro-combustors, when incorporated into a micro-power generation system, provide the energy desired in the form of hot gases to power such technology. This creates the need for a design optimization of the micro-combustor in terms of geometry, fuel choice, and material selection. A total of five micro-combustor geometries, three fuels, and three materials were computationally simulated in different configurations in order to determine the optimal micro-combustor design for highest efficiency. Inlet velocity, equivalence ratio, and wall heat transfer coefficient were varied in order to test a comprehensive range of micro-combustor parameters. All simulations completed for the optimization study used ANSYS Fluent v16.1, and post-processing of the data was done in CFD Post v16.1. It was found that for lean, premixed fuel-air mixtures (φ = 0.6-0.9) ethane (C2H6) provided the highest flame temperatures when ignited within the micro-combustor geometries. An aluminum oxide converging micro-combustor burning ethane and air at an equivalence ratio of 0.9, an inlet velocity of 0.5 m/s, and a heat transfer coefficient of 5 W/m2·K was found to produce the highest combustor efficiency, making it the optimal choice for a micro-combustor design. It is proposed that this geometry be experimentally and computationally investigated further in order to determine if additional optimization can be achieved.

  17. Putting Priors in Mixture Density Mercer Kernels

    Science.gov (United States)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using predefined kernels. These data adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms, like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.

  18. Small Satellite Mechanical Design Experience

    OpenAIRE

    Meyers, Stewart

    1993-01-01

    The design approach used and the experience gained in the building of four small satellite payloads are explained. Specific recommendations are made, and the lessons learned on the SAMPEX program are detailed.

  19. Selective recovery of tagatose from mixtures with galactose by direct extraction with supercritical CO2 and different cosolvents.

    Science.gov (United States)

    Montañés, Fernando; Fornari, Tiziana; Martín-Alvarez, Pedro J; Corzo, Nieves; Olano, Agustin; Ibañez, Elena

    2006-10-18

    A selective fractionation method for carbohydrate mixtures of galactose/tagatose, using supercritical CO2 and isopropanol as cosolvent, has been evaluated. Optimization was carried out using a central composite face design, considering as factors the extraction pressure (from 100 to 300 bar), the extraction temperature (from 60 to 100 degrees C), and the modifier flow rate (from 0.2 to 0.4 mL/min, which corresponded to a total cosolvent percentage ranging from 4 to 18% vol). The responses evaluated were the amounts (milligrams) of tagatose and galactose extracted and their recoveries (percent). The statistical analysis of the results provided mathematical models for each response variable. The corresponding parameters were estimated by multiple linear regression, and high determination coefficients (>0.96) were obtained. The optimum conditions of the extraction process to obtain the maximum recovery of tagatose (37%) were 300 bar, 60 degrees C, and 0.4 mL/min of cosolvent. The predicted value was 24.37 mg of tagatose, whereas the experimental value was 26.34 mg, a 7% deviation from the predicted value. Cosolvent polarity effects on tagatose extraction from mixtures of galactose/tagatose were also studied using different alcohols and their mixtures with water. Although the total amount of carbohydrate extracted increased remarkably with polarity, the selectivity of tagatose extraction decreased as the polarity of the assayed cosolvents increased. To improve the recovery of extracted tagatose, additional experiments outside the experimental domain were carried out (300 bar, 80 degrees C, and 0.6 mL/min of isopropanol); recoveries >75% of tagatose with purity >90% were obtained.

  20. Graphene/TiO2/ZSM-5 composites synthesized by mixture design were used for photocatalytic degradation of oxytetracycline under visible light: Mechanism and biotoxicity

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Xin-Yan; Zhou, Kefu [College of the Environment and Ecology, Xiamen University, Xiamen (China); Chen, Bor-Yann, E-mail: boryannchen@yahoo.com.tw [Department of Chemical and Materials Engineering, National I-Lan University, Ilan, Taiwan (China); Chang, Chang-Tang, E-mail: ctchang73222@gmail.com [Department of Environmental Engineering, National I-Lan University, Ilan, Taiwan (China)

    2016-01-30

    Graphical abstract: The mechanism of OTC degradation can be described as follows. First, the OTC molecule is adsorbed onto the surface of the GTZ material. Conduction-band electrons (e⁻) and valence-band holes (h⁺) are generated when the aqueous GTZ suspension is irradiated with visible light. The generation of (e⁻/h⁺) pairs leads to the formation of reactive oxygen species. The ·OH radical and ·O₂⁻ can oxidize the OTC molecule, resulting in the degradation and mineralization of the organics. - Highlights: • Determine optimal composites of graphene, TiO2, and zeolite for maximal photodegradation efficiency via triangular mixture design. • Unravel the most promising composites with high stability and adsorptive capability for photocatalytic degradation. • Disclose time-series toxicity profiles of advanced oxidation process (AOP) treatment of wastewater. • Propose plausible routes for the mechanism of photocatalytic degradation of OTC. - Abstract: This first-attempt study used a mixture design of experiments to obtain the most promising composites of TiO2 loaded on zeolite and graphene for maximal photocatalytic degradation of oxytetracycline (OTC). The optimal weight ratio of graphene, titanium dioxide (TiO2) and zeolite was 1:8:1, as determined via a simplex-lattice mixture experimental design. The composite material was characterized by XRD, UV-vis, TEM and EDS analyses. The findings showed that the composite material had higher stability and stronger absorption of visible light. In addition, it was uniformly dispersed, with promising adsorption characteristics. OTC was used as a model toxicant to evaluate the photodegradation efficiency of the GTZ (1:8:1). At optimal operating conditions (i.e., pH 7 and 25 °C), complete degradation (ca. 100%) was achieved in 180 min. The biotoxicity of the degraded intermediates of OTC on cell growth of Escherichia coli DH5α was also assayed. After 180 min

  1. Sequential Estimation of Mixtures in Diffusion Networks

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Reichl, Jan; Djurić, P. M.

    2015-01-01

    Roč. 22, č. 2 (2015), s. 197-201 ISSN 1070-9908 R&D Projects: GA ČR(CZ) GP14-06678P Institutional support: RVO:67985556 Keywords: distributed estimation * mixture models * Bayesian inference Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.661, year: 2015 http://library.utia.cas.cz/separaty/2014/AS/dedecius-0431479.pdf

  2. Optimized Experiment Design for Marine Systems Identification

    DEFF Research Database (Denmark)

    Blanke, M.; Knudsen, Morten

    1999-01-01

    Simulation of manoeuvring and design of motion controls for marine systems require non-linear mathematical models, which often have more than one hundred parameters. Model identification is hence an extremely difficult task. This paper discusses experiment design for marine systems identification ... and proposes a sensitivity approach to solve the practical experiment design problem. The applicability of the sensitivity approach is demonstrated on a large non-linear model of surge, sway, roll and yaw of a ship. The use of the method is illustrated for a container-ship where both model and full-scale tests

  3. Advanced statistical analysis of Raman spectroscopic data for the identification of body fluid traces: semen and blood mixtures.

    Science.gov (United States)

    Sikirzhytski, Vitali; Sikirzhytskaya, Aliaksandra; Lednev, Igor K

    2012-10-10

    Conventional confirmatory biochemical tests used in the forensic analysis of body fluid traces found at a crime scene are destructive and not universal. Recently, we reported on the application of near-infrared (NIR) Raman microspectroscopy for non-destructive confirmatory identification of pure blood, saliva, semen, vaginal fluid and sweat. Here we expand the method to include dry mixtures of semen and blood. A classification algorithm was developed for differentiating pure body fluids and their mixtures. The classification methodology is based on an effective combination of Support Vector Machine (SVM) regression (data selection) and SVM Discriminant Analysis of preprocessed experimental Raman spectra collected using automatic mapping of the sample. Extensive cross-validation of the obtained results demonstrated that the detection limit of the minor contributor is as low as a few percent. The developed methodology can be further expanded to any binary mixture of complex solutions, including but not limited to mixtures of other body fluids. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Statistical learning from nonrecurrent experience with discrete input variables and recursive-error-minimization equations

    Science.gov (United States)

    Carter, Jeffrey R.; Simon, Wayne E.

    1990-08-01

    Neural networks are trained using Recursive Error Minimization (REM) equations to perform statistical classification. Using REM equations with continuous input variables reduces the required number of training experiences by factors of one to two orders of magnitude over standard back propagation. Replacing the continuous input variables with discrete binary representations reduces the number of connections by a factor proportional to the number of variables, reducing the required number of experiences by another order of magnitude. Undesirable effects of using recurrent experience to train neural networks for statistical classification problems are demonstrated, and nonrecurrent experience is used to avoid these undesirable effects. 1. THE 1-4I PROBLEM. The statistical classification problem which we address is that of assigning points in d-dimensional space to one of two classes. The first class has a covariance matrix of I (the identity matrix); the covariance matrix of the second class is 4I. For this reason the problem is known as the 1-4I problem. Both classes have equal probability of occurrence, and samples from both classes may appear anywhere throughout the d-dimensional space. Most samples near the origin of the coordinate system will be from the first class, while most samples away from the origin will be from the second class. Since the two classes completely overlap, it is impossible to have a classifier with zero error. The minimum possible error is known as the Bayes error and

  5. Iterative Mixture Component Pruning Algorithm for Gaussian Mixture PHD Filter

    Directory of Open Access Journals (Sweden)

    Xiaoxi Yan

    2014-01-01

    To cope with the increasing number of mixture components in the Gaussian mixture PHD filter, an iterative mixture component pruning algorithm is proposed. The pruning algorithm is based on maximizing the posterior probability density of the mixture weights. The entropy distribution of the mixture weights is adopted as the prior distribution of the mixture component parameters. The iterative update formulations of the mixture weights are derived using a Lagrange multiplier and the Lambert W function. Mixture components whose weights become negative during the iterative procedure are pruned by setting the corresponding mixture weights to zero. In addition, multiple mixture components with similar parameters describing the same PHD peak can be merged into one mixture component in the algorithm. Simulation results show that the proposed iterative mixture component pruning algorithm is superior to the typical pruning algorithm based on thresholds.
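    The paper's iteration (entropy prior, Lagrange multipliers, Lambert W function) is more involved; as a point of comparison, the Python sketch below implements only the simpler threshold-based pruning plus moment-preserving merging that such algorithms are measured against. All parameter values are made up.

      import numpy as np

      def prune_and_merge(w, mu, var, w_min=1e-3, merge_dist=1.0):
          """Threshold pruning + moment-preserving merging for a 1-D Gaussian
          mixture (PHD weights are left unnormalized on purpose)."""
          keep = w > w_min                      # prune negligible components
          w, mu, var = w[keep], mu[keep], var[keep]
          out_w, out_mu, out_var = [], [], []
          used = np.zeros(len(w), dtype=bool)
          for i in np.argsort(-w):              # start from the heaviest component
              if used[i]:
                  continue
              close = ~used & ((mu - mu[i]) ** 2 / var[i] < merge_dist)
              W = w[close].sum()
              m = (w[close] * mu[close]).sum() / W
              v = (w[close] * (var[close] + (mu[close] - m) ** 2)).sum() / W
              out_w.append(W); out_mu.append(m); out_var.append(v)
              used |= close
          return np.array(out_w), np.array(out_mu), np.array(out_var)

      w = np.array([0.5, 0.45, 0.05, 1e-4])     # two near-duplicates, one tiny
      mu = np.array([0.0, 0.1, 5.0, 9.0])
      var = np.ones(4)
      print(prune_and_merge(w, mu, var))        # -> 2 components remain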

  6. Designing Effective Undergraduate Research Experiences

    Science.gov (United States)

    Severson, S.

    2010-12-01

    I present a model for designing student research internships that is informed by the best practices of the Center for Adaptive Optics (CfAO) Professional Development Program. The dual strands of the CfAO education program include: the preparation of early-career scientists and engineers in effective teaching; and changing the learning experiences of students (e.g., undergraduate interns) through inquiry-based "teaching laboratories." This paper will focus on the carry-over of these ideas into the design of laboratory research internships such as the CfAO Mainland internship program as well as NSF REU (Research Experiences for Undergraduates) and senior-thesis or "capstone" research programs. Key ideas in maximizing student learning outcomes and generating productive research during internships include: defining explicit content, scientific process, and attitudinal goals for the project; assessment of student prior knowledge and experience, then following up with formative assessment throughout the project; setting reasonable goals with timetables and addressing motivation; and giving students ownership of the research by implementing aspects of the inquiry process within the internship.

  7. Affective loop experiences: designing for interactional embodiment.

    Science.gov (United States)

    Höök, Kristina

    2009-12-12

    Involving our corporeal bodies in interaction can create strong affective experiences. Systems that can both be influenced by and influence users corporeally exhibit a use quality we name an affective loop experience. In an affective loop experience, (i) emotions are seen as processes, constructed in the interaction, starting from everyday bodily, cognitive or social experiences; (ii) the system responds in ways that pull the user into the interaction, touching upon end users' physical experiences; and (iii) throughout the interaction the user is an active, meaning-making individual choosing how to express themselves; the interpretation responsibility does not lie with the system. We have built several systems that attempt to create affective loop experiences with more or less successful results. For example, eMoto lets users send text messages between mobile phones, but in addition to text, the messages also have colourful and animated shapes in the background chosen through emotion-gestures with a sensor-enabled stylus pen. Affective Diary is a digital diary with which users can scribble their notes, but it also allows for bodily memorabilia to be recorded from body sensors mapping to users' movement and arousal and placed along a timeline. Users can see patterns in their bodily reactions and relate them to various events going on in their lives. The experiences of building and deploying these systems gave us insights into design requirements for addressing affective loop experiences, such as how to design for turn-taking between user and system, how to create for 'open' surfaces in the design that can carry users' own meaning-making processes, how to combine modalities to create for a 'unity' of expression, and the importance of mirroring user experience in familiar ways that touch upon their everyday social and corporeal experiences. But a more important lesson gained from deploying the systems is how emotion processes are co-constructed and experienced

  8. Optimization of the Nonaqueous Capillary Electrophoresis Separation of Metal Ions Using Mixture Design and Response Surface Methods

    OpenAIRE

    DEMİR, Cevdet; YÜCEL, Yasin

    2014-01-01

    Mixture experimental design was used to enhance the separation selectivity of metal ions in nonaqueous capillary electrophoresis. The separation of cations (Ag, Fe, Cr, Mn, Cd, Co, Pb, Ni, Zn and Cu) was achieved using imidazole as UV co-ion for indirect detection. Acetic acid was chosen as an electrolyte because its cathodic electroosmotic flow permits faster separation. The composition of organic solvents is important to achieve the best separation of all metal ions. Simplex latt...

  9. The Impact of Statistical Leakage Models on Design Yield Estimation

    Directory of Open Access Journals (Sweden)

    Rouwaida Kanj

    2011-01-01

    Device mismatch and process variation models play a key role in determining the functionality and yield of sub-100 nm designs. Average characteristics are often of interest, such as the average leakage current or the average read delay. However, detecting rare functional fails is critical for memory design, and designers often seek techniques that enable such events to be modeled accurately. Extremely leaky devices can inflict functionality fails. The plurality of leaky devices on a bitline increases the dimensionality of the yield estimation problem. Simplified models are possible by adopting approximations to the underlying sum of lognormals. The implications of such approximations on tail probabilities may in turn bias the yield estimate. We review different closed-form approximations and compare them against the CDF matching method, which is shown to be the most effective method for accurate statistical leakage modeling.
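    One classical closed-form approximation for a sum of lognormal leakage contributions is Fenton-Wilkinson moment matching; the abstract's point is that such approximations can misstate exactly the tail that drives yield. A self-contained Python sketch with made-up device parameters, comparing the approximation's tail estimate against Monte Carlo:

      import numpy as np
      from scipy.stats import lognorm

      def fenton_wilkinson(mus, sigmas):
          """Moment-match a sum of independent lognormals by one lognormal."""
          mean = np.sum(np.exp(mus + sigmas**2 / 2))
          var = np.sum(np.exp(2 * mus + sigmas**2) * (np.exp(sigmas**2) - 1))
          s2 = np.log(1 + var / mean**2)        # matched log-variance
          m = np.log(mean) - s2 / 2             # matched log-mean
          return m, s2

      # Hypothetical per-device log-leakage parameters on a 16-cell bitline
      mus, sigmas = np.full(16, -2.0), np.full(16, 0.8)
      m, s2 = fenton_wilkinson(mus, sigmas)

      rng = np.random.default_rng(0)
      total = np.exp(rng.normal(mus, sigmas, size=(200_000, 16))).sum(axis=1)
      thr = np.quantile(total, 0.999)           # Monte Carlo 99.9th percentile
      approx_tail = 1 - lognorm.cdf(thr, s=np.sqrt(s2), scale=np.exp(m))
      print(f"approx tail {approx_tail:.2e} vs Monte Carlo 1.00e-03")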

  10. Combined treatment of Thymus vulgaris L., Rosmarinus officinalis L. and Myrtus communis L. essential oils against Salmonella typhimurium: Optimization of antibacterial activity by mixture design methodology.

    Science.gov (United States)

    Fadil, Mouhcine; Fikri-Benbrahim, Kawtar; Rachiq, Saad; Ihssane, Bouchaib; Lebrazi, Sara; Chraibi, Marwa; Haloui, Taoufik; Farah, Abdellah

    2018-05-01

    To increase the susceptibility of a Salmonella typhimurium strain, a mixture of Thymus vulgaris L. (T. vulgaris L.), Rosmarinus officinalis L. (R. officinalis L.) and Myrtus communis L. (M. communis L.) essential oils (EOs) was used in a combined treatment based on experimental design methodology (mixture design). The chemical composition of the EOs was first identified by GC and GC/MS, and their antibacterial activity was evaluated. The results of this first step showed that thymol and borneol were the major compounds in the T. vulgaris L. and M. communis L. EOs, respectively, while 1,8-cineole and α-pinene were found as the major compounds in R. officinalis L. The same results showed strong antibacterial activity for the T. vulgaris L. EO, followed by considerable activity for the M. communis L. EO, against moderate activity for the R. officinalis L. EO. Furthermore, 1/20 (v/v) was the concentration giving a strain response classified as sensitive. From this concentration, the mixture design was performed and analyzed. The optimization of the mixtures' antibacterial activities highlighted the synergistic effect between the T. vulgaris L. and M. communis L. essential oils. A formulation comprising 55% of T. vulgaris L. and 45% of M. communis L. essential oils, respectively, can be considered for increasing the susceptibility of Salmonella typhimurium. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Adaptive nonparametric Bayesian inference using location-scale mixture priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2010-01-01

    We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if

  12. Dipolar oscillations in a quantum degenerate Fermi-Bose atomic mixture

    International Nuclear Information System (INIS)

    Ferlaino, F; Brecha, R J; Hannaford, P; Riboli, F; Roati, G; Modugno, G; Inguscio, M

    2003-01-01

    We study the dynamics of coupled dipolar oscillations in a Fermi-Bose mixture of 40 K and 87 Rb atoms. This low-energy collective mode is strongly affected by the interspecies interactions. Measurements are performed in the classical and quantum degenerate regimes and reveal the crucial role of the statistical properties of the mixture. At the onset of quantum degeneracy, we investigate the role of Pauli blocking and superfluidity for K and Rb atoms, respectively, resulting in a change in the collisional interactions

  13. A nonlinear isobologram model with Box-Cox transformation to both sides for chemical mixtures.

    Science.gov (United States)

    Chen, D G; Pounds, J G

    1998-12-01

    The linear logistic isobologram is a commonly used and powerful graphical and statistical tool for analyzing the combined effects of simple chemical mixtures. In this paper a nonlinear isobologram model is proposed to analyze the joint action of chemical mixtures for quantitative dose-response relationships. This nonlinear isobologram model incorporates two additional new parameters, Ymin and Ymax, to facilitate analysis of response data that are not constrained between 0 and 1, where Ymin and Ymax represent the minimal and maximal observed toxic responses. This nonlinear isobologram model for binary mixtures can be expressed as [formula: see text]. In addition, a Box-Cox transformation to both sides is introduced to improve the goodness of fit and to provide a more robust model for achieving homogeneity and normality of the residuals. Finally, a confidence band is proposed for selected isobols, e.g., the median effective dose, to facilitate graphical and statistical analysis of the isobologram. The versatility of this approach is demonstrated using published data describing the toxicity of binary mixtures of citrinin and ochratoxin, as well as new experimental data from our laboratory for mixtures of mercury and cadmium.

  14. UX, XD & UXD. User Experience, Experience Design og User Experience Design. 8 paradokser - og 8 forsøg på (op)løsninger. Mod fælles forståelser og definitioner

    DEFF Research Database (Denmark)

    Jensen, Jens F.

    experience, experience design and user experience design. These concepts are related and in some contexts closely interwoven, yet they also have separate meanings. In the context of this publication we will speak of user experience, experience design and user experience design both as a combined field and of the

  15. The use of particle packing models to design ecological concrete

    NARCIS (Netherlands)

    Fennis, S.A.A.M.; Walraven, J.C.; Den Uijl, J.A.

    2009-01-01

    Ecological concrete can be designed by replacing cement with fillers. With low amounts of cement it becomes increasingly important to control the water demand of concrete mixtures. In this paper a cyclic design method based on particle packing is presented and evaluated on the basis of experiments

  16. Low temperature rheological properties of asphalt mixtures containing different recycled asphalt materials

    Directory of Open Access Journals (Sweden)

    Ki Hoon Moon

    2017-01-01

    Reclaimed Asphalt Pavement (RAP) and Recycled Asphalt Shingles (RAS) are valuable materials commonly reused in asphalt mixtures due to their economic and environmental benefits. However, the aged binder contained in these materials may negatively affect the low temperature performance of asphalt mixtures. In this paper, the effect of RAP and RAS on the low temperature properties of asphalt mixtures is investigated through Bending Beam Rheometer (BBR) tests and rheological modeling. First, a set of fourteen asphalt mixtures containing RAP and RAS is prepared, and creep stiffness and m-value are experimentally measured. Then, thermal stress is calculated and compared graphically and statistically. The Huet model and the Shift-Homothety-Shift in time-Shift (SHStS) transformation, developed at the École Nationale des Travaux Publics de l'État (ENTPE), are used to back-calculate the asphalt binder creep stiffness from the mixture experimental data. Finally, the model predictions are compared to the creep stiffness of the asphalt binders extracted from each mixture, and the results are analyzed and discussed. It is found that additions of RAP and RAS beyond 15% and 3%, respectively, significantly change the low temperature properties of asphalt mixtures. Differences between back-calculated results and experimental data suggest that blending between new and old binder occurs only partially. Based on recent findings from diffusion studies, this effect may be associated with mixing and blending processes, with the effective contact between virgin and recycled materials, and with the variation of the total virgin-recycled thickness of the binder film, which may significantly influence the diffusion process. Keywords: Reclaimed Asphalt Pavement (RAP), Recycled Asphalt Shingles (RAS), Thermal stress, Statistical comparison, Back-calculation, Binder blending

  17. Application of mixture experimental design in formulation and characterization of solid self-nanoemulsifying drug delivery systems containing carbamazepine

    OpenAIRE

    Krstić Marko Z.; Ibrić Svetlana R.

    2016-01-01

    One of the problems with orally administered drugs is their poor solubility, which can be overcome by creating solid self-nanoemulsifying drug delivery systems (SNEDDS). The aim was to choose an appropriate SNEDDS using mixture design and to adsorb the SNEDDS onto a solid carrier in order to improve the dissolution rate of carbamazepine. Self-emulsifying drug delivery systems (SEDDS) consisting of an oil phase (caprylic/capric triglycerides), a surfactant (Polysorbate 80 and Labrasol® (1:

  18. Modeling and experiment to threshing unit of stripper combine ...

    African Journals Online (AJOL)

    Modeling and experiment to threshing unit of stripper combine. ... were conducted with different feed rates and drum rotor speeds for the rice-stripped mixtures. ... and damage as well as for threshing unit design and process optimization.

  19. iCFD: Interpreted Computational Fluid Dynamics – Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design – The secondary clarifier

    DEFF Research Database (Denmark)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat

    2015-01-01

    using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor...... of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization...

  20. Research on the mechanical characteristic of the bentonite mixture material under the groundwater environment of Horonobe. 2

    International Nuclear Information System (INIS)

    Takaji, Kazuhiko; Shigeno, Yoshimasa; Simogouchi, Takafumi

    2005-02-01

    In the Horonobe underground research project, various in-situ experiments are conducted in order to confirm the applicability of the Engineered Barrier System (EBS) design techniques shown in the H12 report, to understand the long-term effects of the EBS, and to improve the reliability of the prediction method. Moreover, since the surroundings of the Horonobe underground research laboratory are assumed to be a saline water environment, understanding the mechanical behavior of the bentonite mixture material under saline water conditions is important because it influences the design of the in-situ experiments. In this study, unconfined compression tests, consolidated-undrained triaxial compression tests and long-term consolidation tests of the bentonite mixture material were performed using groundwater extracted near the Horonobe underground research laboratory, and simulation analyses of the EBS over time were carried out using the results of the laboratory experiments. Consequently, although the compressive strength and the elastic modulus under the saline water environment declined compared with fresh water, neither the shear deformation behavior under triaxial stress conditions nor the volume deformation behavior in the consolidation tests showed an appreciable difference, and it was suggested that the saline water is unlikely to have a serious mechanical influence, including on long-term mechanical behavior. (author)

  1. Efficient Discovery of Novel Multicomponent Mixtures for Hydrogen Storage: A Combined Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Wolverton, Christopher [Northwestern Univ., Evanston, IL (United States). Dept. of Materials Science and Engineering; Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Dept. of Materials Science and Engineering; Kung, Harold H. [Northwestern Univ., Evanston, IL (United States). Dept. of Chemical and Biological Engineering; Yang, Jun [Ford Scientific Research Lab., Dearborn, MI (United States); Hwang, Sonjong [California Inst. of Technology (CalTech), Pasadena, CA (United States). Dept. of Chemistry and Chemical Engineering; Shore, Sheldon [The Ohio State Univ., Columbus, OH (United States). Dept. of Chemistry and Biochemistry

    2016-11-28

    The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage that enable meeting the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g. LiNH2+NH3BH3] and nitrogen-hydrogen based borohydrides [e.g. Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics, using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility. And

  2. Safety Research Experiment Facility project. Conceptual design report. Volume IX. Experiment handling

    International Nuclear Information System (INIS)

    1975-01-01

    Information on the SAREF Reactor experiment handling system is presented concerning functions and design requirements, design description, operation, casualty events and recovery procedures, and maintenance

  3. Evaluation of Thermodynamic Models for Predicting Phase Equilibria of CO2 + Impurity Binary Mixture

    Science.gov (United States)

    Shin, Byeong Soo; Rho, Won Gu; You, Seong-Sik; Kang, Jeong Won; Lee, Chul Soo

    2018-03-01

    For the design and operation of CO2 capture and storage (CCS) processes, equation of state (EoS) models are used for phase equilibrium calculations. Reliability of an EoS model plays a crucial role, and many variations of EoS models have been reported and continue to be published. The prediction of phase equilibria for CO2 mixtures containing SO2, N2, NO, H2, O2, CH4, H2S, Ar, and H2O is important for CO2 transportation because the captured gas normally contains small amounts of impurities even though it is purified in advance. For the design of pipelines in deep sea or arctic conditions, flow assurance and safety are considered priority issues, and highly reliable calculations are required. In this work, predictive Soave-Redlich-Kwong, cubic plus association, Groupe Européen de Recherches Gazières (GERG-2008), perturbed-chain statistical associating fluid theory, and non-random lattice fluids hydrogen bond EoS models were compared regarding performance in calculating phase equilibria of CO2-impurity binary mixtures and with the collected literature data. No single EoS could cover the entire range of systems considered in this study. Weaknesses and strong points of each EoS model were analyzed, and recommendations are given as guidelines for safe design and operation of CCS processes.
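    For a flavor of the calculations such comparisons involve, the Python sketch below solves the Soave-Redlich-Kwong cubic for the compressibility factor of pure CO2. This is far simpler than the multiparameter models compared in the paper (e.g., GERG-2008, PC-SAFT); the critical constants are standard literature values and the conditions are illustrative.

      import numpy as np

      R = 8.314  # J/(mol K)

      def srk_Z(T, P, Tc, Pc, omega):
          """Compressibility factor(s) from the Soave-Redlich-Kwong EoS (pure fluid)."""
          m = 0.480 + 1.574 * omega - 0.176 * omega**2
          a = 0.42748 * R**2 * Tc**2 / Pc * (1 + m * (1 - np.sqrt(T / Tc)))**2
          b = 0.08664 * R * Tc / Pc
          A, B = a * P / (R * T)**2, b * P / (R * T)
          roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
          real = roots.real[np.abs(roots.imag) < 1e-9]
          return np.sort(real[real > 0])   # smallest ~ liquid, largest ~ vapor

      # CO2 near pipeline conditions (Tc = 304.13 K, Pc = 7.377 MPa, omega = 0.2239)
      print(srk_Z(T=280.0, P=5.0e6, Tc=304.13, Pc=7.377e6, omega=0.2239))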

  4. Statistical controversies in clinical research: requiem for the 3 + 3 design for phase I trials.

    Science.gov (United States)

    Paoletti, X; Ezzalfani, M; Le Tourneau, C

    2015-09-01

    More than 95% of published phase I trials have used the 3 + 3 design to identify the dose to be recommended for phase II trials. However, the statistical community agrees on the limitations of the 3 + 3 design compared with model-based approaches. Moreover, the mechanisms of action of targeted agents strongly challenge the hypothesis that the maximum tolerated dose constitutes the optimal dose, and more outcomes including clinical and biological activity increasingly need to be taken into account to identify the optimal dose. We review key elements from clinical publications and from the statistical literature to show that the 3 + 3 design lacks the necessary flexibility to address the challenges of targeted agents. The design issues raised by expansion cohorts, new definitions of dose-limiting toxicity and trials of combinations are not easily addressed by the 3 + 3 design or its extensions. Alternative statistical proposals have been developed to make a better use of the complex data generated by phase I trials. Their applications require a close collaboration between all actors of early phase clinical trials. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
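    The behavior criticized above is easy to reproduce by simulation. The Python sketch below implements one common, simplified variant of the 3 + 3 rule (no MTD-confirmation cohort) with hypothetical dose-limiting-toxicity (DLT) probabilities; running it shows how dispersed the recommended dose is.

      import numpy as np

      def three_plus_three(true_tox, rng):
          """Simulate one 3+3 trial; return index of recommended dose (-1 = none)."""
          d = 0
          while True:
              dlt = rng.binomial(3, true_tox[d])          # first cohort of 3
              if dlt >= 2:
                  return d - 1                            # de-escalate and stop
              if dlt == 1:
                  dlt += rng.binomial(3, true_tox[d])     # expand to 6 patients
                  if dlt >= 2:
                      return d - 1
              if d == len(true_tox) - 1:                  # top dose tolerated
                  return d
              d += 1

      rng = np.random.default_rng(7)
      true_tox = [0.05, 0.10, 0.20, 0.35, 0.50]           # hypothetical DLT rates
      picks = np.array([three_plus_three(true_tox, rng) for _ in range(10_000)])
      # index 0 = "no dose declared", then doses 1..5
      print(np.bincount(picks + 1, minlength=len(true_tox) + 1) / 10_000)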

  5. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies

    DEFF Research Database (Denmark)

    Thompson, Wesley K.; Wang, Yunpeng; Schork, Andrew J.

    2015-01-01

    genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via... analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While... minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local
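    The paper fits its mixture by minimizing discrepancies with nonparametric estimates; as a simpler illustrative stand-in, the Python sketch below fits the same two-component scale mixture of zero-mean normals to simulated z-scores by EM. All values are synthetic.

      import numpy as np
      from scipy.stats import norm

      def em_scale_mixture(z, n_iter=200):
          """EM for z ~ (1-pi1)*N(0,1) + pi1*N(0,s2), s2 > 1 (null vs non-null)."""
          pi1, s2 = 0.1, 4.0
          for _ in range(n_iter):
              f0 = (1 - pi1) * norm.pdf(z, 0, 1)
              f1 = pi1 * norm.pdf(z, 0, np.sqrt(s2))
              r = f1 / (f0 + f1)                   # E-step: non-null responsibility
              pi1 = r.mean()                       # M-step: mixing proportion
              s2 = max((r * z**2).sum() / r.sum(), 1.0 + 1e-6)  # non-null variance
          return pi1, s2

      rng = np.random.default_rng(2)
      z = np.where(rng.random(20_000) < 0.05,
                   rng.normal(0, 3.0, 20_000),      # 5% non-null, variance 9
                   rng.normal(0, 1.0, 20_000))      # 95% null
      print(em_scale_mixture(z))                    # roughly (0.05, 9.0)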

  6. White Noise Assumptions Revisited : Regression Models and Statistical Designs for Simulation Practice

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

    Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these

  7. SYSTEMATIC DESIGNING IN ARCHITECTURAL EDUCATION: AN EXPERIENCE OF HOSPITAL DESIGN

    Directory of Open Access Journals (Sweden)

    Dicle AYDIN

    2010-07-01

    Architectural design is defined as a decision-making process. Design studios play an important role in experiencing this process and provide prospective architects with competence in design. Instructors of architecture aim to challenge students' imagination, develop creative thinking, and raise students' awareness of their own abilities. Furthermore, studio instructors pay attention to the constraining elements of design in order to build students' competence in problem solving. Each experience in the education period prepares prospective architects for the social environment and the realities of the future. The aim of this study is to examine one such practice in architectural education. The general hospital project was carried out with 40 students and 4 project instructors within the Studio-7 courses of the 2007-2008 academic year spring semester. The steps followed in the studio process were analyzed for the design problem of a "hospital". Evaluations were performed on the solution of the functional-spatial organization, solutions for the activities of the users, conformance with standards and regulations, and comfort-aesthetic notions in interior space. The prospective architects were generally successful in the design of a hospital building with complex functions. This experience raised awareness about accessing information via thinking and about the provision of a new position for information in each concept.

  8. Optimal experiment design for magnetic resonance fingerprinting.

    Science.gov (United States)

    Bo Zhao; Haldar, Justin P; Setsompop, Kawin; Wald, Lawrence L

    2016-08-01

    Magnetic resonance (MR) fingerprinting is an emerging quantitative MR imaging technique that simultaneously acquires multiple tissue parameters in an efficient experiment. In this work, we present an estimation-theoretic framework to evaluate and design MR fingerprinting experiments. More specifically, we derive the Cramér-Rao bound (CRB), a lower bound on the covariance of any unbiased estimator, to characterize parameter estimation for MR fingerprinting. We then formulate an optimal experiment design problem based on the CRB to choose a set of acquisition parameters (e.g., flip angles and/or repetition times) that maximizes the signal-to-noise ratio efficiency of the resulting experiment. The utility of the proposed approach is validated by numerical studies. Representative results demonstrate that the optimized experiments allow for substantial reduction in the length of an MR fingerprinting acquisition, and substantial improvement in parameter estimation performance.
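    The CRB machinery is generic: for a signal model with additive white Gaussian noise, the Fisher information is J^T J / sigma^2, with J the Jacobian of the signal with respect to the parameters. A minimal Python sketch with a toy two-parameter signal (not an actual MR fingerprinting sequence) follows.

      import numpy as np

      def crb(signal, theta, sigma=1.0, eps=1e-6):
          """Cramer-Rao bound for unbiased estimates of theta from
          y = signal(theta) + white Gaussian noise of standard deviation sigma."""
          theta = np.asarray(theta, dtype=float)
          s0 = signal(theta)
          J = np.empty((s0.size, theta.size))
          for k in range(theta.size):           # numerical Jacobian of the model
              dt = np.zeros_like(theta); dt[k] = eps
              J[:, k] = (signal(theta + dt) - s0) / eps
          fim = J.T @ J / sigma**2              # Fisher information (Gaussian noise)
          return np.linalg.inv(fim)             # CRB: covariance lower bound

      # Toy "fingerprint": exponential recovery sampled at fixed times,
      # parameters theta = (amplitude, relaxation rate); values hypothetical
      t = np.linspace(0.1, 5.0, 50)
      model = lambda th: th[0] * (1 - np.exp(-th[1] * t))
      print(np.sqrt(np.diag(crb(model, [1.0, 0.8], sigma=0.05))))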

  9. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework, including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable, and in this paper we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout; however, artefacts like dropout and drop-in are commonly encountered in mixtures, and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling, and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.
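
    The inequality relating the two statistics can be checked directly once the LR is treated as a random variable. The sketch below (an arbitrary discrete toy, not a forensic model) draws evidence distributions under the two hypotheses and verifies both E[LR | non-donor] = 1 and E[LR | donor] >= 1/RMNE, with RMNE taken as the probability that a non-donor yields LR > 0.

    ```python
    # Toy check of the LR/RMNE relationship on a discrete evidence space.
    import numpy as np

    rng = np.random.default_rng(2)
    k = 12                                       # evidence outcomes
    p_donor = rng.dirichlet(np.ones(k))          # P(E | donor)
    p_nondonor = rng.dirichlet(np.ones(k))       # P(E | non-donor)
    p_donor[:3] = 0.0                            # outcomes excluding donors
    p_donor /= p_donor.sum()

    lr = p_donor / p_nondonor                    # LR for each outcome
    rmne = p_nondonor[lr > 0].sum()              # P(not excluded | non-donor)
    e_lr_donor = (p_donor * lr).sum()            # E[LR | donor]
    e_lr_nondonor = (p_nondonor * lr).sum()      # always exactly 1

    print(f"RMNE={rmne:.3f}  E[LR|donor]={e_lr_donor:.2f}  "
          f"1/RMNE={1 / rmne:.2f}  E[LR|non-donor]={e_lr_nondonor:.2f}")
    assert e_lr_donor >= 1 / rmne
    ```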

  10. Electron drift velocities of Ar-CO2-CF4 gas mixtures

    International Nuclear Information System (INIS)

    Markeloff, R.

    1994-11-01

    The muon spectrometer for the D0 experiment at Fermi National Accelerator Laboratory uses proportional drift tubes filled with an Ar-CO2-CF4 gas mixture. Measurements of drift velocity as a function of electric field magnitude are presented for 90%-5%-5% and 90%-4%-6% Ar-CO2-CF4 mixtures, and our operational experience with these gases at D0 is discussed

  11. Directions for new developments on statistical design and analysis of small population group trials.

    Science.gov (United States)

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies when the sample size is unavoidably small; such settings are usually termed small populations. The specific sample-size cut-off at which the standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for the design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for the design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for the design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or from specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials.

  12. Text document classification based on mixture models

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Malík, Antonín

    2004-01-01

    Vol. 40, No. 3 (2004), pp. 293-304, ISSN 0023-5954. R&D Projects: GA AV ČR IAA2075302; GA ČR GA102/03/0049; GA AV ČR KSK1019101. Institutional research plan: CEZ:AV0Z1075907. Keywords: text classification * text categorization * multinomial mixture model. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.224, year: 2004
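
    The record gives only metadata, but the keyword "multinomial mixture model" names a standard construction; here is a minimal, hypothetical EM sketch for clustering word-count vectors with such a model (toy data, our own notation):

    ```python
    # Sketch: EM for a multinomial mixture over document word counts.
    import numpy as np

    def multinomial_mixture_em(X, k, n_iter=50, seed=0):
        """X: (docs, vocab) count matrix; returns mixing weights pi and
        per-component word distributions theta."""
        rng = np.random.default_rng(seed)
        n, v = X.shape
        pi = np.full(k, 1.0 / k)
        theta = rng.dirichlet(np.ones(v), size=k)
        for _ in range(n_iter):
            # E-step: responsibilities from log P(doc | component)
            log_r = np.log(pi) + X @ np.log(theta).T
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: reestimate weights and word probabilities
            pi = r.mean(axis=0)
            theta = r.T @ X + 1e-9               # smooth to avoid log(0)
            theta /= theta.sum(axis=1, keepdims=True)
        return pi, theta

    rng = np.random.default_rng(1)
    topics = np.array([[.6, .3, .05, .05], [.05, .05, .3, .6]])
    X = np.vstack([rng.multinomial(30, topics[i % 2]) for i in range(100)])
    pi, theta = multinomial_mixture_em(X, k=2)
    print(pi.round(2))                           # close to [0.5, 0.5]
    ```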

  13. The influence of narrative v. statistical information on perceiving vaccination risks.

    Science.gov (United States)

    Betsch, Cornelia; Ulshöfer, Corina; Renkewitz, Frank; Betsch, Tilmann

    2011-01-01

    Health-related information found on the Internet is increasing and impacts patient decision making, e.g. regarding vaccination decisions. In addition to statistical information (e.g. incidence rates of vaccine adverse events), narrative information is also widely available, such as postings on online bulletin boards. Previous research has shown that narrative information can impact treatment decisions, even when statistical information is presented concurrently. As the determinants of this effect are largely unknown, we vary features of the narratives to identify mechanisms through which narratives impact risk judgments. An online bulletin board setting provided participants with statistical information and authentic narratives about the occurrence and nonoccurrence of adverse events. Experiment 1 followed a single factorial design with 1, 2, or 4 narratives out of 10 reporting adverse events. Experiment 2 implemented a 2 (statistical risk 20% vs. 40%) × 2 (2/10 vs. 4/10 narratives reporting adverse events) × 2 (high vs. low richness) × 2 (high vs. low emotionality) between-subjects design. Dependent variables were the perceived risk of side-effects and vaccination intentions. Experiment 1 showed an inverse relation between the number of narratives reporting adverse events and vaccination intentions, which was mediated by the perceived risk of vaccinating. Experiment 2 showed a stronger influence of the number of narratives than of the statistical risk information. Narratives high (vs. low) in emotionality had a greater impact on the perceived risk, while richness had no effect. Thus, the number of narratives influences risk judgments and can potentially override statistical information about risk.

  14. Statistics for experimentalists

    CERN Document Server

    Cooper, B E

    2014-01-01

    Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...

  15. The effect of non-condensable gas on direct contact condensation of steam/air mixture

    International Nuclear Information System (INIS)

    Lee, H. C.; Park, S. K.; Kim, M. H.

    1998-01-01

    To investigate the effects of noncondensable gas on the direct contact film condensation of a vapor mixture, a series of experiments was carried out. A rectangular duct inclined 87° to the horizontal plane was used for the experiments. The average heat transfer coefficient of the steam-air mixture was obtained at atmospheric pressure for the four main parameters influencing the condensation heat transfer coefficient: air mass fraction, vapor velocity, film Reynolds number, and the degree of water film subcooling. From the analysis of 88 experimental cases, a correlation for the average Nusselt number for direct contact film condensation of a steam-air mixture on a vertical wall is proposed as a function of the film Reynolds number, the mixture Reynolds number, the air mass fraction, and the Jacob number. The average heat transfer coefficient for steam-air mixture condensation decreased significantly as the air mass fraction increased at the same inlet mixture velocity and inlet film temperature. The average heat transfer coefficients also decreased with an increasing degree of film subcooling and were scarcely affected by the film Reynolds number below a mixture Reynolds number of about 30,000
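
    The record names the dimensionless groups but not the fitted constants, so the sketch below only shows how such a power-law correlation is evaluated; the coefficient and exponents are placeholders, not the values fitted to the 88 experimental cases.

    ```python
    # Hypothetical power-law correlation of the reported form:
    # Nu = C * Re_film**a * Re_mix**b * W_air**c * Ja**d.
    def average_nusselt(re_film, re_mix, w_air, ja,
                        C=1.0, a=0.2, b=0.5, c=-0.8, d=-0.3):
        # Negative exponents on the air mass fraction and Jacob number
        # encode the reported trends: more air or stronger subcooling
        # lowers the condensation heat transfer.
        return C * re_film ** a * re_mix ** b * w_air ** c * ja ** d

    print(average_nusselt(re_film=500, re_mix=25_000, w_air=0.2, ja=0.05))
    ```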

  16. Performance Analysis of Joule-Thomson Cooler Supplied with Gas Mixtures

    Science.gov (United States)

    Piotrowska, A.; Chorowski, M.; Dorosz, P.

    2017-02-01

    Joule-Thomson (J-T) cryo-coolers working in closed cycles and supplied with gas mixtures are the subject of intensive research in different laboratories. Replacing pure nitrogen with nitrogen-hydrocarbon mixtures improves both the thermodynamic parameters and the economy of the refrigerators: high pressures in the heat exchanger can be avoided, and a standard refrigeration compressor can be used instead of gas bottles or a high-pressure oil-free compressor. A closed-cycle, mixture-filled Joule-Thomson cryogenic refrigerator providing 10-20 W of cooling power in the temperature range 90-100 K has been designed and manufactured. A thermodynamic analysis, including optimization of the cryo-cooler mixture, was performed with the ASPEN HYSYS software. The paper describes the design of the cryo-cooler and provides a thermodynamic analysis of the system. The test results are presented and discussed.
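
    Why the working fluid matters can be seen from even a first-order estimate of the Joule-Thomson coefficient. The sketch below uses the textbook van der Waals approximation mu_JT ≈ (2a/(RT) - b)/Cp to compare nitrogen with a hydrocarbon; this is generic theory for illustration, not the ASPEN HYSYS mixture optimization performed in the paper.

    ```python
    # Van der Waals estimate of the Joule-Thomson coefficient.
    R = 8.314                                    # J/(mol*K)

    def mu_jt(a, b, cp, temp):
        """Approximate J-T coefficient in K/Pa."""
        return (2 * a / (R * temp) - b) / cp

    # van der Waals constants in SI units (a: Pa*m^6/mol^2, b: m^3/mol)
    fluids = {
        "N2":  dict(a=0.1370, b=3.87e-5, cp=29.1),
        "CH4": dict(a=0.2283, b=4.28e-5, cp=35.7),
    }
    for name, f in fluids.items():
        # Temperature drop in K per bar of isenthalpic expansion at 300 K
        print(name, round(mu_jt(f["a"], f["b"], f["cp"], 300.0) * 1e5, 3))
    ```

    The hydrocarbon's larger attractive term gives it the larger coefficient, which is one reason nitrogen-hydrocarbon mixtures can outperform pure nitrogen at moderate pressures.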

  17. Interim Service ISDN Satellite (ISIS) hardware experiment design for advanced ISDN satellite design and experiments

    Science.gov (United States)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Design for Advanced Satellite Designs describes the design of the ISDN Satellite Terminal Adapter (ISTA), which is capable of translating ISDN protocol traffic into time division multiple access (TDMA) signals for use by a communications satellite. The ISTA connects to the Type 1 Network Termination (NT1) via the U-interface on the line-termination side of the CPE and converts it to the V.35 interface for the satellite uplink. With a simple switch setting, the same ISTA performs the conversion in the opposite direction, from V.35 to U-interface data.

  18. Infinite von Mises-Fisher Mixture Modeling of Whole Brain fMRI Data

    DEFF Research Database (Denmark)

    Røge, Rasmus; Madsen, Kristoffer Hougaard; Schmidt, Mikkel Nørgaard

    2017-01-01

    Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises-Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling. Comparing the vMF and gaussian mixture models on synthetic data, we demonstrate that the vMF model has a slight advantage inferring the true underlying clustering when compared to gaussian-based models on data generated from both a mixture of vMFs and a mixture of gaussians......
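
    Two standard building blocks of such a model can be sketched compactly: the vMF log-density on the unit hypersphere and the widely used Banerjee et al. closed-form approximation to the concentration MLE. This is a generic illustration (our code), not the collapsed MCMC sampler presented in the letter.

    ```python
    # vMF log-density and approximate concentration estimate.
    import numpy as np
    from scipy.special import ive               # exponentially scaled I_v

    def vmf_logpdf(x, mu, kappa):
        """x, mu: unit vectors of dimension p."""
        p = len(mu)
        # log C_p(kappa) via ive for numerical stability:
        # C_p(k) = k**(p/2-1) / ((2*pi)**(p/2) * I_{p/2-1}(k))
        log_c = ((p / 2 - 1) * np.log(kappa)
                 - (p / 2) * np.log(2 * np.pi)
                 - np.log(ive(p / 2 - 1, kappa)) - kappa)
        return log_c + kappa * mu @ x

    def kappa_hat(X):
        """Banerjee et al. approximation from unit vectors X, shape (n, p)."""
        p = X.shape[1]
        r_bar = np.linalg.norm(X.mean(axis=0))
        return r_bar * (p - r_bar ** 2) / (1 - r_bar ** 2)

    rng = np.random.default_rng(0)
    X = rng.normal([5.0, 0.0, 0.0], 1.0, size=(1000, 3))
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # standardize to sphere
    print(kappa_hat(X))                            # concentration estimate
    ```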

  19. Design issues in toxicogenomics using DNA microarray experiment

    International Nuclear Information System (INIS)

    Lee, Kyoung-Mu; Kim, Ju-Han; Kang, Daehee

    2005-01-01

    The methods of toxicogenomics can be classified into omics studies (e.g., genomics, proteomics, and metabolomics) and population studies focusing on risk assessment and gene-environment interaction. In omics studies, the microarray is the most popular approach. Up to 20,000 genes falling into several categories (e.g., xenobiotic metabolism, cell cycle control, DNA repair, etc.) can be selected according to an a priori hypothesis. The appropriate type of samples and species should be selected in advance. Multiple doses and varied exposure durations are suggested to identify those genes clearly linked to the toxic response. Microarray experiments can be affected by numerous nuisance variables, including the experimental design, sample extraction, type of scanner, etc. The number of slides may be determined from the magnitude and variance of the expression change, the false-positive rate, and the desired power; pooling samples is an alternative. Online databases on chemicals with known exposure-disease outcomes and genetic information can aid the interpretation of the normalized results. Gene function can be inferred from microarray data analyzed by bioinformatics methods such as cluster analysis. Population studies often adopt a hospital-based or nested case-control design. Biases in subject selection and exposure assessment should be minimized, and confounding should be controlled for in stratified or multiple regression analyses. Optimal sample sizes depend on the statistical test for gene-to-environment or gene-to-gene interaction. The design issues addressed in this mini-review are crucial in conducting toxicogenomics studies. In addition, an integrative approach combining exposure assessment, epidemiology, and clinical trials is required
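
    For the slide-number question, the usual two-group formula driven by the expected expression change, its variance, the accepted false-positive rate, and the desired power can be sketched as follows (illustrative inputs; the function name is ours):

    ```python
    # Hypothetical sample-size calculation for a two-condition comparison.
    from scipy.stats import norm

    def slides_per_group(delta, sigma, alpha=0.001, power=0.9):
        """delta: mean log2 fold change to detect; sigma: per-gene SD.
        alpha is set small because thousands of genes are tested."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return 2 * (z * sigma / delta) ** 2

    print(slides_per_group(delta=1.0, sigma=0.7))   # ~21 slides per group
    ```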

  20. Asphalt Mixture for the First Asphalt Concrete Directly Fastened Track in Korea

    Directory of Open Access Journals (Sweden)

    Seong-Hyeok Lee

    2015-01-01

    Full Text Available This research was initiated to develop asphalt mixtures suitable for the surface of an asphalt concrete directly fastened track (ADFT) system and to evaluate the performance of those mixtures. Three aggregate gradations were considered: upper (finer), medium, and below (coarser). The nominal maximum aggregate size of the asphalt mixture was 10 mm. The asphalt mixture design was conducted at 3 percent air voids using the Marshall mix design method. To obtain an impermeable asphalt mixture surface, a laboratory permeability test was conducted on the asphalt mixtures of the three aggregate gradations using an asphalt mixture permeability tester. A moisture susceptibility test was conducted based on AASHTO T 283. The stripping percentage of the asphalt mixtures was measured with a digital camera and analyzed using image analysis techniques. Based on the limited research results, the finer aggregate gradation is the most suitable for an asphalt mixture for the ADFT system, showing a high tensile strength ratio (TSR), a low stripping percentage, and a low permeability coefficient. Flow number and beam fatigue tests were then conducted on the finer-gradation asphalt mixture to characterize its performance with two modified asphalt binders: STE-10, a styrene-butadiene-styrene (SBS) polymer-modified binder, and ARMA, a crumb rubber-modified asphalt. The performance tests indicate that STE-10 shows the higher rutting life and fatigue life.
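
    The moisture-susceptibility statistic behind AASHTO T 283, the tensile strength ratio, is a one-line computation; the sketch below uses made-up strengths purely to show the form (a TSR of at least 0.80 is the commonly applied acceptance threshold).

    ```python
    # TSR: moisture-conditioned over dry indirect tensile strength.
    def tensile_strength_ratio(conditioned_kpa, unconditioned_kpa):
        avg = lambda xs: sum(xs) / len(xs)
        return avg(conditioned_kpa) / avg(unconditioned_kpa)

    tsr = tensile_strength_ratio([812, 798, 825], [905, 921, 898])
    print(f"TSR = {tsr:.2f}")                    # compare against 0.80
    ```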