WorldWideScience

Sample records for statistical experimental design

  1. Experimental toxicology: Issues of statistics, experimental design, and replication.

    Science.gov (United States)

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
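
    The paper's advice on adequate sample sizes is straightforward to act on. As a minimal base-R sketch (the effect size and power target are illustrative assumptions, not values from the paper):

```r
# Per-group sample size for a two-sided, two-sample t-test, assuming a
# standardized effect size (delta/sd) of 1.0, a 5% significance level,
# and 80% power -- all illustrative planning values.
power.t.test(delta = 1, sd = 1, sig.level = 0.05, power = 0.80)
# Gives n of about 17 per group; the same function can instead report the
# achieved power of a fixed n when judging whether a replication is adequate.
```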

  2. Experimental design techniques in statistical practice: a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs and teaches statistical and non-statistical skills for the design and analysis of project studies throughout science and industry. Topics discussed include one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among others.

  3. Fundamentals of statistical experimental design and analysis

    CERN Document Server

    Easterling, Robert G

    2015-01-01

    Professionals in all areas - business; government; the physical, life, and social sciences; engineering; medicine, etc. - benefit from using statistical experimental design to better understand their worlds and then use that understanding to improve the products, processes, and programs they are responsible for. This book aims to provide the practitioners of tomorrow with a memorable, easy-to-read, engaging guide to statistics and experimental design. It uses examples drawn from a variety of established texts and embeds them in a business or scientific context, seasoned with a dash of humor, to emphasize the issues and ideas that led to the experiment and the what-do-we-do-next? steps after the experiment. Graphical data displays are emphasized as means of discovery and communication, and formulas are minimized, with a focus on interpreting the results that software produces. The role of subject-matter knowledge, and passion, is also illustrated. The examples do not require specialized knowledge, and t...

  4. Statistical experimental design for refractory coatings

    International Nuclear Information System (INIS)

    McKinnon, J.A.; Standard, O.C.

    2000-01-01

    The production of refractory coatings on metal casting moulds is critically dependent on the development of suitable rheological characteristics, such as viscosity and thixotropy, in the initial coating slurry. In this paper, the basic concepts of mixture design and analysis are applied to the formulation of a refractory coating, with illustration by a worked example. Experimental data of coating viscosity versus composition are fitted to a statistical model to obtain a reliable method of predicting the optimal formulation of the coating. Copyright (2000) The Australian Ceramic Society
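
    A Scheffé polynomial is the usual statistical model for such viscosity-versus-composition data. Below is a hedged base-R sketch fitting a quadratic Scheffé model; the component fractions and viscosity values are hypothetical, not taken from the paper.

```r
# Hypothetical mixture data: fractions of three slurry components
# (each row sums to 1) and the measured viscosity of each blend.
d <- data.frame(
  x1   = c(1, 0, 0, 0.5, 0.5, 0.0, 1/3),
  x2   = c(0, 1, 0, 0.5, 0.0, 0.5, 1/3),
  x3   = c(0, 0, 1, 0.0, 0.5, 0.5, 1/3),
  visc = c(120, 95, 150, 100, 140, 110, 115)
)
# Scheffe quadratic mixture model: no intercept, linear blending terms
# plus pairwise nonlinear blending terms.
fit <- lm(visc ~ -1 + x1 + x2 + x3 + x1:x2 + x1:x3 + x2:x3, data = d)
coef(fit)  # predictions over the simplex then locate the optimal formulation
```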

  5. Statistical experimental design for saltstone mixtures

    International Nuclear Information System (INIS)

    Harris, S.P.; Postles, R.L.

    1992-01-01

    The authors used a mixture experimental design for determining a window of operability for a process at the U.S. Department of Energy, Savannah River Site, Defense Waste Processing Facility (DWPF). The high-level radioactive waste at the Savannah River Site is stored in large underground carbon steel tanks. The waste consists of a supernate layer and a sludge layer. Cesium-137 will be removed from the supernate by precipitation and filtration. After further processing, the supernate layer will be fixed as a grout for disposal in concrete vaults. The remaining precipitate will be processed at the DWPF with treated waste tank sludge and glass-making chemicals into borosilicate glass. The leach-rate properties of the supernate grout, formed from various mixes of solidified salt waste, needed to be determined. The effective diffusion coefficients for NO₃ and chromium were used as a measure of leach rate. Various mixes of cement, Ca(OH)₂, salt, slag, and fly ash were used. These constituents comprise the whole mix. Thus, a mixture experimental design was used. The regression procedure (PROC REG) in SAS was used to produce analysis of variance (ANOVA) statistics. In addition, detailed model diagnostics are readily available for identifying suspicious observations. For convenience, trilinear contour (TLC) plots of the fitted model, a standard graphics tool for examining mixture response surfaces, were produced using ECHIP.

  6. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins, which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, the focus is on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels, as well as methods for data preprocessing, are covered.
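
    The core analysis step for the differential-regulation studies the chapter emphasizes is a per-protein test followed by a multiplicity adjustment. A minimal base-R sketch on simulated expression data (all numbers are made up):

```r
set.seed(1)
# Simulated log-intensities: 500 proteins, 6 control and 6 treated samples;
# the first 50 proteins are truly shifted upward in the treated group.
m <- matrix(rnorm(500 * 12), nrow = 500)
m[1:50, 7:12] <- m[1:50, 7:12] + 1.5
grp <- rep(c("ctrl", "trt"), each = 6)
# Per-protein two-sample t-test, then Benjamini-Hochberg FDR adjustment --
# the usual guard against false conclusions in high-dimensional data.
p <- apply(m, 1, function(x) t.test(x[grp == "ctrl"], x[grp == "trt"])$p.value)
sum(p.adjust(p, method = "BH") < 0.05)  # proteins called differentially regulated
```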

  7. Optimal experimental design with R

    CERN Document Server

    Rasch, Dieter; Verdooren, L R; Gebhardt, Albrecht

    2011-01-01

    Experimental design is often overlooked in the literature of applied and mathematical statistics: statistics is taught and understood as merely a collection of methods for analyzing data. Consequently, experimenters seldom think about optimal design, including prerequisites such as the sample size needed for a precise answer to an experimental question. Providing a concise introduction to experimental design theory, Optimal Experimental Design with R introduces the philosophy of experimental design, provides an easy process for constructing experimental designs and calculating the necessary sample size using R programs, and teaches by example using the custom-made R package OPDOE. Consisting of detailed, data-rich examples, this book introduces experimenters to the philosophy of experimentation, experimental design, and data collection. It gives researchers and statisticians guidance in the construction of optimum experimental designs using R programs, including sample size calculations, hypothesis te...
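
    The book constructs designs with its OPDOE package, whose interface I will not assume here, but the criterion such algorithms optimize is easy to show in base R: for a straight-line model, a D-optimal design maximizes det(XᵀX), which pushes runs to the extremes of the factor range.

```r
# D-criterion for simple linear regression y = b0 + b1*x with x in [-1, 1].
d_crit <- function(x) det(crossprod(cbind(1, x)))
d_crit(c(-1, -1, 1, 1))      # runs at the extremes: det = 16
d_crit(c(-0.5, 0, 0, 0.5))   # runs clustered near the center: det = 2
# The first four-run design estimates the slope far more precisely for the
# same experimental effort -- the core idea of optimal experimental design.
```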

  8. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    Science.gov (United States)

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results, and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research, and by the end of the 1950s they had become established in psychology, sociology, education, chemistry, medicine, engineering, economics and quality control, to mention just a few of the disciplines that adopted them.

  9. Introduction to Statistically Designed Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials while yielding more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article in Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
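
    As a concrete illustration of the factorial concept mentioned in the abstract, a 2³ design and its effect estimates take a few lines of base R (the response values are simulated for the sketch):

```r
set.seed(42)
# Full 2^3 factorial in coded units: all combinations of three two-level
# factors in 8 runs.
d <- expand.grid(A = c(-1, 1), B = c(-1, 1), C = c(-1, 1))
# Hypothetical response with main effects of A and B plus an A:B interaction.
d$y <- 5 + 2 * d$A - 1.5 * d$B + d$A * d$B + rnorm(8, sd = 0.2)
fit <- lm(y ~ A * B * C, data = d)
2 * coef(fit)[-1]  # factorial effects are twice the coded regression slopes
```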

  10. Experimental design matters for statistical analysis

    DEFF Research Database (Denmark)

    Jensen, Signe Marie; Schaarschmidt, Frank; Onofri, Andrea

    2018-01-01

    … the experimental design is often more or less neglected when analyzing data. Two data examples were analyzed using different modelling strategies: firstly, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared; secondly, translocation … of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. RESULTS: It was shown that results from sub…
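
    The study's first point, that the block structure belongs in the model, can be sketched in base R (data simulated; the treatment and block labels are placeholders):

```r
set.seed(7)
# Simulated randomized complete block design: 4 blocks, 5 adjuvant
# treatments, with real block-to-block variation.
d <- expand.grid(block = factor(1:4), adjuvant = factor(LETTERS[1:5]))
d$height <- 100 + as.numeric(d$adjuvant) + rnorm(4, sd = 3)[d$block] +
  rnorm(20, sd = 1)
# Respecting the design: block enters the model, so treatments are tested
# against a residual term purged of block variation.
anova(lm(height ~ block + adjuvant, data = d))
# Fitting height ~ adjuvant alone inflates the residual variance and can
# reverse conclusions, which is the paper's central warning.
```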

  11. Statistical experimental design for saltstone mixtures

    International Nuclear Information System (INIS)

    Harris, S.P.; Postles, R.L.

    1991-01-01

    We used a mixture experimental design for determining a window of operability for a process at the Savannah River Site Defense Waste Processing Facility (DWPF). The high-level radioactive waste at the Savannah River Site is stored in large underground carbon steel tanks. The waste consists of a supernate layer and a sludge layer. ¹³⁷Cs will be removed from the supernate by precipitation and filtration. After further processing, the supernate layer will be fixed as a grout for disposal in concrete vaults. The remaining precipitate will be processed at the DWPF with treated waste tank sludge and glass-making chemicals into borosilicate glass. The leach-rate properties of the supernate grout, formed from various mixes of solidified salt waste, needed to be determined. The effective diffusion coefficients for NO₃ and Cr were used as a measure of leach rate. Various mixes of cement, Ca(OH)₂, salt, slag and fly ash were used. These constituents comprise the whole mix. Thus, a mixture experimental design was used.

  12. A statistical approach to the experimental design of the sulfuric acid leaching of gold-copper ore

    Directory of Open Access Journals (Sweden)

    Mendes F.D.

    2003-01-01

    The high grade of copper in the Igarapé Bahia (Brazil) gold-copper ore prevents the direct application of the classic cyanidation process. Copper oxides and sulfides react with cyanides in solution, causing a high consumption of leach reagent and thereby raising processing costs and decreasing recovery of gold. Studies have shown that a feasible route for this ore would be a pretreatment to remove copper minerals prior to the cyanidation stage. The goal of this experimental work was to study the experimental conditions required for copper removal from Igarapé Bahia gold-copper ore by sulfuric acid leaching, applying a statistical approach to the experimental design. By using the Plackett-Burman method, it was possible to select the variables that had the largest influence on the percentage of copper extracted at the sulfuric acid leaching stage. These were temperature of the leach solution, stirring speed, concentration of sulfuric acid in the leach solution and particle size of the ore. The influence of the individual effects of these variables and their interactions on the experimental response were analyzed by applying the replicated full factorial design method. Finally, the selected variables were optimized by the ascending path statistical method, which determined the best experimental conditions for leaching to achieve the highest percentage of copper extracted. Under the optimized conditions, the best leaching results showed a copper extraction of 75.5%.
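
    The final "ascending path" (steepest ascent) step follows directly from a fitted first-order model: move from the design center in proportion to the coded coefficients. A base-R sketch with invented screening coefficients:

```r
# Hypothetical coded first-order coefficients from the screening fit
# (temperature, acid concentration, stirring speed, particle size).
coefs <- c(temp = 4.2, acid = 2.1, stir = 1.0, psize = -1.7)
# Steepest-ascent direction: the gradient, normalized to unit length.
dir  <- coefs / sqrt(sum(coefs^2))
path <- outer(0:5, dir)   # six steps of increasing length from the center
colnames(path) <- names(coefs)
path  # coded settings to run until copper extraction stops improving
```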

  13. Experimental statistics for biological sciences.

    Science.gov (United States)

    Bang, Heejung; Davidian, Marie

    2010-01-01

    In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression" - which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in the statistical sciences. We hope that from this chapter readers will understand the key statistical concepts and terminologies, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inferences), and how to interpret the results. This text is most useful as supplemental material while readers take their own statistics courses, or as a reference text accompanying a manual for any statistical software, serving as a self-teaching guide.

  14. Experimental design of a waste glass study

    International Nuclear Information System (INIS)

    Piepel, G.F.; Redgate, P.E.; Hrma, P.

    1995-04-01

    A Composition Variation Study (CVS) is being performed to support a future high-level waste glass plant at Hanford. A total of 147 glasses, covering a broad region of compositions melting at approximately 1150 °C, were tested in five statistically designed experimental phases. This paper focuses on the goals, strategies, and techniques used in designing the five phases. The overall strategy was to investigate glass compositions on the boundary and interior of an experimental region defined by single-component, multiple-component, and property constraints. Statistical optimal experimental design techniques were used to cover various subregions of the experimental region in each phase. Empirical mixture models for glass properties (as functions of glass composition) from previous phases were used in designing subsequent CVS phases.

  15. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.

  16. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and refine computational models. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  17. Experimental design in chemistry: A tutorial.

    Science.gov (United States)

    Leardi, Riccardo

    2009-10-12

    In this tutorial the main concepts and applications of experimental design in chemistry will be explained. Unfortunately, experimental design is nowadays not as well known and widely applied as it should be, and many papers can be found in which the "optimization" of a procedure is performed one variable at a time. The goal of this paper is to show the real advantages, in terms of reduced experimental effort and increased quality of information, that can be obtained if this approach is followed. To do that, three real examples will be shown. Rather than on the mathematical aspects, this paper will focus on the mental attitude required by experimental design. Readers interested in deepening their knowledge of the mathematical and algorithmic aspects can find very good books and tutorials in the references [G.E.P. Box, W.G. Hunter, J.S. Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, John Wiley & Sons, New York, 1978; R. Brereton, Chemometrics: Data Analysis for the Laboratory and Chemical Plant, John Wiley & Sons, New York, 1978; R. Carlson, J.E. Carlson, Design and Optimization in Organic Synthesis: Second Revised and Enlarged Edition, in: Data Handling in Science and Technology, vol. 24, Elsevier, Amsterdam, 2005; J.A. Cornell, Experiments with Mixtures: Designs, Models and the Analysis of Mixture Data, in: Series in Probability and Statistics, John Wiley & Sons, New York, 1991; R.E. Bruns, I.S. Scarminio, B. de Barros Neto, Statistical Design-Chemometrics, in: Data Handling in Science and Technology, vol. 25, Elsevier, Amsterdam, 2006; D.C. Montgomery, Design and Analysis of Experiments, 7th edition, John Wiley & Sons, Inc., 2009; T. Lundstedt, E. Seifert, L. Abramo, B. Thelin, A. Nyström, J. Pettersen, R. Bergman, Chemolab 42 (1998) 3; Y. Vander Heyden, LC-GC Europe 19 (9) (2006) 469].

  18. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: An SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  19. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M.; Wolters, Lidewij H.; Huizenga, Hilde M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  20. A Statistical Approach to Optimizing Concrete Mixture Design

    OpenAIRE

    Ahmad, Shamsad; Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicate...

  1. A statistical approach to optimizing concrete mixture design.

    Science.gov (United States)

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.

  2. A Statistical Approach to Optimizing Concrete Mixture Design

    Directory of Open Access Journals (Sweden)

    Shamsad Ahmad

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.
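
    The workflow of this study, a 3³ factorial at the stated levels followed by a quadratic polynomial fit and optimization over the fitted surface, can be sketched in base R (the strength values below are simulated, not the authors' data):

```r
set.seed(3)
# 3^3 full factorial in the study's actual factor levels.
d <- expand.grid(wc = c(0.38, 0.43, 0.48),   # water/cementitious ratio
                 cm = c(350, 375, 400),      # cementitious content, kg/m^3
                 fa = c(0.35, 0.40, 0.45))   # fine/total aggregate ratio
d$strength <- 90 - 80 * d$wc + 0.05 * d$cm - 10 * d$fa + rnorm(27, sd = 1)
# Second-order polynomial regression model in the three design factors.
fit <- lm(strength ~ polym(wc, cm, fa, degree = 2, raw = TRUE), data = d)
# Optimize by predicting over a fine grid of feasible mixtures.
g <- expand.grid(wc = seq(0.38, 0.48, 0.005), cm = seq(350, 400, 2.5),
                 fa = seq(0.35, 0.45, 0.005))
g[which.max(predict(fit, g)), ]   # mixture with highest predicted strength
```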

  3. Statistical guidance for experimental design and data analysis of mutation detection in rare monogenic mendelian diseases by exome sequencing.

    Directory of Open Access Journals (Sweden)

    Degui Zhi

    Recently, whole-genome sequencing, especially exome sequencing, has successfully led to the identification of causal mutations for rare monogenic Mendelian diseases. However, it is unclear whether this approach can be generalized and effectively applied to other Mendelian diseases with high locus heterogeneity. Moreover, the current exome sequencing approach has limitations such as false positive and false negative rates of mutation detection due to sequencing errors and other artifacts, but the impact of these limitations on experimental design has not been systematically analyzed. To address these questions, we present a statistical modeling framework to calculate the power, the probability of identifying truly disease-causing genes, under various inheritance models and experimental conditions, providing guidance for both proper experimental design and data analysis. Based on our model, we found that the exome sequencing approach is well-powered for mutation detection in recessive, but not dominant, Mendelian diseases with high locus heterogeneity. A disease gene responsible for as low as 5% of the disease population can be readily identified by sequencing just 200 unrelated patients. Based on these results, for identifying rare Mendelian disease genes, we propose that a viable approach is to combine, sequence, and analyze patients with the same disease together, leveraging the statistical framework presented in this work.
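
    The flavor of such a power calculation can be conveyed by a much-simplified binomial sketch. This illustrates the logic only; it is not the authors' model, and every number in it is an assumption:

```r
# Suppose one gene explains 5% of cases and a causal variant in a carrier
# is actually detected with probability 0.8 (sequencing is imperfect).
n <- 200                 # unrelated patients sequenced
p <- 0.05 * 0.8          # per-patient chance of a detected hit in this gene
k <- 3                   # hits required to flag the gene above background
1 - pbinom(k - 1, n, p)  # probability of flagging the gene: ~0.99 here
```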

  4. Optimal Design and Related Areas in Optimization and Statistics

    CERN Document Server

    Pronzato, Luc

    2009-01-01

    This edited volume, dedicated to Henry P. Wynn, reflects his broad range of research interests, focusing in particular on the applications of optimal design theory in optimization and statistics. It covers algorithms for constructing optimal experimental designs, general gradient-type algorithms for convex optimization, majorization and stochastic ordering, algebraic statistics, Bayesian networks and nonlinear regression. Written by leading specialists in the field, each chapter contains a survey of the existing literature along with substantial new material. This work will appeal to both the

  5. Optimization of phototrophic hydrogen production by Rhodopseudomonas palustris PBUM001 via statistical experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, Zadariana [Department of Civil Engineering, Faculty of Engineering, University of Malaya (Malaysia); Faculty of Civil Engineering, Technology University of MARA (Malaysia); Mohamad Annuar, Mohamad Suffian; Vikineswary, S. [Institute of Biological Sciences, University of Malaya (Malaysia); Ibrahim, Shaliza [Department of Civil Engineering, Faculty of Engineering, University of Malaya (Malaysia)

    2009-09-15

    Phototrophic hydrogen production by an indigenous purple non-sulfur bacterium, Rhodopseudomonas palustris PBUM001, from palm oil mill effluent (POME) was optimized using response surface methodology (RSM). The process parameters studied include inoculum size (% v/v), POME concentration (% v/v), light intensity (klux), agitation (rpm) and pH. The experimental data on cumulative hydrogen production and COD reduction were fitted to a quadratic polynomial model using response surface regression analysis. The path to optimal process conditions was determined by analyzing the three-dimensional response surface and contour plots. Statistical analysis of experimental data collected following a Box-Behnken design showed that 100% (v/v) POME concentration, 10% (v/v) inoculum size, light intensity of 4.0 klux, agitation rate of 250 rpm and pH of 6 were the best conditions. The maximum predicted cumulative hydrogen production and COD reduction obtained under these conditions were 1.05 ml H₂/ml POME and 31.71%, respectively. Subsequent verification experiments at the optimal process values gave a maximum cumulative hydrogen yield of 0.66 ± 0.07 ml H₂/ml POME and COD reduction of 30.54 ± 9.85%. (author)
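
    A Box-Behnken design is built from the edge midpoints of the factor cube plus center runs; packages such as rsm provide ready-made constructors, but the three-factor case is short enough to build by hand (a construction sketch only; the study itself used five factors):

```r
# Box-Behnken points for k = 3 coded factors: for each pair of factors,
# all (+/-1, +/-1) combinations with the remaining factor held at 0.
pm  <- expand.grid(a = c(-1, 1), b = c(-1, 1))
bbd <- rbind(
  cbind(x1 = pm$a, x2 = pm$b, x3 = 0),
  cbind(x1 = pm$a, x2 = 0,    x3 = pm$b),
  cbind(x1 = 0,    x2 = pm$a, x3 = pm$b)
)
bbd <- rbind(bbd, matrix(0, 3, 3))   # three center runs
nrow(bbd)  # 15 runs; a quadratic model is then fitted as in any RSM study
```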

  6. Two polynomial representations of experimental design

    OpenAIRE

    Notari, Roberto; Riccomagno, Eva; Rogantin, Maria-Piera

    2007-01-01

    In the context of algebraic statistics an experimental design is described by a set of polynomials called the design ideal. This, in turn, is generated by finite sets of polynomials. Two types of generating sets are mostly used in the literature: Groebner bases and indicator functions. We briefly describe them both, how they are used in the analysis and planning of a design and how to switch between them. Examples include fractions of full factorial designs and designs for mixture experiments.
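
    As a worked instance of the two representations, take the regular half fraction of the 2³ design with defining relation x₁x₂x₃ = 1 (my example, in the spirit of the paper):

```latex
% Design ideal of the half fraction F (levels coded -1, +1): the full
% factorial's ideal plus the generator relation.
\[
  \mathcal{I}_F = \left\langle\, x_1^2 - 1,\; x_2^2 - 1,\; x_3^2 - 1,\;
                  x_1 x_2 x_3 - 1 \,\right\rangle
\]
% Indicator function of the same fraction on the full 2^3 design:
\[
  f(x_1, x_2, x_3) = \tfrac{1}{2}\,\bigl(1 + x_1 x_2 x_3\bigr)
\]
% f equals 1 on the four runs of the fraction and 0 on the other four,
% so both objects describe the same set of design points.
```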

  7. Application of statistical experimental design to study the formulation variables influencing the coating process of lidocaine liposomes.

    Science.gov (United States)

    González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M

    2007-06-07

    In this paper, we have used statistical experimental design to investigate the effect of several factors on the coating of lidocaine hydrochloride (LID) liposomes with a biodegradable polymer (chitosan, CH). These variables were the concentration of the CH coating solution, the dripping rate of this solution into the liposome colloidal dispersion, the stirring rate, the time between liposome production and liposome coating, and the amount of drug entrapped in the liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by the thin-layer evaporation method. They were subsequently coated with CH according to the experimental plan provided by a fractional factorial (2⁵⁻¹) screening matrix. We used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags, and the brilliant red probe was used to determine CE (%) owing to its property of forming molecular complexes with CH. The graphic analysis of the effects allowed the identification of the main formulation and technological factors through analysis of the selected responses and permitted the determination of the proper level of these factors for response improvement. Moreover, the fractional design allowed quantification of the interactions between the factors, which will be considered in subsequent experiments. The results obtained pointed out that the LID amount was the predominant factor increasing the drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all the interactions between the main factors had statistical significance.
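
    The 2⁵⁻¹ screening matrix used here can be generated in base R from the standard generator E = ABCD (a construction sketch; the assignment of factors to letters is mine):

```r
# Half fraction of a 2^5 design: 16 runs, resolution V, generator E = ABCD.
d <- expand.grid(A = c(-1, 1), B = c(-1, 1), C = c(-1, 1), D = c(-1, 1))
d$E <- d$A * d$B * d$C * d$D
d  # e.g., A = CH concentration, B = dripping rate, C = stirring rate,
   #       D = time before coating, E = drug amount (coded levels)
```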

  8. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    Science.gov (United States)

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
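
    Both concepts in the activity are one-liners in base R, which is part of what makes them teachable (the numbering of the animals is of course hypothetical):

```r
set.seed(2014)
roaches <- 1:20
sample(roaches, 10)                           # random selection of a sample
split(roaches, sample(rep(c("A", "B"), 10)))  # random assignment to 2 groups
```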

  9. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
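
    The Monte Carlo side of such an analysis is simple to sketch: draw inputs from their uncertainty distributions, push them through the response-surface model, and read off the pin statistics. The surrogate and distributions below are invented for illustration only:

```r
set.seed(1)
n <- 1e5
# Hypothetical response surface for minimum DNBR as a function of two
# uncertain inputs in coded units, standing in for the LYNX-fitted model.
dnbr <- function(flow, power) 1.8 + 0.30 * flow - 0.45 * power
flow  <- rnorm(n, 0, 0.5)   # coded flow uncertainty
power <- rnorm(n, 0, 0.5)   # coded power uncertainty
sims <- dnbr(flow, power)
mean(sims < 1.3)        # estimated probability of violating a 1.3 DNBR limit
quantile(sims, 0.001)   # low quantile of the propagated DNBR distribution
```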

  10. Bayesian optimal experimental design for the Shock-tube experiment

    International Nuclear Information System (INIS)

    Terejanu, G; Bryant, C M; Miki, K

    2013-01-01

    The sequential optimal experimental design formulated as an information-theoretic sensitivity analysis is applied to the ignition delay problem using real experimental data. The optimal design is obtained by maximizing the statistical dependence between the model parameters and observables, which is quantified in this study using mutual information. This is naturally posed in the Bayesian framework. The study shows that by monitoring the information gain after each measurement update, one can design a stopping criterion for the experimental process which gives a minimal set of experiments to efficiently learn the Arrhenius parameters.
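
    For a linear-Gaussian toy problem the mutual information has a closed form, which makes the design logic easy to see. A sketch under those assumptions (not the paper's ignition-delay model):

```r
# Model: y = theta * x + noise, theta ~ N(0, s0^2), noise ~ N(0, sn^2).
# Mutual information between theta and y at design point x:
#   I(x) = 0.5 * log(1 + s0^2 * x^2 / sn^2)
eig <- function(x, s0 = 1, sn = 0.5) 0.5 * log(1 + (s0^2 * x^2) / sn^2)
x <- seq(0, 2, by = 0.1)
x[which.max(eig(x))]  # the most informative design point: here x = 2
# A sequential version recomputes this after each posterior update and
# stops once the incremental information gain falls below a threshold.
```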

  11. Introductory statistics for engineering experimentation

    CERN Document Server

    Nelson, Peter R; Coffin, Marie

    2003-01-01

    The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy the needs of a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...

  12. Optimal Experimental Design for Model Discrimination

    Science.gov (United States)

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  13. Sequential experimental design based generalised ANOVA

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    2016-07-15

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution-adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been applied to predicting the probability of failure in three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.

  14. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  15. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    Science.gov (United States)

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past four years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics for environmental and biological sciences students through computer programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  16. Scientific, statistical, practical, and regulatory considerations in design space development.

    Science.gov (United States)

    Debevec, Veronika; Srčič, Stanko; Horvat, Matej

    2018-03-01

    The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.

  17. Application of the statistical experimental design to optimize mine-impacted water (MIW) remediation using shrimp-shell.

    Science.gov (United States)

    Núñez-Gómez, Dámaris; Alves, Alcione Aparecida de Almeida; Lapolli, Flavio Rubens; Lobo-Recio, María A

    2017-01-01

    Mine-impacted water (MIW) is one of the most serious mining problems and has a highly negative impact on water resources and aquatic life. The main characteristics of MIW are a low pH (between 2 and 4) and high concentrations of SO₄²⁻ and metal ions (Cd, Cu, Ni, Pb, Zn, Fe, Al, Cr, Mn, Mg, etc.), many of which are toxic to ecosystems and human life. Shrimp shell was selected as a MIW treatment agent because it is a low-cost metal-sorbent biopolymer with a high chitin content and contains calcium carbonate, an acid-neutralizing agent. To determine the best metal-removal conditions, a study based on statistical planning was carried out. Thus, the objective of this work was to identify the degree of influence and dependence of the shrimp-shell content for the removal of Fe, Al, Mn, Co, and Ni from MIW. In this study, a central composite rotational experimental design (CCRD) with a quadruplicate at the midpoint (2²) was used to evaluate the joint influence of two formulation variables: agitation and the shrimp-shell content. The statistical results showed the significant influence (p < 0.05) of the agitation variable for Fe and Ni removal (linear and quadratic form, respectively) and of the shrimp-shell content variable for Mn (linear form) and Al and Co (linear and quadratic form) removal. Analysis of variance (ANOVA) for Al, Co, and Ni removal showed that the model is valid at the 95% confidence interval and that no adjustment is needed within the evaluated ranges of agitation (0-251.5 rpm) and shrimp-shell content (1.2-12.8 g L⁻¹). The model required adjustment to the 90% and 75% confidence intervals for Fe and Mn removal, respectively. In terms of efficiency in removing pollutants, it was possible to determine the best experimental values of the variables considered: 188 rpm and 9.36 g L⁻¹ of shrimp shells. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Applied statistical designs for the researcher

    CERN Document Server

    Paulson, Daryl S

    2003-01-01

    Research and Statistics; Basic Review of Parametric Statistics; Exploratory Data Analysis; Two Sample Tests; Completely Randomized One-Factor Analysis of Variance; One and Two Restrictions on Randomization; Completely Randomized Two-Factor Factorial Designs; Two-Factor Factorial Completely Randomized Blocked Designs; Useful Small Scale Pilot Designs; Nested Statistical Designs; Linear Regression; Nonparametric Statistics; Introduction to Research Synthesis and "Meta-Analysis" and Conclusory Remarks; References; Index.

  19. Statistical reporting inconsistencies in experimental philosophy.

    Science.gov (United States)

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science.
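
    The statcheck workflow is worth a small sketch. Assuming the CRAN package's documented interface, where statcheck() scans text for APA-style results and recomputes each p value from the reported statistic:

```r
# install.packages("statcheck")
library(statcheck)
# statcheck() extracts APA-style results from raw text and recomputes the
# p value from the test statistic and degrees of freedom.
txt <- "The effect was significant, t(28) = 2.20, p < .01."
statcheck(txt)
# Flagged as inconsistent: t(28) = 2.20 actually corresponds to p = .036.
```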

  20. Statistical reporting inconsistencies in experimental philosophy

    Science.gov (United States)

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B.; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science. PMID:29649220

  1. Design and performance characteristics of solar adsorption refrigeration system using parabolic trough collector: Experimental and statistical optimization technique

    International Nuclear Information System (INIS)

    Abu-Hamdeh, Nidal H.; Alnefaie, Khaled A.; Almitani, Khalid H.

    2013-01-01

    Highlights: • The success of using olive waste/methanol as an adsorbent/adsorbate pair. • The experimental gross cycle coefficient of performance obtained was COP_a = 0.75. • Optimization showed that expanding the adsorbent mass within a certain range increases the COP. • The statistical optimization led to an optimum tank volume between 0.2 and 0.3 m³. • Increasing the collector area within a certain range increased the COP. - Abstract: The current work demonstrates a developed model of a solar adsorption refrigeration system with specific requirements and specifications. The present scheme can be employed as a refrigerator and cooler unit suitable for remote areas. The unit runs on a parabolic trough solar collector (PTC) and uses olive waste as adsorbent with methanol as adsorbate. Cooling production, COP (coefficient of performance), and COP_a (cycle gross coefficient of performance) were used to assess the system performance. The system's optimum design parameters in this study were arrived at through statistical and experimental methods. The lowest temperature attained in the refrigerated space was 4 °C, with a corresponding ambient temperature of 27 °C. The temperature started to decrease steadily at 20:30 - when the actual cooling started - until it reached 4 °C at 01:30 the next day, when it rose again. The highest COP_a obtained was 0.75.

  2. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  3. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    Science.gov (United States)

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  4. Statistical core design methodology using the VIPRE thermal-hydraulics code

    International Nuclear Information System (INIS)

    Lloyd, M.W.; Feltus, M.A.

    1995-01-01

    An improved statistical core design methodology for developing a computational departure from nucleate boiling ratio (DNBR) correlation has been developed and applied in order to analyze the nominal 1.3 DNBR limit on Westinghouse Pressurized Water Reactor (PWR) cores. This analysis, although limited in scope, found that the DNBR limit can be reduced from 1.3 to some lower value while remaining accurate within an adequate confidence level of 95%, for three particular FSAR operational transients: turbine trip, complete loss of flow, and inadvertent opening of a pressurizer relief valve. The VIPRE-01 thermal-hydraulics code, the SAS/STAT statistical package, and the EPRI/Columbia University DNBR experimental data base were used in this research to develop the Pennsylvania State Statistical Core Design Methodology (PSSCDM). The VIPRE code was used to perform the necessary sensitivity studies and generate the EPRI correlation-calculated DNBR predictions. The SAS package used these predictions as a data set to determine the best fit for the empirical model and to perform the statistical analysis. (author)

  5. Statistical Multipath Model Based on Experimental GNSS Data in Static Urban Canyon Environment

    Directory of Open Access Journals (Sweden)

    Yuze Wang

    2018-04-01

    A deep understanding of multipath characteristics is essential to design signal simulators and receivers in global navigation satellite system applications. As a new constellation is deployed and more applications occur in the urban environment, the statistical multipath models of navigation signals need further study. In this paper, we present statistical distribution models of multipath time delay, multipath power attenuation, and multipath fading frequency based on experimental data collected in an urban canyon environment. The raw multipath characteristics are obtained by processing real navigation signals to study their statistical distributions. Fitting the statistical data shows that the probability distribution of time delay follows a gamma distribution, which is related to the waiting time of Poisson-distributed events. The fading frequency follows an exponential distribution, and the mean multipath power attenuation decreases linearly with increasing time delay. In addition, the detailed statistical characteristics for satellites at different elevations and in different orbits are studied; the parameters of each distribution are quite different. The research results give useful guidance to navigation simulator and receiver designers.
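
    Fitting the named distributions to such data is routine; here is a base-R sketch using method-of-moments estimates for the gamma delay model (delays simulated; MASS::fitdistr would give maximum-likelihood fits instead):

```r
set.seed(5)
delay <- rgamma(1000, shape = 2, rate = 0.5)   # stand-in multipath delays
# Method-of-moments estimates for the gamma model reported in the paper:
#   shape = mean^2 / var,  rate = mean / var
m <- mean(delay); v <- var(delay)
c(shape = m^2 / v, rate = m / v)   # close to the true (2, 0.5)
# The exponential fading-frequency model needs only rate = 1 / mean(x).
```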

  6. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2016-08-31

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  7. Statistical analysis of sonochemical synthesis of SAPO-34 nanocrystals using Taguchi experimental design

    International Nuclear Information System (INIS)

    Askari, Sima; Halladj, Rouein; Nazari, Mahdi

    2013-01-01

    Highlights: ► Sonochemical synthesis of SAPO-34 nanocrystals. ► Use of a Taguchi experimental design (L9) to optimize the experimental procedure. ► Significant effects of all the ultrasonic parameters on the response. - Abstract: SAPO-34 nanocrystals with high crystallinity were synthesized by means of a sonochemical method. An L9 orthogonal array of the Taguchi method was implemented to investigate the effects of sonication conditions on the preparation of SAPO-34 with respect to the crystallinity of the final product phase. The experimental data establish that phase crystallinity is improved by increasing the ultrasonic power and the sonication temperature. In the case of ultrasonic irradiation time, however, an initial increase in crystallinity from 5 min to 15 min is followed by a decrease in crystallinity for longer sonication times.
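
    The L9 array is a fixed 9 x 4 matrix of three-level settings, so the main-effect analysis can be sketched in base R without any package (crystallinity values simulated; the study used three of the four columns):

```r
# Standard Taguchi L9 (3^4) orthogonal array, levels coded 1..3.
L9 <- matrix(c(1,1,1,1,  1,2,2,2,  1,3,3,3,
               2,1,2,3,  2,2,3,1,  2,3,1,2,
               3,1,3,2,  3,2,1,3,  3,3,2,1),
             nrow = 9, byrow = TRUE,
             dimnames = list(NULL, c("power", "temp", "time", "unused")))
set.seed(9)
crystallinity <- 60 + 5 * L9[, "power"] + 3 * L9[, "temp"] + rnorm(9)
# Main-effect analysis: mean response at each level of each factor.
sapply(c("power", "temp", "time"),
       function(f) tapply(crystallinity, L9[, f], mean))
```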

  8. Experimental design and quantitative analysis of microbial community multiomics.

    Science.gov (United States)

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  9. Statistical considerations of graphite strength for assessing design allowable stresses

    International Nuclear Information System (INIS)

    Ishihara, M.; Mogi, H.; Ioka, I.; Arai, T.; Oku, T.

    1987-01-01

    Several aspects of statistics need to be considered to determine design allowable stresses for graphite structures. These include: 1) statistical variation of graphite material strength; 2) uncertainty of the calculated stress; 3) reliability (survival probability) required by the operational and safety performance of graphite structures. This paper deals with some statistical considerations for structural graphite in assessing design allowable stress. Firstly, probability distribution functions of tensile and compressive strengths are investigated for candidate graphites for the experimental Very High Temperature Reactor. Normal, logarithmic-normal and Weibull distribution functions are compared in terms of their coefficient of correlation to the measured strength data. This leads to the adoption of the normal distribution function. Then, the relation between factor of safety and fracture probability is discussed with respect to the following items: 1) as graphite strength is more variable than the strength of metallic materials, the effect of strength variation on the fracture probability is evaluated; 2) fracture probability corresponding to a survival probability of 99-99.9% with a confidence level of 90-95% is discussed; 3) as the material properties used in the design analysis are usually the mean values of their variation, the additional effect of these variations on the fracture probability is discussed. Finally, the way to assure the minimum ultimate strength with the required survival probability and confidence level is discussed in view of the statistical treatment of strength data from varying sample numbers in a material acceptance test. (author)

  10. Procedure for statistical analysis of one-parameter discrepant experimental data

    International Nuclear Information System (INIS)

    Badikov, Sergey A.; Chechev, Valery P.

    2012-01-01

    A new, Mandel–Paule-type procedure for statistical processing of one-parameter discrepant experimental data is described. The procedure enables one to estimate the contribution of unrecognized experimental errors to the total experimental uncertainty, as well as to include it in the analysis. A definition of discrepant experimental data for an arbitrary number of measurements is introduced as an accompanying result. In the case of negligible unrecognized experimental errors, the procedure simply reduces to the calculation of the weighted average and its internal uncertainty. The procedure was applied to the statistical analysis of half-life experimental data; mean half-lives for 20 actinides were calculated and the results were compared to the ENSDF and DDEP evaluations. On the whole, the calculated half-lives are consistent with the ENSDF and DDEP evaluations. However, the uncertainties calculated in this work substantially exceed those of the ENSDF and DDEP evaluations for discrepant experimental data. This effect can be explained by adequately taking into account unrecognized experimental errors. - Highlights: ► A new statistical procedure for processing one-parameter discrepant experimental data is presented. ► The procedure estimates the contribution of unrecognized errors to the total experimental uncertainty. ► The procedure was applied to processing discrepant half-life experimental data. ► Results of the calculations are compared to the ENSDF and DDEP evaluations.
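
    A minimal sketch of a Mandel-Paule-type iteration (with invented values and uncertainties, not the paper's half-life data): a between-measurement variance s2, representing unrecognized errors, is increased until the weighted chi-square matches its expectation, and the weighted mean is then formed.

        import numpy as np

        x = np.array([432.7, 433.9, 431.8, 434.6])   # reported values (e.g., half-lives)
        u = np.array([0.6, 0.4, 0.9, 0.5])           # recognized uncertainties

        def chi2(s2):
            w = 1.0 / (u**2 + s2)
            xbar = np.sum(w * x) / np.sum(w)
            return np.sum(w * (x - xbar) ** 2)

        n = len(x)
        s2 = 0.0
        if chi2(0.0) > n - 1:                 # data are discrepant: add extra variance
            lo, hi = 0.0, 10.0 * np.var(x)
            for _ in range(60):               # bisection on the Mandel-Paule condition
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if chi2(mid) > n - 1 else (lo, mid)
            s2 = 0.5 * (lo + hi)

        w = 1.0 / (u**2 + s2)
        xbar = np.sum(w * x) / np.sum(w)
        print(f"mean = {xbar:.3f} +/- {np.sqrt(1/np.sum(w)):.3f}, s2 = {s2:.4f}")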

  11. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    Science.gov (United States)

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  12. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analysis for the Belgian nuclear power plants and for the reload compatibility studies, Tractebel Energy Engineering (TEE) has developed, to define a 95/95 DNBR criterion, a statistical thermal design method based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal hydraulic analysis. The uncertainties of the correlation are represented by the statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value adjusted with some penalties and deterministic factors. The search for a realistic value of the SDL is the objective of statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion or SDL (Statistical Design Limit) in strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
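
    For illustration only (not TEE's actual procedure), a one-sided 95/95 lower tolerance bound of the kind used to set a statistical design limit can be computed from a normal DNBR sample with the standard non-central-t tolerance factor; the DNBR values below are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        dnbr = rng.normal(1.45, 0.08, size=59)      # synthetic minimum-DNBR sample

        n = len(dnbr)
        p, conf = 0.95, 0.95                        # 95% probability / 95% confidence
        nc = stats.norm.ppf(p) * np.sqrt(n)         # non-centrality parameter
        k = stats.nct.ppf(conf, df=n - 1, nc=nc) / np.sqrt(n)

        limit = dnbr.mean() - k * dnbr.std(ddof=1)  # lower 95/95 tolerance bound
        print(f"n={n}, k={k:.3f}, 95/95 lower limit = {limit:.3f}")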

  13. Constructing experimental designs for discrete-choice experiments: report of the ISPOR Conjoint Analysis Experimental Design Good Research Practices Task Force.

    Science.gov (United States)

    Reed Johnson, F; Lancsar, Emily; Marshall, Deborah; Kilambi, Vikram; Mühlbacher, Axel; Regier, Dean A; Bresnahan, Brian W; Kanninen, Barbara; Bridges, John F P

    2013-01-01

    Stated-preference methods are a class of evaluation techniques for studying the preferences of patients and other stakeholders. While these methods span a variety of techniques, conjoint-analysis methods, and particularly discrete-choice experiments (DCEs), have become the most frequently applied approach in health care in recent years. Experimental design is an important stage in the development of such methods, but establishing a consensus on standards is hampered by lack of understanding of available techniques and software. This report builds on the previous ISPOR Conjoint Analysis Task Force Report: Conjoint Analysis Applications in Health-A Checklist: A Report of the ISPOR Good Research Practices for Conjoint Analysis Task Force. This report aims to assist researchers specifically in evaluating alternative approaches to experimental design, a difficult and important element of successful DCEs. While this report does not endorse any specific approach, it does provide a guide for choosing an approach that is appropriate for a particular study. In particular, it provides an overview of the role of experimental designs in the successful implementation of the DCE approach in health care studies, and it provides researchers with an introduction to constructing experimental designs on the basis of study objectives and the statistical model researchers have selected for the study. The report outlines the theoretical requirements for designs that identify choice-model preference parameters and summarizes and compares a number of available approaches for constructing experimental designs. The task-force leadership group met via bimonthly teleconferences and in person at ISPOR meetings in the United States and Europe. An international group of experimental-design experts was consulted during this process to discuss existing approaches for experimental design and to review the task force's draft reports. In addition, ISPOR members contributed to developing a consensus

  14. Some challenges with statistical inference in adaptive designs.

    Science.gov (United States)

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have generated a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs, which allow modification of the sample size or related statistical information, and adaptive selection designs, which allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  15. Optimization of Aspergillus niger nutritional conditions using statistical experimental methods for bio-recovery of manganese from pyrolusite

    International Nuclear Information System (INIS)

    Mujeeb-ur-Rahman; Yasinzai, M.M.; Tareen, R.B.; Iqbal, A.; Gul, S.; Odhano, E.A.

    2011-01-01

    The nutritional requirements of Aspergillus niger PCSIR-06 for bio-recovery of manganese from pyrolusite ore were optimized. A Box-Behnken design and response surface methodology were used for the design of the experiment and the statistical analysis of the results. This procedure limited the number of actual experiments to 54 for studying the possible interactions among six nutrients. The optimum concentrations of the nutrients were: sucrose 148.5 g/L, KH2PO4 0.50 g/L, NH4NO3 0.33 g/L, MgSO4 0.41 g/L, Zn 23.76 mg/L, and Fe 0.18 mg/L for Aspergillus niger to achieve maximum bio-recovery of manganese (82.47 ± 5.67%). The verification run confirmed the predicted optimized concentrations of all six ingredients for maximum bioleaching of manganese and confirmed the suitability of the Box-Behnken experimental design for maximizing bio-recovery. The results also revealed that small, less time-consuming experimental designs can be efficient for the optimization of bio-recovery processes. (author)

  16. Experimental Engineering: Articulating and Valuing Design Experimentation

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Grönvall, Erik; Fritsch, Jonas

    2017-01-01

    In this paper we propose Experimental Engineering as a way to articulate open-ended technological experiments as a legitimate design research practice. Experimental Engineering introduces a move away from an outcome- or result-driven design process towards an interest in existing technologies and...

  17. Tribological behaviour and statistical experimental design of sintered iron-copper based composites

    Science.gov (United States)

    Popescu, Ileana Nicoleta; Ghiţă, Constantin; Bratu, Vasile; Palacios Navarro, Guillermo

    2013-11-01

    The sintered iron-copper based composites for automotive brake pads have a complex composition and should have good physical, mechanical, and tribological characteristics. In this paper, we obtained frictional composites by the Powder Metallurgy (P/M) technique and characterized them from a microstructural and tribological point of view. The morphology of the raw powders was determined by SEM, and the surfaces of the obtained sintered friction materials were analyzed by ESEM and EDS elemental and compo-image analyses. One lot of samples was tested on a "pin-on-disc" type wear machine under dry sliding conditions, at applied loads between 3.5 and 11.5 × 10⁻¹ MPa and relative speeds in the braking point between 12.5 and 16.9 m/s, at constant temperature. The other lot of samples was tested on an inertial test stand according to a methodology simulating the real conditions of dry friction, at a contact pressure of 2.5-3 MPa and at 300-1200 rpm. The most important characteristic required of sintered friction materials is a high and stable friction coefficient during braking; for high durability in service, they must also have low wear, high corrosion resistance, high thermal conductivity, mechanical resistance, and thermal stability at elevated temperature. Because of the importance of the tribological characteristics (wear rate and friction coefficient) of sintered iron-copper based composites, we predicted the tribological behaviour through statistical analysis. For the first lot of samples, the response variables Yi (the wear rate and friction coefficient) were correlated with x1 and x2 (the coded values of applied load and relative speed in the braking point, respectively) using a linear factorial design approach. We obtained brake friction materials with improved wear resistance and high, stable friction coefficients. It has been shown, through the experimental data and the obtained linear regression equations, that the sintered composites' wear rate increases

  18. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT

    Science.gov (United States)

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  19. Statistical experimental design approach in coal briquetting

    Energy Technology Data Exchange (ETDEWEB)

    B. Salopek; S. Pfaff; R. Rajic

    2003-07-01

    The influence of pressure, temperature, humidity, and granulation of the coal upon the resistance to pressure and the water absorption of the briquettes has been tested, with the aim of examining how each of the two dependent variables changes with the values of the four independent variables, and which of the independent variables significantly influence the dependent ones. A full factorial design with 16 experiments and a central composite design with 27 experiments have been applied. The influence of the independent variables upon the dependent ones has been examined by applying the analysis of variance. The influence of individual factors and their interactions upon the dependent variables has been quantified, as have the coefficients of the curvilinear equation. 2 refs., 2 figs., 5 tabs.
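
    The 16-experiment full factorial mentioned here corresponds to a 2^4 design; the sketch below builds one in coded units and estimates main effects from a synthetic response (the factor names follow the record, the numbers are invented).

        import itertools
        import numpy as np

        factors = ["pressure", "temperature", "humidity", "granulation"]
        X = np.array(list(itertools.product([-1, 1], repeat=4)))  # 16 coded runs

        rng = np.random.default_rng(2)
        # Synthetic compressive-resistance response with made-up true effects plus noise.
        y = 50 + 6*X[:, 0] + 3*X[:, 1] - 4*X[:, 2] + 1*X[:, 3] + rng.normal(0, 1, 16)

        # For a balanced 2-level design the main effect is the difference between
        # the mean response at the high and low levels of each factor.
        for j, name in enumerate(factors):
            effect = y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
            print(f"{name:12s} main effect = {effect:+.2f}")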

  20. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.
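
    As an illustration of the design-matrix question the record raises, here is one common coding for a single-case AB phase design with a level change and a slope change at intervention start; session counts, scores, and the centering choice are invented assumptions.

        import numpy as np

        n_a, n_b = 6, 10                      # baseline (A) and treatment (B) sessions
        t = np.arange(n_a + n_b)              # session number
        phase = (t >= n_a).astype(float)      # 0 in baseline, 1 in treatment

        # Columns: intercept, baseline trend, level change at intervention,
        # slope change after intervention (time re-centered at phase start).
        X = np.column_stack([
            np.ones_like(t, dtype=float),
            t,
            phase,
            phase * (t - n_a),
        ])

        y = np.array([7, 8, 7, 9, 8, 8, 6, 6, 5, 5, 4, 4, 3, 3, 2, 2], float)  # toy scores
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("intercept, trend, level change, slope change:", np.round(beta, 2))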

  1. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    Science.gov (United States)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges, such as the difficulty of applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.

  2. Statistical evaluation of SAGE libraries: consequences for experimental design

    NARCIS (Netherlands)

    Ruijter, Jan M.; van Kampen, Antoine H. C.; Baas, Frank

    2002-01-01

    Since the introduction of serial analysis of gene expression (SAGE) as a method to quantitatively analyze the differential expression of genes, several statistical tests have been published for the pairwise comparison of SAGE libraries. Testing the difference between the number of specific tags

  3. Literature in Focus: Statistical Methods in Experimental Physics

    CERN Multimedia

    2007-01-01

    Frederick James was a high-energy physicist who became the CERN "expert" on statistics and is now well-known around the world, in part for this famous text. The first edition of Statistical Methods in Experimental Physics was originally co-written with four other authors and was published in 1971 by North Holland (now an imprint of Elsevier). It became such an important text that demand for it has continued for more than 30 years. Fred has updated it and it was released in a second edition by World Scientific in 2006. It is still a top seller and there is no exaggeration in calling it «the» reference on the subject. A full review of the title appeared in the October CERN Courier.Come and meet the author to hear more about how this book has flourished during its 35-year lifetime. Frederick James Statistical Methods in Experimental Physics Monday, 26th of November, 4 p.m. Council Chamber (Bldg. 503-1-001) The author will be introduced...

  4. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    Science.gov (United States)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
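
    The LHS step in such a workflow can be reproduced with scipy's quasi-Monte Carlo module; the two factor ranges below are invented stand-ins for the study's design and flow variables.

        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=7)
        unit = sampler.random(n=50)                       # 50 points in [0, 1)^2

        # Hypothetical ranges: inlet flow rate (m3/h) and tank depth (m).
        lower, upper = [50.0, 2.5], [400.0, 5.0]
        designs = qmc.scale(unit, lower, upper)
        print(designs[:5])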

  5. Hierarchical adaptive experimental design for Gaussian process emulators

    International Nuclear Information System (INIS)

    Busby, Daniel

    2009-01-01

    Large computer simulators usually have complex, nonlinear input-output functions. This complicated input-output relation can be analyzed by global sensitivity analysis; however, this usually requires massive Monte Carlo simulations. To effectively reduce the number of simulations, statistical techniques such as Gaussian process emulators can be adopted. The accuracy and reliability of these emulators strongly depend on the experimental design, in which suitable evaluation points are selected. In this paper a new sequential design strategy called hierarchical adaptive design is proposed to obtain an accurate emulator using the smallest possible number of simulations. The hierarchical design proposed in this paper is tested on various standard analytic functions and on a challenging reservoir forecasting application. Comparisons with standard one-stage designs such as maximin Latin hypercube designs show that the hierarchical adaptive design produces a more accurate emulator with the same number of computer experiments. Moreover, a stopping criterion is proposed that makes it possible to perform just the number of simulations necessary to obtain the required approximation accuracy.
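
    A toy version of such a sequential strategy (not the paper's hierarchical algorithm) fits a Gaussian process emulator and adds the candidate point with the largest predictive standard deviation; the 1-D test function is an invented stand-in for a simulator.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def simulator(x):                       # placeholder for an expensive code
            return np.sin(3 * x) + 0.5 * x

        X = np.array([[0.1], [0.5], [0.9]])     # initial design points in [0, 1]
        y = simulator(X).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8)
        for _ in range(5):                      # sequential refinement
            gp.fit(X, y)
            cand = np.linspace(0, 1, 201).reshape(-1, 1)
            _, sd = gp.predict(cand, return_std=True)
            x_new = cand[np.argmax(sd)]         # most uncertain candidate
            X = np.vstack([X, x_new])
            y = np.append(y, simulator(x_new)[0])
        print("design points:", np.round(X.ravel(), 3))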

  6. Intermediate/Advanced Research Design and Statistics

    Science.gov (United States)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs.

  7. Fast Bayesian optimal experimental design and its applications

    KAUST Repository

    Long, Quan

    2015-01-07

    We summarize our Laplace method and multilevel method for accelerating the computation of the expected information gain in Bayesian Optimal Experimental Design (OED). The Laplace method is widely used to approximate integrals in statistics. We analyze this method in the context of optimal Bayesian experimental design and extend it from the classical scenario, where a single dominant mode of the parameters can be completely determined by the experiment, to scenarios where a non-informative parametric manifold exists. We show that by carrying out this approximation the estimation of the expected Kullback-Leibler divergence can be significantly accelerated. While the Laplace method requires a concentration of measure, the multilevel Monte Carlo method can be used to tackle the problem when there is a lack of measure concentration. We show some initial results on this approach. The developed methodologies have been applied to various sensor deployment problems, e.g., impedance tomography and seismic source inversion.
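
    The quantity being accelerated is the expected information gain; a brute-force double-loop Monte Carlo estimator for a toy one-parameter model (everything below is invented for illustration) looks like this. The Gaussian normalizing constants cancel between the log-likelihood and log-evidence terms, so they are omitted.

        import numpy as np

        rng = np.random.default_rng(3)
        sigma = 0.1                                  # observation noise std

        def g(theta, d):                             # toy forward model
            return np.exp(-d * theta)

        def eig(d, n_outer=500, n_inner=500):
            thetas = rng.uniform(0.0, 1.0, n_outer)  # prior draws
            y = g(thetas, d) + sigma * rng.normal(size=n_outer)
            like = np.exp(-0.5 * ((y - g(thetas, d)) / sigma) ** 2)
            inner = rng.uniform(0.0, 1.0, n_inner)
            # Evidence p(y) approximated by averaging the likelihood over fresh draws.
            ev = np.exp(-0.5 * ((y[:, None] - g(inner[None, :], d)) / sigma) ** 2).mean(axis=1)
            return np.mean(np.log(like) - np.log(ev))

        for d in [0.5, 1.0, 2.0, 4.0]:
            print(f"design d={d:.1f}  EIG ~ {eig(d):.3f}")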

  8. Optimization of Xylanase production from Penicillium sp.WX-Z1 by a two-step statistical strategy: Plackett-Burman and Box-Behnken experimental design.

    Science.gov (United States)

    Cui, Fengjie; Zhao, Liming

    2012-01-01

    The objective of the study was to optimize the nutrient sources in a culture medium for the production of xylanase from Penicillium sp. WX-Z1 using a Plackett-Burman design and a Box-Behnken design. The Plackett-Burman multifactorial design was first employed to screen the important nutrient sources in the medium for xylanase production by Penicillium sp. WX-Z1, and response surface methodology (RSM) with a Box-Behnken design was subsequently used to further optimize xylanase production. The important nutrient sources in the culture medium, identified by the initial Plackett-Burman screening, were wheat bran, yeast extract, NaNO3, MgSO4, and CaCl2. The optimal amounts (in g/L) for maximum production of xylanase were: wheat bran, 32.8; yeast extract, 1.02; NaNO3, 12.71; MgSO4, 0.96; and CaCl2, 1.04. Using this statistical experimental design, xylanase production under the optimal conditions reached 46.50 U/mL, a 1.34-fold increase in xylanase activity compared with the original medium, for fermentation carried out in a 30-L bioreactor.
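
    The screening step can be made concrete with the classical 12-run Plackett-Burman design, built from its cyclic generator row; the response and active factors below are synthetic, not the fermentation data.

        import numpy as np

        gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
        rows = [np.roll(gen, k) for k in range(11)]       # 11 cyclic shifts
        X = np.vstack(rows + [-np.ones(11, dtype=int)])   # plus the all-minus run

        rng = np.random.default_rng(4)
        # Synthetic yield response: factors 0 and 3 active, the rest inert.
        y = 30 + 4*X[:, 0] - 3*X[:, 3] + rng.normal(0, 0.5, 12)

        # Main effect = (mean at +1) - (mean at -1); 6 runs at each level.
        effects = X.T @ y / 6
        for j, e in enumerate(effects):
            print(f"factor {j:2d}: effect = {e:+.2f}")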

  9. Experimental Design Research

    DEFF Research Database (Denmark)

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead to and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current

  10. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    Science.gov (United States)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics database is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  11. Experimental, statistical, and biological models of radon carcinogenesis

    International Nuclear Information System (INIS)

    Cross, F.T.

    1991-09-01

    Risk models developed for underground miners have not been consistently validated in studies of populations exposed to indoor radon. Imprecision in risk estimates results principally from differences between exposures in mines as compared to domestic environments and from uncertainties about the interaction between cigarette-smoking and exposure to radon decay products. Uncertainties in extrapolating miner data to domestic exposures can be reduced by means of a broad-based health effects research program that addresses the interrelated issues of exposure, respiratory tract dose, carcinogenesis (molecular/cellular and animal studies, plus developing biological and statistical models), and the relationship of radon to smoking and other copollutant exposures. This article reviews experimental animal data on radon carcinogenesis observed primarily in rats at Pacific Northwest Laboratory. Recent experimental and mechanistic carcinogenesis models of exposures to radon, uranium ore dust, and cigarette smoke are presented with statistical analyses of animal data. 20 refs., 1 fig

  12. Optimizing an experimental design for an electromagnetic experiment

    Science.gov (United States)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect that is gaining importance is the acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and to monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of a given design via the objective function) and stochastic optimization methods like the genetic algorithm (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique for designing an experiment is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (the northern Spain CO2 sequestration test site). We show that a small number of well-distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of the CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.

  13. LOGICAL AND EXPERIMENTAL DESIGN FOR PHENOL DEGRADATION USING IMMOBILIZED ACINETOBACTER SP. CULTURE

    Directory of Open Access Journals (Sweden)

    Amro Abd Al Fattah Amara

    2010-05-01

    Full Text Available Phenol degradation proceeds through a series of enzymatic reactions and is affected by different components of the microbial metabolic flux. Optimization strategies such as mutagenesis can lead to successful optimization, but may also lead to the loss of important microbial features or to the release of new virulence or other unexpected characters. Plackett-Burman designs close much of the gap between optimization, safety, time, cost, man-hours, the complexity of the metabolic flux, etc. Using a Plackett-Burman experimental design allows the influential factors in the optimization process to be mapped through an understanding of the nutrient requirements and the best environmental conditions. In this study, nine variables, including pH (X1, temperature (X2, glucose (X3, yeast extract (X4, meat extract (X5, NH4NO3 (X6, K-salt (X7, Mg-salt (X8 and trace elements (X9, were optimized during phenol degradation by Acinetobacter sp. using the Plackett-Burman design method. The Plackett-Burman design included 16 experiments, in each of which the variables were used at two levels, low [-1] and high [+1]. According to the Plackett-Burman design experiments, the maximum degradation rate was 31.25 mg/l/h. Logical and statistical analysis of the data led to the selection of pH, temperature, and meat extract as the three factors affecting the phenol degradation rate. These three variables were then used in a Box-Behnken experimental design for further optimization. Meat extract, which was not statistically recommended for optimization, was used because it can substitute for trace elements, which were statistically significant. Glucose, which was statistically significant, was not included because it had a negative effect and gave the best result at an amount of 0 g/l; glucose was therefore completely omitted from the medium. pH, temperature, and meat extract were used in fifteen experiments, each at three levels, -1, 0, and +1, according to the Box-Behnken design. The Microsoft Excel 2002 solver tool was used to optimize the model created from the Box-Behnken design. The

  14. Applying Statistical Design to Control the Risk of Over-Design with Stochastic Simulation

    Directory of Open Access Journals (Sweden)

    Yi Wu

    2010-02-01

    Full Text Available By comparing a hard real-time system and a soft real-time system, this article highlights the risk of over-design in soft real-time system design. To deal with this risk, a novel concept of statistical design is proposed. Statistical design is the process of accurately accounting for and mitigating the effects of variation in part geometry and other environmental conditions, while at the same time optimizing a target performance factor. However, statistical design can be a very difficult and complex task when using classical mathematical methods. Thus, a simulation methodology to optimize the design is proposed in order to bridge the gap between real-time analysis and optimization for robust and reliable system design.

  15. Quasi-Experimental Designs for Causal Inference

    Science.gov (United States)

    Kim, Yongnam; Steiner, Peter

    2016-01-01

    When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…

  16. A statistical characterization method for damping material properties and its application to structural-acoustic system design

    International Nuclear Information System (INIS)

    Jung, Byung C.; Lee, Doo Ho; Youn, Byeng D.; Lee, Soo Bum

    2011-01-01

    The performance of surface damping treatments may vary once the surface is exposed to a wide range of temperatures, because the performance of viscoelastic damping material is highly dependent on operational temperature. In addition, experimental data for the dynamic responses of viscoelastic material are inherently random, which makes it difficult to design a robust damping layout. In this paper a statistical modeling procedure with a statistical calibration method is suggested for the variability characterization of viscoelastic damping material in constrained-layer damping structures. First, the viscoelastic material property is decomposed into two sources: (I) a random complex modulus due to operational temperature variability, and (II) experimental/model errors in the complex modulus. Next, the variability in the damping material property is obtained using the statistical calibration method by solving an unconstrained optimization problem with a likelihood function metric. Two case studies are considered to show the influence of the material variability on the acoustic performance of structural-acoustic systems. It is shown that the variability of the damping material propagates to that of the acoustic performance of the systems. Finally, robust and reliable damping layout designs for the two case studies are obtained through reliability-based design optimization (RBDO) amidst severe variability in operational temperature and in the damping material.

  17. Ice condenser experimental plan

    International Nuclear Information System (INIS)

    Kannberg, L.D.; Piepel, G.F.; Owczarski, P.C.; Liebetrau, A.M.

    1986-01-01

    An experimental plan is being developed to validate the computer code ICEDF. The code was developed to estimate the extent of aerosol retention in the ice compartments of pressurized water reactor ice condenser containment systems during severe accidents. The development of the experimental plan began with a review of available information on the conditions under which the code will be applied. Computer-generated estimates of the thermohydraulic and aerosol conditions entering the ice condenser were evaluated and, along with other information, used to generate design criteria. The design criteria have been used for preliminary test assembly design and for the generation of statistical test designs. Consideration of the phenomena to be evaluated in the testing program, as well as equipment and measurement limitations, has led to changes in the design criteria and to subsequent changes in the test assembly design and statistical test design. The overall strategy in developing the experimental plan includes iterative generation and evaluation of candidate test designs using computer codes for statistical test design and ICEDF for estimation of experimental results. Estimates of experimental variability made prior to actual testing will be verified by replicate testing at preselected design points

  18. A new method to determine the number of experimental data using statistical modeling methods

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jung-Ho; Kang, Young-Jin; Lim, O-Kaung; Noh, Yoojeong [Pusan National University, Busan (Korea, Republic of)

    2017-06-15

    For analyzing the statistical performance of physical systems, statistical characteristics of physical parameters such as material properties need to be estimated by collecting experimental data. For accurate statistical modeling, many such experiments may be required, but data are usually quite limited owing to the cost and time constraints of experiments. In this study, a new method for determining a reasonable number of experimental data is proposed using an area metric, after obtaining statistical models using information on the underlying distribution, the sequential statistical modeling (SSM) approach, and the kernel density estimation (KDE) approach. The area metric is used as a convergence criterion to determine the necessary and sufficient number of experimental data to be acquired. The proposed method is validated in simulations using different statistical modeling methods, different true models, and different convergence criteria. An example data set with 29 data points describing the fatigue strength coefficient of SAE 950X is used to demonstrate the performance of the obtained statistical models, which use a pre-determined number of experimental data, in predicting the probability of failure for a target fatigue life.
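
    A minimal sketch of the area-metric idea (with a synthetic population, not the SAE 950X data): track the area between the empirical CDF and a fitted model CDF as the sample grows, and stop acquiring data once it converges.

        import numpy as np
        from scipy import stats
        from scipy.integrate import trapezoid

        rng = np.random.default_rng(5)
        population = rng.normal(1000.0, 80.0, size=1000)   # synthetic property values

        def area_metric(sample):
            xs = np.sort(sample)
            ecdf = np.arange(1, len(xs) + 1) / len(xs)
            mu, sd = xs.mean(), xs.std(ddof=1)
            model = stats.norm.cdf(xs, mu, sd)              # fitted statistical model
            return trapezoid(np.abs(ecdf - model), xs)      # area between the CDFs

        for n in [5, 10, 20, 29, 50]:
            print(f"n={n:3d}  area metric = {area_metric(population[:n]):.2f}")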

  19. Return to Our Roots: Raising Radishes to Teach Experimental Design. Methods and Techniques.

    Science.gov (United States)

    Stallings, William M.

    1993-01-01

    Reviews research in teaching applied statistics. Concludes that students should analyze data from studies they have designed and conducted. Describes an activity in which students study germination and growth of radish seeds. Includes a table providing student instructions for both the experimental procedure and data analysis. (CFR)

  20. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  1. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    Science.gov (United States)

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  2. Experimental investigation of statistical density function of decaying radioactive sources

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1991-01-01

    The validity of the Poisson and the λ P(k) modified Poisson statistical density functions of observing k events in a short time interval is investigated experimentally in radioactive decay detection for various measuring times. The experiments to measure radioactive decay were performed with 89mY, using a multichannel analyzer. According to the results, Poisson statistics adequately describes the counting experiment for short measuring times. (author) 13 refs.; 4 figs.
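
    The basic check can be illustrated with simulated counts (not the 89mY data): fit the Poisson mean and test the observed count distribution against the fitted pmf with a chi-square statistic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        counts = rng.poisson(4.2, size=2000)          # events per short time interval

        lam = counts.mean()                           # ML estimate of the Poisson mean
        kmax = counts.max()
        observed = np.bincount(counts, minlength=kmax + 1)
        expected = len(counts) * stats.poisson.pmf(np.arange(kmax + 1), lam)
        tail = len(counts) * stats.poisson.sf(kmax, lam)   # mass beyond the observed max

        # Pool sparse cells so expected counts stay reasonable (> ~5) and sums match.
        keep = expected > 5
        obs = np.append(observed[keep], observed[~keep].sum())
        exp_ = np.append(expected[keep], expected[~keep].sum() + tail)

        chi2, p = stats.chisquare(obs, exp_, ddof=1)  # ddof=1 for the fitted mean
        print(f"chi2 = {chi2:.1f}, p = {p:.3f}")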

  3. Experimental Design for Hanford Low-Activity Waste Glasses with High Waste Loading

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Vienna, John D.; Crum, Jarrod V.

    2015-01-01

    This report discusses the development of an experimental design for the initial phase of the Hanford low-activity waste (LAW) enhanced glass study. This report is based on a manuscript written for an applied statistics journal. Appendices A, B, and E include additional information relevant to the LAW enhanced glass experimental design that is not included in the journal manuscript. The glass composition experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC involving 15 LAW glass components. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this report. One of the glass components, SO3, has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture model expressed in the relative proportions of the 14 other components. The partial quadratic mixture model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This report describes how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study. A layered design consists of points on an outer layer, an inner layer, and a center point. There were 18 outer-layer glasses chosen using optimal experimental design software to augment 147 existing glass compositions that were within the LAW glass composition experimental region. Then 13 inner-layer glasses were chosen with the software to augment the existing and outer-layer glasses. The experimental

  4. Quasi experimental designs in pharmacist intervention research.

    Science.gov (United States)

    Krass, Ines

    2016-06-01

    Background In the field of pharmacist intervention research it is often difficult to conform to the rigorous requirements of "true experimental" models, especially the requirement of randomization. When randomization is not feasible, a practice-based researcher can choose from a range of "quasi-experimental designs", i.e., non-randomised and at times non-controlled designs. Objective The aim of this article was to provide an overview of quasi-experimental designs, discuss their strengths and weaknesses, and investigate their application in pharmacist intervention research over the previous decade. Results In the literature, quasi-experimental studies may be classified into five broad categories: quasi-experimental designs without control groups; quasi-experimental designs that use control groups with no pre-test; quasi-experimental designs that use control groups and pre-tests; interrupted time series; and stepped wedge designs. Quasi-experimental study design has consistently featured in the evolution of pharmacist intervention research. The most commonly applied of all quasi-experimental designs in the practice-based research literature are the one-group pre-post-test design and the non-equivalent control group design, i.e., an untreated control group with dependent pre-tests and post-tests, and these have been used to test the impact of pharmacist interventions in general medications management as well as in specific disease states. Conclusion Quasi-experimental studies have a role to play as proof of concept, in the pilot phases of interventions, when testing different intervention components, especially in complex interventions. They serve to develop an understanding of possible intervention effects: while in isolation they yield weak evidence of clinical efficacy, taken collectively, they help build a body of evidence in support of the value of pharmacist interventions across different practice settings and countries. However, when a traditional RCT is not feasible for

  5. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  6. Experimental design a chemometric approach

    CERN Document Server

    Deming, SN

    1987-01-01

    Now available in a paperback edition is a book which has been described as ``...an exceptionally lucid, easy-to-read presentation... would be an excellent addition to the collection of every analytical chemist. I recommend it with great enthusiasm.'' (Analytical Chemistry). Unlike most current textbooks, it approaches experimental design from the point of view of the experimenter, rather than that of the statistician. As the reviewer in `Analytical Chemistry' went on to say: ``Deming and Morgan should be given high praise for bringing the principles of experimental design to the level of the p

  7. Chemical-Based Formulation Design: Virtual Experimentation

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul

    This paper presents a software tool, the virtual Product-Process Design laboratory (virtual PPD-lab), and the virtual experimental scenarios for the design/verification of consumer-oriented liquid formulated products where the software can be used. For example, the software can be employed for the design … the additives and/or their mixtures (formulations). Therefore, the experimental resources can focus on a few candidate product formulations to find the best product. The virtual PPD-lab allows various options for experiments related to the design and/or verification of the product. For example, the selection … design, model adaptation). All of the above helps to perform virtual experiments by blending chemicals together and observing their predicted behaviour. The paper will highlight the application of the virtual PPD-lab in the design and/or verification of different consumer products (paint formulation

  8. HAMMLAB 1999 experimental control room: design - design rationale - experiences

    International Nuclear Information System (INIS)

    Foerdestroemmen, N. T.; Meyer, B. D.; Saarni, R.

    1999-01-01

    A presentation of the HAMMLAB 1999 experimental control room and the accumulated experience gathered in the areas of design, design rationale, and user experiences. It is concluded that the HAMMLAB 1999 experimental control room is a realistic, compact and efficient control room, well suited as an advanced NPP control room. (ml)

  9. Entropy-Based Experimental Design for Optimal Model Discrimination in the Geosciences

    Directory of Open Access Journals (Sweden)

    Wolfgang Nowak

    2016-11-01

    Full Text Available Choosing between competing models lies at the heart of scientific work, and is a frequent motivation for experimentation. Optimal experimental design (OD) methods maximize the benefit of experiments towards a specified goal. We advance and demonstrate an OD approach to maximize the information gained towards model selection. We make use of so-called model choice indicators, which are random variables with an expected value equal to Bayesian model weights. Their uncertainty can be measured with Shannon entropy. Since the experimental data are still random variables in the planning phase of an experiment, we use mutual information (the expected reduction in Shannon entropy) to quantify the information gained from a proposed experimental design. For implementation, we use the Preposterior Data Impact Assessor framework (PreDIA), because it is free of the lower-order approximations of mutual information often found in the geosciences. In comparison to other studies in statistics, our framework is not restricted to sequential design or to discrete-valued data, and it can handle measurement errors. As an application example, we optimize an experiment about the transport of contaminants in clay, featuring the problem of choosing between competing isotherms to describe sorption. We compare the results of optimizing towards maximum model discrimination with an alternative OD approach that minimizes the overall predictive uncertainty under model choice uncertainty.

  10. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation) and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
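
    A minimal sketch of the control-chart construction described here, using an individuals chart with 3-sigma limits estimated from the moving range; the weekly quality scores are invented.

        import numpy as np

        scores = np.array([82, 84, 81, 83, 85, 82, 84, 83, 90, 91, 92, 93], float)

        mr = np.abs(np.diff(scores))          # moving ranges of consecutive points
        sigma_hat = mr.mean() / 1.128         # d2 constant for subgroups of size 2
        center = scores.mean()
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        for i, s in enumerate(scores):
            flag = "special cause" if (s > ucl or s < lcl) else ""
            print(f"week {i+1:2d}: {s:5.1f}  {flag}")
        print(f"CL = {center:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}")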

  11. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. Firstly, the Latin Hypercube Sampling method, instead of the conventional experimental design technique, is utilized as an input sampling method for a regression analysis, to evaluate its sampling efficiency. Secondly, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method', which uses the mean values of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence of the input variables is verified through a 'Correlation Coefficient Test' for the statistical treatment of their uncertainties. The distribution type of the DNBR response is then determined through a 'Goodness of Fit Test'. Finally, the limit DNBR with one-sided 95% probability and 95% confidence level, DNBR 95/95, is estimated. The advantage of this methodology over the conventional statistical method using response surface and Monte Carlo simulation techniques lies in the simplicity of the analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the application to the determination of the limit DNBR, where the DNBR margin is determined by the difference between the nominal DNBR and the limit DNBR. The second case is the application to the determination of the nominal DNBR, where the DNBR margin is determined by the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives a good agreement in the DNBR results

  12. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

    Full Text Available Statistical models for integrated circuits (ICs) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's is the most widely accepted statistical model in industry; however, it was derived from micrometer-scale technology, which does not guarantee reliability for nanometric manufacturing processes. This work considers three of the most relevant statistical models in industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the stages and design purposes.

  13. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    Science.gov (United States)

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.

  14. Experimental Design for Hanford Low-Activity Waste Glasses with High Waste Loading

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crum, Jarrod V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-24

    This report discusses the development of an experimental design for the initial phase of the Hanford low-activity waste (LAW) enhanced glass study. This report is based on a manuscript written for an applied statistics journal. Appendices A, B, and E include additional information relevant to the LAW enhanced glass experimental design that is not included in the journal manuscript. The glass composition experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC involving 15 LAW glass components. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this report. One of the glass components, SO3, has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture model expressed in the relative proportions of the 14 other components. The partial quadratic mixture model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This report describes how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study. A layered design consists of points on an outer layer, an inner layer, and a center point. There were 18 outer-layer glasses chosen using optimal experimental design software to augment 147 existing glass compositions that were within the LAW glass composition experimental region. Then 13 inner-layer glasses were chosen with the software to augment the existing and outer

  15. Insights in Experimental Data : Interactive Statistics with the ILLMO Program

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2017-01-01

    Empirical researchers turn to statistics to assist them in drawing conclusions, also called inferences, from their collected data. Often, this data is experimental data, i.e., it consists of (repeated) measurements collected in one or more distinct conditions. The observed data can hence be

  16. The role of experimental typography in designing logotypes

    OpenAIRE

    Pogačnik, Tadeja

    2014-01-01

    Designing logotypes is an important part of graphic design. Great logotypes are designed using custom-made typefaces. Therefore, it is very important, especially for the typographic designer, to have practical experience and be up to date with all trends in the field of experimental typeface design, also called experimental typography. In my thesis, I carefully examine the problems of experimental typography - which allows more creative and free typographic design for different ...

  17. Surface laser marking optimization using an experimental design approach

    Science.gov (United States)

    Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.

    2017-04-01

    Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τ_pulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiments (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and then the laser marking process is performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
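
    The RSM step can be pictured with a small least-squares fit of a second-order model in two coded factors (scanning speed x1 and pumping intensity x2); the nine runs and responses below are invented for illustration and are not the paper's data.

        import numpy as np

        x1 = np.array([-1, 1, -1, 1, 0, 0, -1, 1, 0])     # coded scanning speed
        x2 = np.array([-1, -1, 1, 1, -1, 1, 0, 0, 0])     # coded pumping intensity
        y = np.array([0.82, 0.55, 0.90, 0.61, 0.70,
                      0.78, 0.85, 0.57, 0.69])            # invented roughness values

        # Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        pred = X @ coef
        r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        print("coefficients:", np.round(coef, 3), "R^2 =", round(r2, 3))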

  18. Design and Statistics in Quantitative Translation (Process) Research

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

    2015-01-01

    Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars since this way of thinking is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product.

  19. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and fields related to Reliability. Experts with international reputations present their newest contributions.

  20. Chemicals-Based Formulation Design: Virtual Experimentations

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul

    2011-01-01

    This paper presents a systematic procedure for virtual experimentations related to the design of liquid formulated products. All the experiments that need to be performed when designing a liquid formulated product (lotion), such as ingredients selection and testing, solubility tests and property measurements, can be carried out virtually. A case study on the design of an insect repellent lotion will show that the software is an essential instrument in decision making, and that it reduces time and resources, since experimental efforts can be focused on one or a few product alternatives.

  1. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of the study designs used, statistical tests applied and statistical analysis programmes employed helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. The results indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software, respectively. These results echo the previously published assessment of these two journals for the year 2014.

  2. Considering RNAi experimental design in parasitic helminths.

    Science.gov (United States)

    Dalzell, Johnathan J; Warnock, Neil D; McVeigh, Paul; Marks, Nikki J; Mousley, Angela; Atkinson, Louise; Maule, Aaron G

    2012-04-01

    Almost a decade has passed since the first report of RNA interference (RNAi) in a parasitic helminth. Whilst much progress has been made with RNAi informing gene function studies in disparate nematode and flatworm parasites, substantial and seemingly prohibitive difficulties have been encountered in some species, hindering progress. An appraisal of current practices, trends and ideals of RNAi experimental design in parasitic helminths is both timely and necessary for a number of reasons: firstly, the increasing availability of parasitic helminth genome/transcriptome resources means there is a growing need for gene function tools such as RNAi; secondly, fundamental differences and unique challenges exist for parasite species which do not apply to model organisms; thirdly, the inherent variation in experimental design, and reported difficulties with reproducibility undermine confidence. Ideally, RNAi studies of gene function should adopt standardised experimental design to aid reproducibility, interpretation and comparative analyses. Although the huge variations in parasite biology and experimental endpoints make RNAi experimental design standardization difficult or impractical, we must strive to validate RNAi experimentation in helminth parasites. To aid this process we identify multiple approaches to RNAi experimental validation and highlight those which we deem to be critical for gene function studies in helminth parasites.

  3. Experimental Design: Review and Comment.

    Science.gov (United States)

    1984-02-01

    Innovative modifications and extensions of classical experimental designs were developed, and many useful articles were published in a short period.

  4. Experimental design and estimation of growth rate distributions in size-structured shrimp populations

    International Nuclear Information System (INIS)

    Banks, H T; Davis, Jimena L; Ernstberger, Stacey L; Hu, Shuhua; Artimovich, Elena; Dhar, Arun K

    2009-01-01

    We discuss inverse problem results for problems involving the estimation of probability distributions using aggregate data for growth in populations. We begin with a mathematical model describing variability in the early growth process of size-structured shrimp populations and discuss a computational methodology for the design of experiments to validate the model and estimate the growth-rate distributions in shrimp populations. Parameter-estimation findings using experimental data from experiments so designed for shrimp populations cultivated at Advanced BioNutrition Corporation are presented, illustrating the usefulness of mathematical and statistical modeling in understanding the uncertainty in the growth dynamics of such populations

  5. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2001-01-01

    The book is a collection of papers presented at the 5th International Workshop on Intelligent Statistical Quality Control in Würzburg, Germany. Contributions deal with methodology and successful industrial applications. They can be grouped in four categories: Sampling Inspection, Statistical Process Control, Data Analysis and Process Capability Studies, and Experimental Design.

  6. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    Science.gov (United States)

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    In the last decade or so, interest in adaptive design clinical trials has gradually been directed towards their use in regulatory submissions by pharmaceutical drug sponsors to evaluate investigational new drugs. Methodological advances in adaptive designs have been abundant in the statistical literature since the 1970s. The adaptive design paradigm has been enthusiastically perceived to increase efficiency and to be more cost-effective than the fixed design paradigm for drug development. Much interest in adaptive designs is in studies with two stages, where stage 1 is exploratory and stage 2 depends upon stage 1 results, but where the data of both stages will be combined to yield statistical evidence for use as that of a pivotal registration trial. It was not until the recent release of the US Food and Drug Administration Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics (2010) that the boundaries of flexibility for adaptive designs were specifically considered for regulatory purposes, including what are exploratory goals and what are the goals of adequate and well-controlled (A&WC) trials (2002). The guidance carefully described these distinctions in an attempt to minimize the confusion between the goals of preliminary learning phases of drug development, which are inherently substantially uncertain, and the definitive inference-based phases of drug development. In this paper, in addition to discussing some aspects of adaptive designs in a confirmatory study setting, we underscore the value of adaptive designs when used in exploratory trials to improve the planning of subsequent A&WC trials. One type of adaptation that is receiving attention is the re-estimation of the sample size during the course of the trial. We refer to this type of adaptation as an adaptive statistical information design. Specifically, a case example is used to illustrate how challenging it is to plan a confirmatory adaptive statistical information

  7. Application of Statistical Design for the Production of Cellulase by Trichoderma reesei Using Mango Peel

    Directory of Open Access Journals (Sweden)

    P. Saravanan

    2012-01-01

    Optimization of the culture medium for cellulase production using Trichoderma reesei was carried out. The optimization of cellulase production using mango peel as substrate was performed with statistical methodology based on experimental designs. The screening of nine nutrients for their influence on cellulase production was achieved using a Plackett-Burman design. Avicel, soybean cake flour, KH2PO4, and CoCl2·6H2O were selected based on their positive influence on cellulase production. The composition of the selected components was optimized using Response Surface Methodology (RSM). The optimum conditions are as follows: Avicel, 25.30 g/L; soybean cake flour, 23.53 g/L; KH2PO4, 4.90 g/L; and CoCl2·6H2O, 0.95 g/L. These conditions were validated experimentally and revealed an enhanced cellulase activity of 7.8 IU/mL.
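
    The screening idea behind a Plackett-Burman design can be sketched with a two-level design built from a Hadamard matrix. The study screened nine nutrients, which requires a 12-run Plackett-Burman layout; the 8-run example below (up to 7 factors) with invented responses simply illustrates how main effects are estimated from such a design.

        import numpy as np
        from scipy.linalg import hadamard

        H = hadamard(8)                  # 8x8 matrix with +1/-1 entries
        design = H[:, 1:]                # drop the all-ones column: 8 runs x 7 factors
        y = np.array([5.1, 6.8, 4.9, 7.4, 5.5, 7.0, 5.2, 7.8])  # invented activities

        # Main effect of each factor: mean response at +1 minus mean at -1
        effects = (design.T @ y) / (len(y) / 2)
        for i, e in enumerate(effects, start=1):
            print(f"factor {i}: effect = {e:+.2f}")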

  8. Fuel rod design by statistical methods for MOX fuel

    International Nuclear Information System (INIS)

    Heins, L.; Landskron, H.

    2000-01-01

    Statistical methods in fuel rod design have received more and more attention during the last years. One of several possible ways to use statistical methods in fuel rod design can be described as follows: Monte Carlo calculations are performed using the fuel rod code CARO. For each run with CARO, the set of input data is modified: parameters describing the design of the fuel rod (geometrical data, density etc.) and modeling parameters are randomly selected according to their individual distributions. Power histories are varied systematically in such a way that each power history of the relevant core management calculation is represented in the Monte Carlo calculations with equal frequency. The frequency distributions of results such as rod internal pressure and cladding strain, which are generated by the Monte Carlo calculation, are evaluated and compared with the design criteria. Up to now, this methodology has been applied to licensing calculations for PWRs and BWRs, UO2 and MOX fuel, in 3 countries. Especially for the insertion of MOX fuel, resulting in power histories with relatively high linear heat generation rates at higher burnup, the statistical methodology is an appropriate approach to demonstrate compliance with licensing requirements. (author)
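
    The Monte Carlo scheme described above can be sketched as follows; since the fuel rod code CARO is proprietary, a toy surrogate function stands in for it, and all distributions and the design limit are invented.

        import numpy as np

        rng = np.random.default_rng(42)
        n_runs = 20000

        # Input parameters sampled from their (invented) distributions
        gap = rng.normal(170e-6, 10e-6, n_runs)      # pellet-cladding gap [m]
        density = rng.normal(10.4, 0.05, n_runs)     # pellet density [g/cm3]
        bias = rng.normal(1.0, 0.03, n_runs)         # modelling parameter

        def rod_internal_pressure(gap, density, bias):
            """Toy surrogate standing in for a fuel rod code such as CARO."""
            return bias * (2.0 + 4.0e3 * gap + 0.15 * (density - 10.0))  # MPa

        p = rod_internal_pressure(gap, density, bias)
        limit = 3.2                                   # invented design limit [MPa]
        print(f"P(limit exceeded) ~ {np.mean(p > limit):.4f}")
        print(f"95th percentile of rod internal pressure: {np.percentile(p, 95):.2f} MPa")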

  9. Optimal Bayesian Experimental Design for Combustion Kinetics

    KAUST Repository

    Huan, Xun

    2011-01-04

    Experimental diagnostics play an essential role in the development and refinement of chemical kinetic models, whether for the combustion of common complex hydrocarbons or of emerging alternative fuels. Questions of experimental design—e.g., which variables or species to interrogate, at what resolution and under what conditions—are extremely important in this context, particularly when experimental resources are limited. This paper attempts to answer such questions in a rigorous and systematic way. We propose a Bayesian framework for optimal experimental design with nonlinear simulation-based models. While the framework is broadly applicable, we use it to infer rate parameters in a combustion system with detailed kinetics. The framework introduces a utility function that reflects the expected information gain from a particular experiment. Straightforward evaluation (and maximization) of this utility function requires Monte Carlo sampling, which is infeasible with computationally intensive models. Instead, we construct a polynomial surrogate for the dependence of experimental observables on model parameters and design conditions, with the help of dimension-adaptive sparse quadrature. Results demonstrate the efficiency and accuracy of the surrogate, as well as the considerable effectiveness of the experimental design framework in choosing informative experimental conditions.
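
    The expected-information-gain utility can be illustrated with the standard nested Monte Carlo estimator on a toy one-parameter forward model; the model, prior, noise level and candidate designs below are all invented, and no surrogate acceleration is attempted.

        import numpy as np

        rng = np.random.default_rng(1)

        def g(theta, d):
            """Invented forward model standing in for the kinetics simulator."""
            return theta**3 * d + theta * np.exp(-abs(0.2 - d))

        def expected_information_gain(d, n_outer=500, n_inner=500, sigma=0.01):
            thetas = rng.uniform(0, 1, n_outer)          # samples from the prior
            total = 0.0
            for theta in thetas:
                y = g(theta, d) + rng.normal(0, sigma)   # simulated observation
                log_like = -0.5 * ((y - g(theta, d)) / sigma) ** 2
                inner = rng.uniform(0, 1, n_inner)       # inner prior samples
                like_inner = np.exp(-0.5 * ((y - g(inner, d)) / sigma) ** 2)
                log_evidence = np.log(like_inner.mean())
                total += log_like - log_evidence         # shared constants cancel
            return total / n_outer

        for d in (0.1, 0.5, 1.0):
            print(f"design d = {d}: EIG ~ {expected_information_gain(d):.3f}")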

  10. Scalable Algorithms for Adaptive Statistical Designs

    Directory of Open Access Journals (Sweden)

    Robert Oehmke

    2000-01-01

    We present a scalable, high-performance solution to multidimensional recurrences that arise in adaptive statistical designs. Adaptive designs are an important class of learning algorithms for a stochastic environment, and we focus on the problem of optimally assigning patients to treatments in clinical trials. While adaptive designs have significant ethical and cost advantages, they are rarely utilized because of the complexity of optimizing and analyzing them. Computational challenges include massive memory requirements, few calculations per memory access, and multiply-nested loops with dynamic indices. We analyze the effects of various parallelization options, and while standard approaches do not work well, with effort an efficient, highly scalable program can be developed. This allows us to solve problems thousands of times more complex than those solved previously, which helps make adaptive designs practical. Further, our work applies to many other problems involving neighbor recurrences, such as generalized string matching.

  11. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.

  12. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the

  13. Organic biowastes blend selection for composting industrial eggshell by-product: experimental and statistical mixture design.

    Science.gov (United States)

    Soares, Micaela A R; Andrade, Sandra R; Martins, Rui C; Quina, Margarida J; Quinta-Ferreira, Rosa M

    2012-01-01

    Composting is one of the technologies recommended for pre-treating industrial eggshells (ES) before their application in soils, for calcium recycling. However, due to the high inorganic content of ES, a mixture of biodegradable materials is required to assure a successful procedure. In this study, an adequate organic blend composition containing potato peel (PP), grass clippings (GC) and wheat straw (WS) was determined by applying the simplex-centroid mixture design method to achieve the moisture content, carbon:nitrogen ratio and free air space desired for effective composting of ES. A blend of 56% PP, 37% GC and 7% WS was selected and tested in a self-heating reactor, where 10% (w/w) of ES was incorporated. After 29 days of reactor operation, a dry matter reduction of 46% was achieved and thermophilic temperatures were maintained for 15 days, indicating that the blend selected by the statistical approach was adequate for composting of ES.
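
    For three components, the simplex-centroid design used above is easy to enumerate: the three pure blends, the three 50:50 binary blends, and the ternary centroid. A minimal sketch:

        from itertools import combinations

        components = ["PP", "GC", "WS"]
        design = []
        for k in range(1, len(components) + 1):
            for subset in combinations(range(len(components)), k):
                point = [0.0] * len(components)
                for i in subset:
                    point[i] = 1.0 / k               # equal proportions in the subset
                design.append(point)

        for point in design:
            print({c: round(p, 3) for c, p in zip(components, point)})
        # 7 runs: 3 pure blends, 3 binary 50:50 blends, 1 ternary centroid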

  14. Simulation-based optimal Bayesian experimental design for nonlinear systems

    KAUST Repository

    Huan, Xun

    2013-01-01

    The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics. © 2012 Elsevier Inc.

  15. Removal of thorium(IV) from aqueous solution by biosorption onto modified powdered waste sludge. Experimental design approach

    International Nuclear Information System (INIS)

    Yunus Pamukoglu, M.; Mustafa Senyurt; Bulent Kirkan

    2017-01-01

    The biosorption of radioactive Th(IV) ions from aqueous solutions onto modified powdered waste sludge (MPWS) has been examined. In this context, the parameters affecting biosorption of Th(IV) from aqueous solutions have been examined using the MPWS biosorbent in a Box-Behnken statistical experimental design. The structure of the MPWS biosorbent was characterized using SEM and BET techniques. According to the experimental design results, the MPWS and Th(IV) concentrations should be kept high to achieve the maximum efficiency in Th(IV) biosorption. Moreover, MPWS is an economical, effective and natural biosorbent. (author)

  16. Designing Solutions by a Student Centred Approach: Integration of Chemical Process Simulation with Statistical Tools to Improve Distillation Systems

    Directory of Open Access Journals (Sweden)

    Isabel M. Joao

    2017-09-01

    Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of the projects to improve a student-centred approach and boost higher-order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master students to model distillation systems, together with statistical experimental design techniques, in order to optimize the systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students in order to model steady-state processes, dynamic processes and optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students' and teachers' perspectives.

  17. The Reliability of Single Subject Statistics for Biofeedback Studies.

    Science.gov (United States)

    Bremner, Frederick J.; And Others

    To test the usefulness of single subject statistical designs for biofeedback, three experiments were conducted comparing biofeedback to meditation, and to a compound stimulus recognition task. In a statistical sense, this experimental design is best described as one experiment with two replications. The apparatus for each of the three experiments…

  18. CFD simulation of CO_2 sorption on K_2CO_3 solid sorbent in novel high flux circulating-turbulent fluidized bed riser: Parametric statistical experimental design study

    International Nuclear Information System (INIS)

    Thummakul, Theeranan; Gidaspow, Dimitri; Piumsomboon, Pornpote; Chalermsinsuwan, Benjapon

    2017-01-01

    Highlights: • Circulating-turbulent fluidization was proved to be advantageous for CO_2 sorption. • The novel regime was proven to capture CO_2 more efficiently than the conventional regimes. • Uniform solid particle distribution was observed in the novel fluidization regime. • The system continuity had more effect on the system than the process system mixing. • A parametric experimental design analysis was performed to identify the significant factors. - Abstract: In this study a high flux circulating-turbulent fluidized bed (CTFB) riser was confirmed to be advantageous for carbon dioxide (CO_2) sorption on a potassium carbonate solid sorbent. The effect of various parameters on the CO_2 removal level was evaluated using a statistical experimental design. The most appropriate fluidization regime was found to occur between the turbulent and fast fluidization regimes, and was shown to capture CO_2 more efficiently than conventional fluidization regimes. The highest CO_2 sorption level was 93.4% under optimized CTFB operating conditions. The important parameters for CO_2 capture were the inlet gas velocity and the interactions between the CO_2 concentration and the inlet gas velocity and water vapor concentration. The CTFB regime had a high and uniform solid particle distribution in both the axial and radial system directions and could transport the solid sorbent to the regeneration reactor. In addition, the process system continuity had a stronger effect on the CO_2 removal level in the system than the process system mixing.

  19. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    Science.gov (United States)

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design of experiment methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.

  20. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    Science.gov (United States)

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
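
    The two ingredients named above, robust regression and rank correlation, can be combined in a conceptual sketch (not the author's exact algorithm or the web calculator): fit a Theil-Sen line to the baseline phase, remove that trend from all sessions, then correlate phase membership with the corrected scores. All data are invented.

        import numpy as np
        from scipy.stats import kendalltau, theilslopes

        baseline = np.array([3, 4, 4, 5, 6, 6])       # invented A-phase (improving)
        treatment = np.array([8, 9, 9, 10, 11, 12])   # invented B-phase
        y = np.concatenate([baseline, treatment]).astype(float)
        t = np.arange(len(y))
        phase = np.concatenate([np.zeros(len(baseline)), np.ones(len(treatment))])

        # Robust (Theil-Sen) trend fitted on the baseline only, removed everywhere
        slope, intercept, *_ = theilslopes(baseline, t[: len(baseline)])
        corrected = y - (intercept + slope * t)

        tau, p_value = kendalltau(phase, corrected)
        print(f"baseline-corrected tau = {tau:.2f} (p = {p_value:.3f})")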

  1. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a

  2. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Directory of Open Access Journals (Sweden)

    Patrick Wessa

    BACKGROUND: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. OBJECTIVES: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. METHODS: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. RESULTS: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student

  3. Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under

  4. Statistical inference for extended or shortened phase II studies based on Simon's two-stage designs.

    Science.gov (United States)

    Zhao, Junjun; Yu, Menggang; Feng, Xi-Ping

    2015-06-07

    Simon's two-stage designs are popular choices for conducting phase II clinical trials, especially in oncology trials, to reduce the number of patients placed on ineffective experimental therapies. Recently Koyama and Chen (2008) discussed how to conduct proper inference for such studies because they found that inference procedures used with Simon's designs almost always ignore the actual sampling plan used. In particular, they proposed an inference method for studies when the actual second stage sample sizes differ from planned ones. We consider an alternative inference method based on likelihood ratio. In particular, we order permissible sample paths under Simon's two-stage designs using their corresponding conditional likelihood. In this way, we can calculate p-values using the common definition: the probability of obtaining a test statistic value at least as extreme as that observed under the null hypothesis. In addition to providing inference for a couple of scenarios where Koyama and Chen's method can be difficult to apply, the resulting estimate based on our method appears to have certain advantages in terms of inference properties in many numerical simulations. It generally led to smaller biases and narrower confidence intervals while maintaining similar coverages. We also illustrated the two methods in a real data setting. Inference procedures used with Simon's designs almost always ignore the actual sampling plan. Reported P-values, point estimates and confidence intervals for the response rate are not usually adjusted for the design's adaptiveness. Proper statistical inference procedures should be used.
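
    For contrast with the article's conditional-likelihood ordering, the sketch below computes a p-value for a Simon two-stage design under the simpler "total responses" ordering: the null probability of every sample path that continues to stage 2 and ends with at least the observed number of responses. The design parameters and observed count are invented.

        from scipy.stats import binom

        n1, r1 = 10, 1       # stage 1: stop for futility if responses <= r1
        n2 = 19              # stage 2 sample size (trial continued as planned)
        p0 = 0.10            # null response rate
        x_obs = 7            # observed total responses after both stages

        p_value = 0.0
        for x1 in range(r1 + 1, n1 + 1):           # stage-1 outcomes that continue
            needed = x_obs - x1                    # stage-2 responses still required
            tail = binom.sf(needed - 1, n2, p0)    # P(X2 >= needed); 1 if needed <= 0
            p_value += binom.pmf(x1, n1, p0) * tail

        print(f"p-value under the total-responses ordering: {p_value:.4f}")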

  5. Statistics II essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time se

  6. Computer experimental analysis of the CHP performance of a 100 kW e SOFC Field Unit by a factorial design

    Science.gov (United States)

    Calì, M.; Santarelli, M. G. L.; Leone, P.

    Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take best advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP 100 demonstration at EDB/ELSAM in Westerwoort. Afterwards, the simulated tests were performed in the form of a computer experimental session, and the measurement uncertainties were simulated with perturbations imposed on the model independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating and anodic recycling flow rate) has been investigated in a rigorous manner. The analysis accounts for the effects of the parameters on stack electric power, thermal recovered power, single cell voltage, cell operative temperature, consumed fuel flow and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to generated electric power and stack heat recovered.
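
    The Yates-style effect estimation behind such a 2^k analysis can be sketched for three of the factors named above; the eight responses are invented stand-ins for the simulated stack electric power.

        import numpy as np
        from itertools import product

        runs = np.array(list(product([-1, 1], repeat=3)))   # 2^3 runs: all sign combinations
        power = np.array([96.2, 98.1, 95.4, 97.0,
                          99.3, 101.5, 98.0, 100.6])        # invented kWe values

        labels = ["U_ox", "U_F", "recycle"]
        for j, name in enumerate(labels):                   # main effects
            print(f"main effect {name}: {(runs[:, j] @ power) / 4:+.2f}")
        for a, b in [(0, 1), (0, 2), (1, 2)]:               # two-factor interactions
            contrast = runs[:, a] * runs[:, b]
            print(f"interaction {labels[a]}x{labels[b]}: {(contrast @ power) / 4:+.2f}")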

  7. Statistical processing of experimental data

    OpenAIRE

    NAVRÁTIL, Pavel

    2012-01-01

    This thesis covers probability theory and statistical data sets: solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical data sets, and regression and correlation analysis. Solutions are provided for the unsolved problems.

  8. Experimental design research approaches, perspectives, applications

    CERN Document Server

    Stanković, Tino; Štorga, Mario

    2016-01-01

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead to and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current research practice where methods are diverging and integration between individual, team and organizational under...

  9. Fast Bayesian optimal experimental design for seismic source inversion

    KAUST Repository

    Long, Quan

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem. © 2015 Elsevier B.V.

  10. Fast Bayesian Optimal Experimental Design for Seismic Source Inversion

    KAUST Repository

    Long, Quan

    2016-01-06

    We develop a fast method for optimally designing experiments [1] in the context of statistical seismic source inversion [2]. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by the elastic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the true parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.

  11. Fast Bayesian Optimal Experimental Design for Seismic Source Inversion

    KAUST Repository

    Long, Quan; Motamed, Mohammad; Tempone, Raul

    2016-01-01

    We develop a fast method for optimally designing experiments [1] in the context of statistical seismic source inversion [2]. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by the elastic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the true parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.

  12. The reactor safety study of experimental multi-purpose VHTR design

    International Nuclear Information System (INIS)

    Yasuno, T.; Mitake, S.; Ezaki, M.; Suzuki, K.

    1981-01-01

    Over the past years, the design work for the Experimental Very High Temperature Reactor (VHTR) plant has been conducted at the Japan Atomic Energy Research Institute. The conceptual design has been completed, and the more detailed design work and the safety analysis of the experimental VHTR plant are continuing. The purposes of the design studies are to show the feasibility of the experimental VHTR program, to specify the characteristics and functions of the plant components, to point out the R and D items necessary for the experimental VHTR plant construction, and to analyze the features of the plant safety. In this paper the system design and safety features of the experimental reactor are summarized. The main issues are the safety philosophy for the design basis accidents, the accidents assumed, and the engineered safety systems adopted in the design work.

  13. Application of machine/statistical learning, artificial intelligence and statistical experimental design for the modeling and optimization of methylene blue and Cd(ii) removal from a binary aqueous solution by natural walnut carbon.

    Science.gov (United States)

    Mazaheri, H; Ghaedi, M; Ahmadi Azqhandi, M H; Asfaram, A

    2017-05-10

    Analytical chemists apply statistical methods for both the validation and prediction of proposed models. Methods are required that are adequate for finding the typical features of a dataset, such as nonlinearities and interactions. Boosted regression trees (BRTs), as an ensemble technique, are fundamentally different from conventional techniques, which aim to fit a single parsimonious model. In this work, BRT, artificial neural network (ANN) and response surface methodology (RSM) models have been used for the optimization and/or modeling of the stirring time (min), pH, adsorbent mass (mg) and concentrations of MB and Cd2+ ions (mg L-1) in order to develop respective predictive equations for simulation of the efficiency of MB and Cd2+ adsorption based on the experimental data set. Activated carbon, as an adsorbent, was synthesized from walnut wood waste, which is abundant, non-toxic, cheap and locally available. This adsorbent was characterized using different techniques such as FT-IR, BET, SEM, point of zero charge (pHpzc) and also the determination of oxygen-containing functional groups. The influence of the various parameters (i.e. pH, stirring time, adsorbent mass and concentrations of MB and Cd2+ ions) on the percentage removal was assessed by investigation of the sensitivity function, variable importance rankings (BRT) and analysis of variance (RSM). Furthermore, a central composite design (CCD) combined with a desirability function approach (DFA) as a global optimization technique was used for the simultaneous optimization of the effective parameters. The applicability of the BRT, ANN and RSM models for the description of the experimental data was examined using four statistical criteria (absolute average deviation (AAD), mean absolute error (MAE), root mean square error (RMSE) and coefficient of determination (R2)). All three models demonstrated good predictions in this study. The BRT model was more precise compared to the other models and this showed
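
    A boosted-regression-trees fit of the kind described can be sketched with scikit-learn on synthetic stand-in data for the five factors; the data-generating function, parameter ranges and hyperparameters below are all invented.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.metrics import mean_squared_error, r2_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        # Columns: pH, stirring time, adsorbent mass, MB conc., Cd2+ conc.
        X = rng.uniform([2, 1, 5, 5, 5], [10, 45, 50, 100, 100], size=(300, 5))
        removal = (60 + 2.5 * X[:, 0] + 0.3 * X[:, 1] + 0.4 * X[:, 2]
                   - 0.15 * X[:, 3] - 0.10 * X[:, 4]
                   + rng.normal(0, 2, 300))                 # invented response [%]

        X_tr, X_te, y_tr, y_te = train_test_split(X, removal, random_state=0)
        brt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                        max_depth=3).fit(X_tr, y_tr)
        pred = brt.predict(X_te)

        print("RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 2))
        print("R^2 :", round(r2_score(y_te, pred), 3))
        print("variable importances:", np.round(brt.feature_importances_, 3))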

  14. Statistical optimization of the growth factors for Chaetoceros neogracile using fractional factorial design and central composite design.

    Science.gov (United States)

    Jeong, Sung-Eun; Park, Jae-Kweon; Kim, Jeong-Dong; Chang, In-Jeong; Hong, Seong-Joo; Kang, Sung-Ho; Lee, Choul-Gyun

    2008-12-01

    Statistical experimental designs involving (i) a fractional factorial design (FFD) and (ii) a central composite design (CCD) were applied to optimize the culture medium constituents for production of a unique antifreeze protein by the Antarctic microalga Chaetoceros neogracile. The results of the FFD suggested that NaCl, KCl, MgCl2, and Na2SiO3 were significant variables that highly influenced the growth rate and biomass production. The optimum culture medium for the production of an antifreeze protein from C. neogracile was found to be Kalle's artificial seawater, pH 7.0±0.5, consisting of 28.566 g/l of NaCl, 3.887 g/l of MgCl2, 1.787 g/l of MgSO4, 1.308 g/l of CaSO4, 0.832 g/l of K2SO4, 0.124 g/l of CaCO3, 0.103 g/l of KBr, 0.0288 g/l of SrSO4, and 0.0282 g/l of H3BO3. The antifreeze activity significantly increased after cells were treated with cold shock (at -5 °C) for 14 h. To the best of our knowledge, this is the first report demonstrating an antifreeze-like protein of C. neogracile.

  15. Application of Plackett-Burman experimental design in the development of muffin using adlay flour

    Science.gov (United States)

    Valmorida, J. S.; Castillo-Israel, K. A. T.

    2018-01-01

    A Plackett-Burman experimental design was applied to identify significant formulation and process variables in the development of a muffin using adlay flour. Of the seven screened variables, the levels of sugar, the levels of butter and the baking temperature had the most significant influence on the product model in terms of physicochemical properties and sensory acceptability. The results of the experiment further demonstrate the effectiveness of the Plackett-Burman design in choosing the best adlay variety for muffin production. Hence, the statistical method used in the study permits an efficient selection of the important variables needed in the development of a muffin from adlay, which can be optimized using response surface methodology.

  16. Statistical methods in the mechanical design of fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Radsak, C.; Streit, D.; Muench, C.J. [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    The mechanical design of a fuel assembly is still mainly performed in a deterministic way. This conservative approach is however not suitable to provide a realistic quantification of the design margins with respect to licensing criteria for more and more demanding operating conditions (power upgrades, burnup increase, ...). This quantification can be provided by statistical methods utilizing all available information (e.g. from manufacturing, experience feedback etc.) on the topic under consideration. During optimization, e.g. of the holddown system, certain objectives in the mechanical design of a fuel assembly (FA) can contradict each other, such as holddown forces sufficient to prevent fuel assembly lift-off versus reducing the holddown forces to minimize axial loads on the fuel assembly structure and thus avoid any negative effect on control rod movement. By using a statistical method the fuel assembly design can be optimized much better with respect to these objectives than would be possible based on a deterministic approach. This leads to a more realistic assessment and a safer way of operating fuel assemblies. Statistical models are defined on the one hand by the quantile that has to be maintained concerning the design limit requirements (e.g. one-FA quantile) and on the other hand by the confidence level which has to be met. Using the above example of the holddown force, a feasible quantile can be defined based on the requirement that less than one fuel assembly (quantile > 192/193 [%] = 99.5%) in the core violates the holddown force limit with a confidence of 95%. (orig.)
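
    The quantile-with-confidence requirement quoted above can be connected to Monte Carlo sample sizes by a nonparametric (Wilks-type) argument: the largest of n sampled holddown forces bounds the q-quantile with confidence c once 1 - q^n >= c. A minimal sketch, assuming this first-order one-sided tolerance bound:

        import math

        q, c = 0.995, 0.95        # quantile (>192/193 FAs) and confidence level
        n = math.ceil(math.log(1.0 - c) / math.log(q))
        print(f"Monte Carlo runs needed: n = {n}")   # 598 for q = 0.995, c = 0.95
        # With 95% confidence, at least 99.5% of fuel assemblies then lie below
        # the maximum holddown force observed in those n runs.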

  17. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    International Nuclear Information System (INIS)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves

    2017-01-01

    In this study, optimization of procedures and standardization of an Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)

  18. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves, E-mail: uandapaula@gmail.com, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In this study, optimization of procedures and standardization of an Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)

  19. Statistically designed experiments to screen chemical mixtures for possible interactions

    NARCIS (Netherlands)

    Groten, J.P.; Tajima, O.; Feron, V.J.; Schoen, E.D.

    1998-01-01

    For the accurate analysis of possible interactive effects of chemicals in a defined mixture, statistical designs are necessary to develop clear and manageable experiments. For instance, factorial designs have been successfully used to detect two-factor interactions. Particularly useful for this

  20. Statistical analysis on experimental calibration data for flowmeters in pressure pipes

    Science.gov (United States)

    Lazzarin, Alessandro; Orsi, Enrico; Sanfilippo, Umberto

    2017-08-01

    This paper presents a statistical analysis of experimental calibration data for flowmeters (i.e. electromagnetic, ultrasonic and turbine flowmeters) in pressure pipes. The experimental calibration data set consists of the whole archive of calibration tests carried out on 246 flowmeters from January 2001 to October 2015 at Settore Portate of Laboratorio di Idraulica “G. Fantoli” of Politecnico di Milano, which is accredited as LAT 104 for a flow range between 3 l/s and 80 l/s, with a certified Calibration and Measurement Capability (CMC) - formerly known as Best Measurement Capability (BMC) - equal to 0.2%. The data set is split into three subsets, consisting of 94 electromagnetic, 83 ultrasonic and 69 turbine flowmeters; each subset is analysed separately from the others, and a final comparison is then carried out. In particular, the main focus of the statistical analysis is the correction C, that is, the difference between the flow rate Q measured by the calibration facility (through the accredited procedures and the certified reference specimen) and the flow rate QM simultaneously recorded by the flowmeter under calibration, expressed as a percentage of the same QM.
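
    As a small worked illustration of the quantity analysed here: the correction C compares the facility's reference flow rate Q with the meter reading QM, as a percentage of QM. A minimal sketch with made-up flow rates, since the paper's data are not reproduced here:

```python
def correction_percent(q_ref: float, q_meter: float) -> float:
    """Correction C as defined in the abstract: reference flow rate Q
    minus the meter reading QM, expressed as a percentage of QM."""
    return 100.0 * (q_ref - q_meter) / q_meter

# Hypothetical example: facility measures 50.00 l/s, meter reads 49.85 l/s.
print(f"C = {correction_percent(50.00, 49.85):+.2f} %")  # C = +0.30 %
```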

  1. Development of a fast, lean and agile direct pelletization process using experimental design techniques.

    Science.gov (United States)

    Politis, Stavros N; Rekkas, Dimitrios M

    2017-04-01

    A novel hot melt direct pelletization method was developed, characterized and optimized, using statistical thinking and experimental design tools. Mixtures of carnauba wax (CW) and HPMC K100M were spheronized using melted gelucire 50-13 as a binding material (BM). Experimentation was performed sequentially; a fractional factorial design was set up initially to screen the factors affecting the process, namely spray rate, quantity of BM, rotor speed, type of rotor disk, lubricant-glidant presence, additional spheronization time, powder feeding rate and quantity. From the eight factors assessed, three were further studied during process optimization (spray rate, quantity of BM and powder feeding rate), at different ratios of the solid mixture of CW and HPMC K100M. The study demonstrated that the novel hot melt process is fast, efficient, reproducible and predictable. Therefore, it can be adopted in a lean and agile manufacturing setting for the production of flexible pellet dosage forms with various release rates easily customized between immediate and modified delivery.
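
    Screening eight factors in few runs, as done in this study, is typically handled with a 2^(k-p) fractional factorial in which the extra factors are aliased onto interaction columns of a smaller full factorial. A generic sketch of a 16-run design for eight two-level factors follows; the generators shown are one common resolution-IV choice, not necessarily the design used in the paper.

```python
from itertools import product

# Base factors A-D define a 2^4 full factorial; the remaining four
# screening factors are aliased onto interaction columns (one common
# resolution-IV choice of generators; other choices are possible).
generators = {"E": "BCD", "F": "ACD", "G": "ABC", "H": "ABD"}

runs = []
for levels in product((-1, 1), repeat=4):
    run = dict(zip("ABCD", levels))
    for new, word in generators.items():
        col = 1
        for f in word:
            col *= run[f]  # product of the parent columns
        run[new] = col
    runs.append(run)

print(len(runs), "runs instead of 2^8 = 256 for a full factorial")
```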

  2. Electrochemical production and use of free chlorine for pollutant removal: an experimental design approach.

    Science.gov (United States)

    Antonelli, Raissa; de Araújo, Karla Santos; Pires, Ricardo Francisco; Fornazari, Ana Luiza de Toledo; Granato, Ana Claudia; Malpass, Geoffroy Roger Pointer

    2017-10-28

    The present paper presents (1) the optimization of electrochemical free-chlorine production using an experimental design approach, and (2) the application of the optimum conditions obtained to the photo-assisted electrochemical degradation of simulated textile effluent. In the experimental design the influence of inter-electrode gap, pH, NaCl concentration and current was considered. It was observed that the four variables studied are significant for the process, with NaCl concentration and current being the most significant variables for free chlorine production. The maximum free chlorine production was obtained at a current of 2.33 A and a NaCl concentration of 0.96 mol dm^-3. The application of the optimized conditions with simultaneous UV irradiation resulted in up to 83.1% Total Organic Carbon removal and 100% colour removal over 180 min of electrolysis. The results indicate that a systematic (statistical) approach to the electrochemical treatment of pollutants can save time and reagents.

  3. Experimental investigation of statistical models describing distribution of counts

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1992-01-01

    The binomial, Poisson and modified Poisson models which are used for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are considered. The validity of the Poisson and the modified Poisson statistical distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, Poisson statistics describe the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, analysis of the data demonstrated, with confidence, that for long measurements (T ≥ T1/2) the Poisson distribution is not valid and the modified Poisson function is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. Differences between the standard deviations evaluated on the basis of the Poisson and binomial models are especially significant for experiments with long measuring time (T/T1/2 ≥ 2) and/or large detection efficiency (ε > 0.30). Optimization of the measuring time for paired observations yields the same solution for either the binomial or the Poisson distribution. (orig.)
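
    A quick numerical sketch of the binomial-versus-Poisson discrepancy discussed above, for a source of n0 atoms observed for time T with detection efficiency ε (all numbers hypothetical): the probability that any one atom decays during the measurement and is detected is p = ε(1 - 2^(-T/T1/2)).

```python
import math

def count_sd(n0: int, efficiency: float, t_over_half_life: float):
    """Standard deviation of the number of counts from a decaying source
    under the binomial and the Poisson model (a simplified sketch)."""
    p = efficiency * (1.0 - 2.0 ** (-t_over_half_life))
    mean = n0 * p
    sd_binomial = math.sqrt(n0 * p * (1.0 - p))
    sd_poisson = math.sqrt(mean)
    return sd_binomial, sd_poisson

# The discrepancy grows with measuring time and detection efficiency,
# as the abstract notes for T/T1/2 >= 2 and efficiency > 0.30.
for t in (0.5, 1.0, 2.0):
    b, p = count_sd(n0=10_000, efficiency=0.35, t_over_half_life=t)
    print(f"T/T1/2 = {t}: binomial sd = {b:.1f}, Poisson sd = {p:.1f}")
```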

  4. Four Papers on Contemporary Software Design Strategies for Statistical Methodologists

    OpenAIRE

    Carey, Vincent; Cook, Dianne

    2014-01-01

    Software design impacts much of statistical analysis and, as technology has changed dramatically in recent years, it is exciting to learn how statistical software is adapting and changing. This leads to the collection of papers published here, written by John Chambers, Duncan Temple Lang, Michael Lawrence, Martin Morgan, Yihui Xie, Heike Hofmann and Xiaoyue Cheng.

  5. Fast Synthesis of Gibbsite Nanoplates and Process Optimization using Box-Behnken Experimental Design

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xin; Zhang, Xianwen; Graham, Trenton R.; Pearce, Carolyn I.; Mehdi, Beata L.; N'Diaye, Alpha T.; Kerisit, Sebastien N.; Browning, Nigel D.; Clark, Sue B.; Rosso, Kevin M.

    2017-10-26

    Developing the ability to synthesize compositionally and morphologically well-defined gibbsite particles at the nanoscale with high yield is an ongoing need that has not yet achieved the level of rational design. Here we report optimization of a clean inorganic synthesis route based on statistical experimental design examining the influence of Al(OH)3 gel precursor concentration, pH, and aging time at temperature. At 80 °C, the optimum synthesis conditions of gel concentration at 0.5 M, pH at 9.2, and time at 72 h maximized the reaction yield up to ~87%. The resulting gibbsite product is composed of highly uniform euhedral hexagonal nanoplates within a basal plane diameter range of 200-400 nm. The independent roles of key system variables in the growth mechanism are considered. On the basis of these optimized experimental conditions, the synthesis procedure, which is both cost-effective and environmentally friendly, has the potential for mass production scale-up of high quality gibbsite material for various fundamental research and industrial applications.
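
    For readers unfamiliar with the Box-Behnken layout used in studies like this one: with three factors it places runs at the midpoints of the edges of the design cube plus replicated centre points. A generic construction in coded units; the mapping of the coded levels to gel concentration, pH and aging time is left to the experimenter.

```python
from itertools import combinations, product

def box_behnken(k: int, center_points: int = 3):
    """Coded Box-Behnken design: +/-1 on each pair of factors with all
    other factors held at 0, plus replicated centre points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * k for _ in range(center_points)]
    return runs

# Three factors as in the abstract: gel concentration, pH, aging time.
for run in box_behnken(3):
    print(run)   # 12 edge-midpoint runs + 3 centre points
```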

  6. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  7. On the construction of experimental designs for a given task by jointly optimizing several quality criteria: Pareto-optimal experimental designs.

    Science.gov (United States)

    Sánchez, M S; Sarabia, L A; Ortiz, M C

    2012-11-19

    Experimental designs for a given task should be selected on the basis of the problem being solved and of criteria that measure their quality. There are several such criteria because there are several aspects to be taken into account when making a choice. The most used criteria are probably the so-called alphabetical optimality criteria (for example, the A-, E-, and D-criteria related to the joint estimation of the coefficients, or the I- and G-criteria related to the prediction variance). Selecting a proper design to solve a problem implies finding a balance among these several criteria that measure the performance of the design in different aspects. Technically this is a problem of multi-criteria optimization, which can be tackled from different views. The approach presented here addresses the problem in its real vector nature, so that ad hoc experimental designs are generated with an algorithm based on evolutionary algorithms to find the Pareto-optimal front. There is no theoretical limit to the number of criteria that can be studied and, contrary to other approaches, not just one experimental design is computed but a set of experimental designs, all of them with the property of being Pareto-optimal in the criteria needed by the user. Besides, the use of an evolutionary algorithm makes it possible to search in both continuous and discrete domains and avoids the need for a set of candidate points, usual in exchange algorithms. Copyright © 2012 Elsevier B.V. All rights reserved.
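
    The core notion of Pareto optimality used in this record can be captured in a few lines: a design is kept if no other design is at least as good on every criterion and strictly better on at least one. A minimal filter; the criterion values are hypothetical and oriented so that larger is better (negate criteria that are minimised).

```python
def pareto_front(designs):
    """Return the non-dominated designs. Each design is a tuple of
    criterion values where larger is better for every criterion."""
    front = []
    for d in designs:
        dominated = any(
            all(o >= v for o, v in zip(other, d)) and other != d
            for other in designs
        )
        if not dominated:
            front.append(d)
    return front

# Hypothetical (D-efficiency, G-efficiency) scores for five candidates.
candidates = [(0.95, 0.70), (0.90, 0.80), (0.85, 0.85),
              (0.80, 0.75), (0.70, 0.95)]
print(pareto_front(candidates))  # (0.80, 0.75) is dominated and dropped
```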

  8. Statistical evaluation of design-error related accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1980-01-01

    In a recently published paper (Campbell and Ott, 1979), a general methodology was proposed for the statistical evaluation of design-error related accidents. The evaluation aims at an estimate of the combined residual frequency of yet unknown types of accidents lurking in a certain technological system. Here, the original methodology is extended so as to apply to a variety of systems that evolve during the development of large-scale technologies. A special categorization of incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of nuclear power reactor technology, considering serious accidents that involve a particular design inadequacy in the accident progression

  9. Research design and statistical methods in Indian medical journals: a retrospective survey.

    Science.gov (United States)

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, error/defect proportions in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.001) and the proportion of errors/defects in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.001). The use of randomized clinical trial designs has remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in results presentation (χ2=24.477, Φ=0.17, p<0.001). Indian medical research seems to have made no major progress regarding the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of

  10. Statistical analysis of correlated experimental data and neutron cross section evaluation

    International Nuclear Information System (INIS)

    Badikov, S.A.

    1998-01-01

    The technique for evaluation of neutron cross sections on the basis of statistical analysis of correlated experimental data is presented. The most important stages of the evaluation, from compilation of the correlation matrix for measurement uncertainties to representation of the analysis results in the ENDF-6 format, are described in detail. Special attention is paid to restrictions (positive definiteness) on the covariance matrix of the approximated parameters' uncertainties generated within the least-squares fit method, which is derived from physical reasons. The requirements for source experimental data assuring satisfaction of the restrictions mentioned above are formulated. Correlation matrices of measurement uncertainties, in particular, should also be positive definite. Variants of modelling positive definite correlation matrices of measurement uncertainties, for situations when their direct calculation on the basis of experimental information is impossible, are discussed. The technique described is used for creating the new generation of estimates of dosimetric reaction cross sections for the first version of the Russian dosimetric file (including nontrivial covariance information)
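
    One practical way to test the positive-definiteness restriction emphasised in this record is a Cholesky factorisation, which succeeds exactly when the matrix is positive definite. A small sketch; the example matrix is invented to show a mutually inconsistent set of pairwise correlations.

```python
import numpy as np

def is_positive_definite(corr: np.ndarray) -> bool:
    """Check positive definiteness of a correlation matrix via Cholesky."""
    try:
        np.linalg.cholesky(corr)
        return True
    except np.linalg.LinAlgError:
        return False

# A 3x3 correlation matrix that looks plausible entry by entry but is
# not positive definite: the pairwise correlations are inconsistent.
bad = np.array([[ 1.0, 0.9, -0.9],
                [ 0.9, 1.0,  0.9],
                [-0.9, 0.9,  1.0]])
print(is_positive_definite(bad))        # False
print(is_positive_definite(np.eye(3)))  # True
```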

  11. WE-E-201-01: Use and Abuse of Common Statistics in Radiological Physics

    Energy Technology Data Exchange (ETDEWEB)

    Labby, Z. [University of Wisconsin (United States)

    2015-06-15

    Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and answer questions such as: where would you typically apply the test/design and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling.

  12. WE-E-201-01: Use and Abuse of Common Statistics in Radiological Physics

    International Nuclear Information System (INIS)

    Labby, Z.

    2015-01-01

    Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and answer questions such as: where would you typically apply the test/design and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling

  13. Return to Our Roots: Raising Radishes To Teach Experimental Design.

    Science.gov (United States)

    Stallings, William M.

    To provide practice in making design decisions, collecting and analyzing data, and writing and documenting results, a professor of statistics has his graduate students in statistics and research methodology classes design and perform an experiment on the effects of fertilizers on the growth of radishes. This project has been required of students…

  14. Electrodialytic desalination of brackish water: determination of optimal experimental parameters using full factorial design

    Science.gov (United States)

    Gmar, Soumaya; Helali, Nawel; Boubakri, Ali; Sayadi, Ilhem Ben Salah; Tlili, Mohamed; Amor, Mohamed Ben

    2017-12-01

    The aim of this work is to study the desalination of brackish water by electrodialysis (ED). A two-level, three-factor (2^3) full factorial design methodology was used to investigate the influence of different physicochemical parameters on the demineralization rate (DR) and the specific power consumption (SPC). The statistical design determines which factors have important effects on ED performance and studies all interactions between the considered parameters. Three significant factors were used: applied potential, salt concentration and flow rate. The experimental results and statistical analysis show that applied potential and salt concentration are the main effects for DR as well as for SPC. An interaction effect between applied potential and salt concentration was observed for SPC. A maximum value of 82.24% was obtained for DR under optimum conditions and the best value of SPC obtained was 5.64 Wh L^-1. Empirical regression models were also obtained and used to predict the DR and SPC profiles with satisfactory results. The process was applied for the treatment of real brackish water using the optimal parameters.
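
    In a 2^3 design like this one, each main effect is simply the mean response at a factor's high level minus the mean at its low level. A sketch with invented demineralization-rate responses (the paper's data are not reproduced here):

```python
from itertools import product

# Coded 2^3 design: applied potential (A), salt concentration (B),
# flow rate (C); one run per combination of coded levels.
design = list(product((-1, 1), repeat=3))

# Hypothetical DR responses (%), one per run, for illustration only.
dr = [55.0, 70.0, 62.0, 78.0, 58.0, 74.0, 65.0, 82.0]

def main_effect(col: int) -> float:
    """Average response at the high level minus average at the low level."""
    hi = [y for x, y in zip(design, dr) if x[col] == 1]
    lo = [y for x, y in zip(design, dr) if x[col] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for name, col in (("potential", 0), ("salt conc.", 1), ("flow rate", 2)):
    print(f"{name}: main effect = {main_effect(col):+.2f} % DR")
```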

  15. Robust optimization of the output voltage of nanogenerators by statistical design of experiments

    KAUST Repository

    Song, Jinhui

    2010-09-01

    Nanogenerators were first demonstrated by deflecting aligned ZnO nanowires using a conductive atomic force microscopy (AFM) tip. The output of a nanogenerator is affected by three parameters: tip normal force, tip scanning speed, and tip abrasion. In this work, systematic experimental studies have been carried out to examine the combined effects of these three parameters on the output, using statistical design of experiments. A statistical model has been built to analyze the data and predict the optimal parameter settings. For an AFM tip of cone angle 70° coated with Pt, and ZnO nanowires with a diameter of 50 nm and lengths of 600 nm to 1 μm, the optimized parameters for the nanogenerator were found to be a normal force of 137 nN and scanning speed of 40 μm/s, rather than the conventional settings of 120 nN for the normal force and 30 μm/s for the scanning speed. A nanogenerator with the optimized settings has three times the average output voltage of one with the conventional settings. © 2010 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.

  16. Robust optimization of the output voltage of nanogenerators by statistical design of experiments

    KAUST Repository

    Song, Jinhui; Xie, Huizhi; Wu, Wenzhuo; Roshan Joseph, V.; Jeff Wu, C. F.; Wang, Zhong Lin

    2010-01-01

    Nanogenerators were first demonstrated by deflecting aligned ZnO nanowires using a conductive atomic force microscopy (AFM) tip. The output of a nanogenerator is affected by three parameters: tip normal force, tip scanning speed, and tip abrasion. In this work, systematic experimental studies have been carried out to examine the combined effects of these three parameters on the output, using statistical design of experiments. A statistical model has been built to analyze the data and predict the optimal parameter settings. For an AFM tip of cone angle 70° coated with Pt, and ZnO nanowires with a diameter of 50 nm and lengths of 600 nm to 1 μm, the optimized parameters for the nanogenerator were found to be a normal force of 137 nN and scanning speed of 40 μm/s, rather than the conventional settings of 120 nN for the normal force and 30 μm/s for the scanning speed. A nanogenerator with the optimized settings has three times the average output voltage of one with the conventional settings. © 2010 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.

  17. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship-Quasi-Experimental Designs.

    Science.gov (United States)

    Schweizer, Marin L; Braun, Barbara I; Milstone, Aaron M

    2016-10-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt, nonrandomized interventions. Quasi-experimental studies can be categorized into 3 major types: interrupted time-series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship, including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. Infect Control Hosp Epidemiol 2016;1-6.

  18. Statistical modeling of static strengths of nuclear graphites with relevance to structural design

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-02-01

    Use of graphite materials for structural members poses the problem of how to take into account the statistical properties of static strength, especially tensile fracture stresses, in component structural design. The present study concerns comprehensive examinations of the statistical data base and modeling for nuclear graphites. First, the report provides individual samples and their analyses on strengths of IG-110 and PGX graphites for HTTR components. Statistical characteristics of other HTGR graphites are also exemplified from the literature. Most of the statistical distributions of individual samples are found to be approximately normal. The goodness of fit to normal distributions is more satisfactory for larger sample sizes. Molded and extruded graphites, however, possess a variety of statistical properties depending on samples from different within-log locations and/or different orientations. Second, the previous statistical models including the Weibull theory are assessed from the viewpoint of applicability to design procedures. This leads to the conclusion that the Weibull theory and its modified versions are satisfactory only for limited parts of the tensile fracture behavior; they are not consistent with the whole set of observations. Only normal statistics are justifiable as practical approaches to discuss specified minimum ultimate strengths as statistical confidence limits for individual samples. Third, the assessment of various statistical models emphasizes the need to develop advanced analytical ones which should involve modeling of microstructural features of actual graphite materials. Improvements of other structural design methodologies are also presented. (author)
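
    The idea of a specified minimum strength as a statistical limit can be illustrated with a one-sided lower tolerance limit under the normal model the report favours: a bound exceeded by a stated fraction of the population with a stated confidence. A sketch using the exact noncentral-t factor; the strength sample is simulated, purely for illustration.

```python
import numpy as np
from scipy import stats

def lower_tolerance_limit(x, coverage=0.99, confidence=0.95):
    """One-sided lower tolerance limit for normally distributed strengths,
    computed with the exact noncentral-t tolerance factor."""
    x = np.asarray(x, dtype=float)
    n, mean, sd = x.size, x.mean(), x.std(ddof=1)
    delta = stats.norm.ppf(coverage) * np.sqrt(n)
    k = stats.nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
    return mean - k * sd

# Hypothetical tensile-strength sample (MPa), simulated for illustration.
rng = np.random.default_rng(0)
sample = rng.normal(25.0, 2.0, size=30)
print(f"lower tolerance limit: {lower_tolerance_limit(sample):.2f} MPa")
```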

  19. Robust Bayesian Experimental Design for Conceptual Model Discrimination

    Science.gov (United States)

    Pham, H. V.; Tsai, F. T. C.

    2015-12-01

    A robust Bayesian optimal experimental design under uncertainty is presented to provide firm information for model discrimination, given the least number of pumping wells and observation wells. Firm information is the maximum information about a system that can be guaranteed from an experimental design. The design is based on the Box-Hill expected entropy decrease (EED) before and after the experimental design and the Bayesian model averaging (BMA) framework. A max-min program is introduced to choose the robust design that maximizes the minimal Box-Hill EED, subject to the constraint that the highest expected posterior model probability satisfies a desired probability threshold. The EED is calculated by Gauss-Hermite quadrature. The BMA method is used to predict future observations and to quantify future observation uncertainty, arising from conceptual and parametric uncertainties, in calculating the EED. A Monte Carlo approach is adopted to quantify the uncertainty in the posterior model probabilities. The optimal experimental design is tested on a synthetic 5-layer anisotropic confined aquifer. Nine conceptual groundwater models are constructed due to uncertain geological architecture and boundary conditions. High-performance computing is used to enumerate all possible design solutions in order to identify the most plausible groundwater model. Results highlight the impacts of scedasticity in future observation data as well as uncertainty sources on potential pumping and observation locations.
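
    Gauss-Hermite quadrature, used here to evaluate the expected entropy decrease, approximates Gaussian expectations with a weighted sum over a handful of nodes. A generic sketch; the simple integrand stands in for the record's EED integrand.

```python
import numpy as np

def gaussian_expectation(f, mu, sigma, order=20):
    """E[f(X)] for X ~ N(mu, sigma^2) via Gauss-Hermite quadrature:
    substitute x = mu + sqrt(2)*sigma*t and normalise by sqrt(pi)."""
    nodes, weights = np.polynomial.hermite.hermgauss(order)
    x = mu + np.sqrt(2.0) * sigma * nodes
    return float(np.sum(weights * f(x)) / np.sqrt(np.pi))

# Sanity check: E[X^2] for X ~ N(0, 1) should be 1.
print(gaussian_expectation(lambda x: x**2, mu=0.0, sigma=1.0))
```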

  20. Design and analysis of experiments with SAS

    CERN Document Server

    Lawson, John

    2010-01-01

    Introduction: Statistics and Data Collection; Beginnings of Statistically Planned Experiments; Definitions and Preliminaries; Purposes of Experimental Design; Types of Experimental Designs; Planning Experiments; Performing the Experiments; Use of SAS Software. Completely Randomized Designs with One Factor: Introduction; Replication and Randomization; A Historical Example; Linear Model for Completely Randomized Design (CRD); Verifying Assumptions of the Linear Model; Analysis Strategies When Assumptions Are Violated; Determining the Number of Replicates; Comparison of Treatments after the F-Test. Factorial Designs

  1. Statistical mixture design selective extraction of compounds with antioxidant activity and total polyphenol content from Trichilia catigua.

    Science.gov (United States)

    Lonni, Audrey Alesandra Stinghen Garcia; Longhini, Renata; Lopes, Gisely Cristiny; de Mello, João Carlos Palazzo; Scarminio, Ieda Spacino

    2012-03-16

    Statistically designed mixtures of water, methanol, acetone and ethanol were used to extract material from Trichilia catigua (Meliaceae) barks to study the effects of different solvents and their mixtures on yield, total polyphenol content and antioxidant activity. The experimental results and their response surface models showed that quaternary mixtures with approximately equal proportions of all four solvents provided the highest yields, total polyphenol contents and antioxidant activities of the crude extracts, followed by ternary design mixtures. Principal component and hierarchical clustering analysis of the HPLC-DAD spectra of the chromatographic peaks of 1:1:1:1 water-methanol-acetone-ethanol mixture extracts indicate the presence of cinchonains, gallic acid derivatives, natural polyphenols, flavonoids, catechins, and epicatechins. Copyright © 2011 Elsevier B.V. All rights reserved.
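
    Mixture designs like this one place blends on a simplex because component proportions must sum to one. A sketch generating a {4, 2} simplex-lattice for the four solvents; the study's actual design points are not reproduced here, but augmenting the lattice with the 1:1:1:1 centroid recovers the quaternary blend highlighted above.

```python
from itertools import product
from fractions import Fraction

def simplex_lattice(components: int, m: int):
    """All {components, m} simplex-lattice blends: proportions taken in
    multiples of 1/m that sum to 1 (a standard mixture design)."""
    grid = [Fraction(i, m) for i in range(m + 1)]
    return [p for p in product(grid, repeat=components) if sum(p) == 1]

# Four solvents as in the abstract: water, methanol, acetone, ethanol.
points = simplex_lattice(4, 2)   # 4 pure solvents + 6 binary 1:1 blends
for p in points:
    print([str(x) for x in p])
print(len(points), "design points")
```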

  2. Revised design for the Tokamak experimental power reactor

    International Nuclear Information System (INIS)

    Stacey, W.M. Jr.; Abdou, M.A.; Brooks, J.N.

    1977-03-01

    A new, preliminary design has been identified for the tokamak experimental power reactor (EPR). The revised EPR design is simpler, more compact, less expensive and has somewhat better performance characteristics than the previous design, yet retains many of the previously developed design concepts. This report summarizes the principal features of the new EPR design, including performance and cost

  3. Analyzing Data from a Pretest-Posttest Control Group Design: The Importance of Statistical Assumptions

    Science.gov (United States)

    Zientek, Linda; Nimon, Kim; Hammack-Brown, Bryn

    2016-01-01

    Purpose: Among the gold standards in human resource development (HRD) research are studies that test theoretically developed hypotheses and use experimental designs. A somewhat typical experimental design would involve collecting pretest and posttest data on individuals assigned to a control or experimental group. Data from such a design that…

  4. Development of the Biological Experimental Design Concept Inventory (BEDCI)

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gulnur

    2014-01-01

    Interest in student conception of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non-expert-like thinking in students and to evaluate the…

  5. Design preferences and cognitive styles: experimentation by automated website synthesis.

    Science.gov (United States)

    Leung, Siu-Wai; Lee, John; Johnson, Chris; Robertson, David

    2012-06-29

    This article aims to demonstrate computational synthesis of Web-based experiments in undertaking experimentation on relationships among the participants' design preference, rationale, and cognitive test performance. The exemplified experiments were computationally synthesised, including the websites as materials, experiment protocols as methods, and cognitive tests as protocol modules. This work also exemplifies the use of a website synthesiser as an essential instrument enabling the participants to explore different possible designs, which were generated on the fly, before selection of preferred designs. The participants were given interactive tree and table generators so that they could explore some different ways of presenting causality information in tables and trees as the visualisation formats. The participants gave their preference ratings for the available designs, as well as their rationale (criteria) for their design decisions. The participants were also asked to take four cognitive tests, which focus on the aspects of visualisation and analogy-making. The relationships among preference ratings, rationale, and the results of cognitive tests were analysed by conservative non-parametric statistics including the Wilcoxon test, Kruskal-Wallis test, and Kendall correlation. In the test, 41 of the total 64 participants preferred graphical (tree-form) to tabular presentation. Despite the popular preference for graphical presentation, the given tabular presentation was generally rated to be easier than graphical presentation to interpret, especially by those who scored lower in the visualisation and analogy-making tests. This piece of evidence helps generate a hypothesis that design preferences are related to specific cognitive abilities. Without the use of computational synthesis, the experiment setup and scientific results would be impractical to obtain.

  6. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...

  7. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  8. Conceptual design of helium experimental loop

    International Nuclear Information System (INIS)

    Yu Xingfu; Feng Kaiming

    2007-01-01

    In a future demonstration fusion power station (DEMO), helium is envisaged as coolant for plasma facing components, such as the blanket and divertor. All these components have a very complex geometry, with many parallel cooling channels, involving a complex helium flow distribution. Test blanket modules (TBM) of this concept will undergo various tests in the experimental reactor ITER. For the qualification of TBM, it is indispensable to test mock-ups in a helium loop under realistic pressure and temperature profiles, in order to validate design codes, especially regarding mass flow and heat transfer processes in narrow cooling channels. Similar testing must be performed for the DEMO blanket, currently under development. A Helium Experimental Loop (HELOOP) is planned to be built for TBM tests. The design parameters of temperature, pressure and flow rate are 550 °C, 10 MPa and 1 kg/s, respectively. In particular, HELOOP is able to: perform full-scale tests of TBM under realistic conditions; test other components of the He-cooling system in ITER; qualify the purification circuit; obtain information for the design of the ITER cooling system. The main requirements and characteristics of the HELOOP facility and a preliminary conceptual design are described in the paper. (authors)

  9. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis.

    Science.gov (United States)

    Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) a multistate model in the control condition (pre-test); and (2) a single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.

  10. Neural Network Assisted Experimental Designs for Food Research

    Directory of Open Access Journals (Sweden)

    H.S. Ramaswamy

    2000-06-01

    Full Text Available The ability of artificial neural networks (ANN) in predicting full factorial data from the fractional data corresponding to some of the commonly used experimental designs is explored in this paper. Factorial and fractional factorial designs such as L8, L9, L18, and Box and Behnken schemes were considered both in their original form and with some variations (L8+6, L15 and L9+1). Full factorial (3 factors × 5 levels) and fractional data were generated employing sixteen different mathematical equations (four in each category: linear, with and without interactions, and non-linear, with and without interactions). Different ANN models were trained and the best model was chosen for each equation based on their ability to predict the fractional data. The best experimental design was then chosen based on the ability to simulate the full-factorial data for each equation. In several cases, the mean relative errors with the L18 design (which had more input data than other models) were even higher than with other smaller fractional designs. In general, the ANN assisted Lm, Box and Behnken, L15 and L18 designs were found to predict the full factorial data reasonably well, with errors less than 5%. The L8+6 model performed well with several experimental datasets reported in the literature.

  11. Experimental statistical signature of many-body quantum interference

    Science.gov (United States)

    Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio

    2018-03-01

    Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.

  12. Experimental designs for autoregressive models applied to industrial maintenance

    International Nuclear Information System (INIS)

    Amo-Salas, M.; López-Fidalgo, J.; Pedregal, D.J.

    2015-01-01

    Some time series applications require data which are either expensive or technically difficult to obtain. In such cases scheduling the points in time at which the information should be collected is of paramount importance in order to optimize the resources available. In this paper time series models are studied from a new perspective, consisting in the use of the Optimal Experimental Design setup to obtain the best times to take measurements, with the principal aim of saving costs or discarding useless information. The model and the covariance function are expressed in an explicit form to apply the usual techniques of Optimal Experimental Design. Optimal designs for various approaches are computed and their efficiencies are compared. The methods are shown working in an application to the industrial maintenance of a critical piece of equipment at a petrochemical plant. This simple model allows explicit calculations in order to show openly the procedure to find the correlation structure, needed for computing the optimal experimental design. In this sense the techniques used in this paper to compute optimal designs may be transferred to other situations following the ideas of the paper, but taking into account the increasing difficulty of the procedure for more complex models. - Highlights: • Optimal experimental design theory is applied to AR models to reduce costs. • The first observation has an important impact on any optimal design. • Either the lack of precision or small starting observations claim for large times. • Reasonable optimal times were obtained relaxing slightly the efficiency. • Optimal designs were computed in a predictive maintenance context
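
    To make the covariance-based design search concrete: the sketch below builds a stationary AR(1)-type covariance, cov(y_s, y_t) = sigma^2 * rho^|s-t|, over candidate measurement times and scores subsets with a maximum-entropy (log-determinant) criterion. This only illustrates how an explicit covariance enters an optimal-design search; it is not the paper's exact model or criterion, and the candidate grid is hypothetical.

```python
import numpy as np
from itertools import combinations

def ar1_covariance(times, sigma2=1.0, rho=0.7):
    """Stationary AR(1)-type covariance: cov(y_s, y_t) = sigma2*rho**|s-t|."""
    t = np.asarray(times, dtype=float)
    return sigma2 * rho ** np.abs(t[:, None] - t[None, :])

def entropy_score(times):
    """Maximum-entropy sampling score: log-determinant of the covariance.
    Larger values mean the chosen observations are less redundant."""
    return np.linalg.slogdet(ar1_covariance(times))[1]

# Pick the best 3 of 6 candidate measurement times (hypothetical grid).
candidates = [0, 1, 2, 4, 7, 10]
best = max(combinations(candidates, 3), key=entropy_score)
print(best)  # widely spaced times win, e.g. (0, 4, 10)
```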

  13. Development and optimization of fast dissolving oro-dispersible films of granisetron HCl using Box–Behnken statistical design

    Directory of Open Access Journals (Sweden)

    Hema Chaudhary

    2013-12-01

    Full Text Available The aim was to develop and optimize fast dissolving oro-dispersible films of granisetron hydrochloride (GH) by a two-factor, three-level Box–Behnken design; the two independent variables X1 (polymer) and X2 (plasticizer) were selected on the basis of preliminary studies carried out before the experimental design was implemented. A second-order polynomial equation was used to construct contour plots for the prediction of responses of the dependent variables drug release (Y1), disintegration time (Y2) and tensile strength (Y3). Response surface plots were drawn, and the statistical validity of the polynomials was established to find the composition of the optimized formulation, which was evaluated using a Franz-type diffusion cell. The design establishes the role of the derived polynomial equation and contour plots in predicting the values of the dependent variables for preparation and optimization.

  14. Experimental signature for statistical multifragmentation

    International Nuclear Information System (INIS)

    Moretto, L.G.; Delis, D.N.; Wozniak, G.J.

    1993-01-01

    Multifragment production was measured for the 60 MeV/nucleon 197Au + 27Al, 51V, and natCu reactions. The branching ratios for binary, ternary, quaternary, and quinary decays were determined as a function of the excitation energy E and are independent of the target. The logarithms of these branching ratios, when plotted vs E^-1/2, show a linear dependence that strongly suggests a statistical competition between the various multifragmentation channels. This behavior seems to relegate the role of dynamics to the formation of the sources, which then proceed to decay in an apparently statistical manner

  15. A Modified Jonckheere Test Statistic for Ordered Alternatives in Repeated Measures Design

    Directory of Open Access Journals (Sweden)

    Hatice Tül Kübra AKDUR

    2016-09-01

    Full Text Available In this article, a new test based on the Jonckheere test [1] for randomized blocks with dependent observations within blocks is presented. A weighted sum for each block statistic, rather than the unweighted sum proposed by Jonckheere, is included. For Jonckheere-type statistics, the main assumption is independence of observations within blocks. In repeated measures designs, the assumption of independence is violated. The weighted Jonckheere-type statistic is used for the situation of dependence, for different variance-covariance structures, and for the ordered alternative hypothesis structure of each block in the design. The proposed statistic is also compared to the existing Jonckheere-based test in terms of type I error rates by Monte Carlo simulation. For strong correlations, the circular bootstrap version of the proposed Jonckheere test provides lower type I error rates.
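
    For context, the classical Jonckheere-Terpstra statistic that the article modifies accumulates, over ordered pairs of groups, the number of observation pairs consistent with the hypothesized ordering (ties counting 1/2). A minimal sketch for independent observations; the article's weighted, repeated-measures version is not reproduced here.

```python
from itertools import combinations

def jonckheere_statistic(groups):
    """Classical Jonckheere-Terpstra statistic for an ordered alternative:
    for every pair of groups (in hypothesized order), count observation
    pairs that agree with the ordering; ties contribute 1/2."""
    j = 0.0
    for g_low, g_high in combinations(groups, 2):
        for x in g_low:
            for y in g_high:
                j += (y > x) + 0.5 * (y == x)
    return j

# Hypothetical dose groups ordered low -> high.
groups = [[3.1, 2.8, 3.0], [3.4, 3.2, 3.6], [3.9, 4.1, 3.7]]
print(jonckheere_statistic(groups))  # 27.0: full agreement with ordering
```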

  16. Quasi-experimental study designs series-paper 1: introduction: two historical lineages.

    Science.gov (United States)

    Bärnighausen, Till; Røttingen, John-Arne; Rockers, Peter; Shemilt, Ian; Tugwell, Peter

    2017-09-01

    The objective of this study was to contrast the historical development of experiments and quasi-experiments and provide the motivation for a journal series on quasi-experimental designs in health research. A short historical narrative, with concrete examples, and arguments based on an understanding of the practice of health research and evidence synthesis. Health research has played a key role in developing today's gold standard for causal inference: the randomized, controlled, multiply blinded trial. Historically, allocation approaches developed from convenience and purposive allocation to alternate and, finally, to random allocation. This development was motivated both by concerns for manipulation in allocation as well as statistical and theoretical developments demonstrating the power of randomization in creating counterfactuals for causal inference. In contrast to the sequential development of experiments, quasi-experiments originated at very different points in time, from very different scientific perspectives, and with frequent and long interruptions in their methodological development. Health researchers have only recently started to recognize the value of quasi-experiments for generating novel insights on causal relationships. While quasi-experiments are unlikely to replace experiments in generating the efficacy and safety evidence required for clinical guidelines and regulatory approval of medical technologies, quasi-experiments can play an important role in establishing the effectiveness of health care practice, programs, and policies. The papers in this series describe and discuss a range of important issues in utilizing quasi-experimental designs for primary research and quasi-experimental results for evidence synthesis. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Experimental Design of Electrocoagulation and Magnetic Technology for Enhancing Suspended Solids Removal from Synthetic Wastewater

    Directory of Open Access Journals (Sweden)

    Moh Faiqun Ni'am

    2014-10-01

    Full Text Available Design of experiments (DOE) is a statistical method used as a tool to enhance and improve experimental quality. Changes to the variables of a process or system are expected to give the optimal result (response). Experimental design can be defined as a test or series of tests in which the input variables (factors) of a process are varied so that the resulting changes in the output (response) can be identified. This paper presents the results of an experimental design of wastewater treatment by the electrocoagulation (EC) technique. A combined magnet and electrocoagulation (EC) technology was designed to increase settling velocity and to enhance suspended solids removal efficiencies from wastewater samples. In this experiment, synthetic wastewater samples were prepared by mixing 700 mg of milk powder in one litre of water and treated using an acidic buffer solution. Monopolar iron (Fe) plate anodes and cathodes were employed as electrodes. Direct current was varied in the range of 0.5 to 1.1 A, and flow rate in the range of 1.00 to 3.50 mL/s. One permanent magnet, namely AlNiCo, with a magnetic strength of 0.16 T was used in this experiment. The results show that the magnetic field and the flow rate have major influences on suspended solids removal. The removal efficiencies of suspended solids, turbidity and COD at optimum conditions were found to be more than 85%, 95%, and 75%, respectively.

  18. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    Directory of Open Access Journals (Sweden)

    Chaeyoung Lee

    2012-11-01

    Full Text Available Epistasis, which may explain a large portion of the phenotypic variation for complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.
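
    The Gibbs sampler at the heart of this approach draws each unknown in turn from its full conditional distribution. A generic toy sketch on a bivariate normal target follows; the record's actual target, the posterior of multilocus genotype effects, is not reproduced here.

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.6, n_iter=5000, seed=1):
    """Generic Gibbs sampler: alternately draw each coordinate of a
    standard bivariate normal with correlation rho from its full
    conditional, x | y ~ N(rho*y, 1 - rho^2) and symmetrically for y."""
    rng = np.random.default_rng(seed)
    x = y = 0.0
    samples = np.empty((n_iter, 2))
    cond_sd = np.sqrt(1.0 - rho**2)
    for i in range(n_iter):
        x = rng.normal(rho * y, cond_sd)  # draw x | y
        y = rng.normal(rho * x, cond_sd)  # draw y | x
        samples[i] = (x, y)
    return samples

draws = gibbs_bivariate_normal()
print(np.corrcoef(draws[1000:].T)[0, 1])  # roughly rho after burn-in
```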

  19. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship – Quasi-Experimental Designs

    Science.gov (United States)

    Schweizer, Marin L.; Braun, Barbara I.; Milstone, Aaron M.

    2016-01-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt non-randomized interventions. Quasi-experimental studies can be categorized into three major types: interrupted time series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. PMID:27267457

  20. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  1. Design activities of a fusion experimental breeder

    International Nuclear Information System (INIS)

    Huang, J.; Feng, K.; Sheng, G.

    1999-01-01

    The fusion reactor design studies in China are under the support of a fusion-fission hybrid reactor research Program. The purpose of this program is to explore the potential near-term application of fusion energy to support the long-term fusion energy on the one hand and the fission energy development on the other. During 1992-1996 a detailed consistent and integral conceptual design of a Fusion Experimental Breeder, FEB was completed. Beginning from 1996, a further design study towards an Engineering Outline Design of the FEB, FEB-E, has started. The design activities are briefly given. (author)

  2. Design activities of a fusion experimental breeder

    International Nuclear Information System (INIS)

    Huang, J.; Feng, K.; Sheng, G.

    2001-01-01

    The fusion reactor design studies in China are under the support of a fusion-fission hybrid reactor research Program. The purpose of this program is to explore the potential near-term application of fusion energy to support the long-term fusion energy on the one hand and the fission energy development on the other. During 1992-1996 a detailed consistent and integral conceptual design of a Fusion Experimental Breeder, FEB was completed. Beginning from 1996, a further design study towards an Engineering Outline Design of the FEB, FEB-E, has started. The design activities are briefly given. (author)

  3. An Introduction to Experimental Design Research

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2016-01-01

    Design research brings together influences from the whole gamut of social, psychological, and more technical sciences to create a tradition of empirical study stretching back over 50 years (Horvath 2004; Cross 2007). A growing part of this empirical tradition is experimental, which has gained in ...

  4. Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.

    Science.gov (United States)

    Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B

    2018-01-24

    The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.

  5. Status of experimental data for the VHTR core design

    Energy Technology Data Exchange (ETDEWEB)

    Park, Won Seok; Chang, Jong Hwa; Park, Chang Kue

    2004-05-01

    The VHTR (Very High Temperature Reactor) is emerging as a next-generation nuclear reactor to demonstrate emission-free nuclear-assisted electricity and hydrogen production. The VHTR could be either a prismatic or a pebble type helium cooled, graphite moderated reactor. The final decision will be made after the completion of the pre-conceptual design for each type. For the pre-conceptual design of both types, computational tools are being developed. Experimental data are required to validate the tools to be developed. Many experiments on HTGR (High Temperature Gas-cooled Reactor) cores have been performed to confirm the design data and to validate the design tools. The applicability and availability of the existing experimental data for the VHTR core design have been investigated in this report.

  6. Using factorial experimental design to evaluate the separation of plastics by froth flotation.

    Science.gov (United States)

    Salerno, Davide; Jordão, Helga; La Marca, Floriana; Carvalho, M Teresa

    2018-03-01

    This paper proposes the use of factorial experimental design as a standard experimental method in the application of froth flotation to plastic separation instead of the commonly used OVAT method (manipulation of one variable at a time). Furthermore, as is common practice in minerals flotation, the parameters of the kinetic model were used as process responses rather than the recovery of plastics in the separation products. To explain and illustrate the proposed methodology, a set of 32 experimental tests was performed using mixtures of two polymers with approximately the same density, PVC and PS (with mineral charges), with particle size ranging from 2 to 4 mm. The manipulated variables were frother concentration, air flow rate and pH. A three-level full factorial design was conducted. The models establishing the relationships between the manipulated variables and their interactions with the responses (first order kinetic model parameters) were built. The Corrected Akaike Information Criterion was used to select the best fit model and an analysis of variance (ANOVA) was conducted to identify the statistically significant terms of the model. It was shown that froth flotation can be used to efficiently separate PVC from PS with mineral charges by reducing the floatability of PVC, which largely depends on the action of pH. Within the tested interval, this is the factor that most affects the flotation rate constants. The results obtained show that the pure error may be of the same magnitude as the sum of squares of the errors, suggesting that there is significant variability within the same experimental conditions. Thus, special care is needed when evaluating and generalizing the process. Copyright © 2017 Elsevier Ltd. All rights reserved.
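
    To make the workflow concrete, here is a minimal Python sketch (not the paper's code; all factor levels and recovery values are invented placeholders) of the two steps the abstract describes: enumerating the three-level full factorial over frother concentration, air flow rate and pH, and fitting the first-order kinetic model R(t) = Rmax(1 - exp(-kt)) whose parameters serve as the process responses.

```python
import itertools
import numpy as np
from scipy.optimize import curve_fit

# Three factors at three levels -> 27 base runs (the study ran 32 tests in total).
frother = [5.0, 10.0, 15.0]   # ppm (hypothetical levels)
airflow = [2.0, 4.0, 6.0]     # L/min (hypothetical levels)
ph      = [3.0, 7.0, 11.0]
design = list(itertools.product(frother, airflow, ph))
print(f"{len(design)} runs, e.g. first run: {design[0]}")

def first_order(t, r_max, k):
    """Cumulative recovery after flotation time t."""
    return r_max * (1.0 - np.exp(-k * t))

# Fit the kinetic model to the timed recovery data of one run (synthetic here);
# (Rmax, k) then become the responses modelled against the three factors.
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0])             # min
recovery = np.array([0.18, 0.33, 0.52, 0.71, 0.82])  # fraction floated
(r_max, k), _ = curve_fit(first_order, t, recovery, p0=(0.9, 0.5))
print(f"Rmax = {r_max:.3f}, k = {k:.3f} 1/min")
```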

  7. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    Science.gov (United States)

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation in the analysis of quantitative proteomics data. We present key statistical concepts that help researchers design proteomics experiments, and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which implements the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots allow easy inspection and detection of anomalies in the data and flaws in the data analysis, enabling a deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher-quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users who aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.
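
    MSqRob itself is an R package; purely to illustrate the general idea behind peptide-based models (peptide-specific baselines plus a condition effect, estimated with a ridge penalty), the following hedged Python sketch fits such a model to synthetic data. Nothing here reproduces MSqRob's actual API or estimator details.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge

# Toy peptide-level table for one protein: log2 intensities of 3 peptides
# measured in 2 conditions (A/B), 3 replicates each (synthetic numbers).
rng = np.random.default_rng(1)
rows = []
for pep in range(3):
    for cond in (0, 1):          # 0 = condition A, 1 = condition B
        for rep in range(3):
            y = 20 + pep * 0.8 + cond * 1.2 + rng.normal(0, 0.2)
            rows.append((pep, cond, y))
df = pd.DataFrame(rows, columns=["peptide", "condition", "log2_intensity"])

# Design matrix: peptide dummies (peptide-specific baselines) + condition effect.
X = np.column_stack([(df["peptide"] == p).astype(float) for p in (0, 1, 2)]
                    + [df["condition"].to_numpy(float)])
model = Ridge(alpha=1.0, fit_intercept=True).fit(X, df["log2_intensity"])
print(f"estimated log2 fold change B vs A: {model.coef_[-1]:.2f}")
```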

  8. Experimental site and design

    Energy Technology Data Exchange (ETDEWEB)

    Guenette, C. C. [SINTEF Applied Chemistry, Trondheim (Norway)

    1999-08-01

    Design and site selection criteria for the Svalbard oil spill experiments are described. All three experimental sites have coarse and mixed sediment beaches of sand and pebble; within each site wave exposure is very similar; along-shore and across-shore sediment characteristics are also relatively homogeneous. Tidal range is in the order of 0.6 m at neaps, and 1.8 m at springs. All three sites are open to wave action and are ice-free during the experimental period of mid-July to mid-October. Study plots at each site were selected for different treatments from within the continuous stretch of oiled shoreline, with oiled buffer zones between plots and at either end of the oiled zone. Treatments included mixing (tilling), sediment relocation (surf washing) and bioremediation (nutrient enrichment). Measurements and observations were carried out during the summers of 1997 and 1998. The characteristics measured were: wave and wind conditions; beach topography and elevation; sediment grain size distribution; mineral fines size distribution and mineral composition; background hydrocarbons; concentration of oil within experimental plots and the rate of oil loss over time; depth of oil penetration and thickness of the oiled sediment layer; oil concentration and toxicity of near-shore benthic sediments; mineral composition of suspended particulate material captured in sub-tidal sediment traps; and oil-fines interaction in near-shore water samples. 1 fig.

  9. Experimental site and design

    Energy Technology Data Exchange (ETDEWEB)

    Guenette, C. C. [SINTEF Applied Chemistry, Trondheim (Norway)

    1999-07-01

    Design and site selection criteria for the Svalbard oil spill experiments are described. All three experimental sites have coarse and mixed sediment beaches of sand and pebble; within each site wave exposure is very similar; along-shore and across-shore sediment characteristics are also relatively homogeneous. Tidal range is in the order of 0.6 m at neaps, and 1.8 m at springs. All three sites are open to wave action and are ice-free during the experimental period of mid-July to mid-October. Study plots at each site were selected for different treatments from within the continuous stretch of oiled shoreline, with oiled buffer zones between plots and at either end of the oiled zone. Treatments included mixing (tilling), sediment relocation (surf washing) and bioremediation (nutrient enrichment). Measurements and observations were carried out during the summers of 1997 and 1998. The characteristics measured were: wave and wind conditions; beach topography and elevation; sediment grain size distribution; mineral fines size distribution and mineral composition; background hydrocarbons; concentration of oil within experimental plots and the rate of oil loss over time; depth of oil penetration and thickness of the oiled sediment layer; oil concentration and toxicity of near-shore benthic sediments; mineral composition of suspended particulate material captured in sub-tidal sediment traps; and oil-fines interaction in near-shore water samples. 1 fig.

  10. iCFD: Interpreted Computational Fluid Dynamics – Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design – The secondary clarifier

    DEFF Research Database (Denmark)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat

    2015-01-01

    ... using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor ... of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization ...
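
    The screening step mentioned here relies on two-level fractional factorial designs. As a hedged illustration (a generic construction, not the paper's actual factors), a half fraction for four factors can be generated by aliasing the fourth factor onto the three-way interaction:

```python
import itertools
import numpy as np

# Base 2^3 full factorial in coded -1/+1 units for factors A, B, C,
# then alias the fourth factor onto the three-way interaction: D = ABC.
base = np.array(list(itertools.product([-1, 1], repeat=3)))
design = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])
print(design)          # 8 runs screen 4 factors (a 2^(4-1) half fraction)

# Screening: main effect of column j = mean(y | +1) - mean(y | -1).
rng = np.random.default_rng(0)
y = 5 + 2 * design[:, 0] - 1.5 * design[:, 3] + rng.normal(0, 0.2, 8)
effects = [(y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean())
           for j in range(4)]
print(np.round(effects, 2))   # large |effect| -> factor kept for further runs
```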

  11. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    Science.gov (United States)

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize them, allowing the use of parametric tests. Alternatively, with skewed data, nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premises and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
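
    A small Python sketch of the decision the chapter describes, on synthetic skewed data: either log-transform and apply Student's t test, or fall back to a rank-based nonparametric test. The data and seed are arbitrary placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
control = rng.lognormal(mean=1.0, sigma=0.4, size=12)   # right-skewed data
treated = rng.lognormal(mean=1.3, sigma=0.4, size=12)

# Check (approximate) normality before choosing the test.
print(f"Shapiro-Wilk p (control): {stats.shapiro(control).pvalue:.3f}")

# Option 1: log-transform to normalize, then Student's t test.
t_stat, p_t = stats.ttest_ind(np.log(control), np.log(treated))
# Option 2: rank-based nonparametric test on the raw skewed data.
u_stat, p_u = stats.mannwhitneyu(control, treated, alternative="two-sided")
print(f"t test on logs: p = {p_t:.4f}; Mann-Whitney U: p = {p_u:.4f}")
```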

  12. Multi-criteria optimization of the flesh melons skin separation process by experimental and statistical analysis methods

    Directory of Open Access Journals (Sweden)

    Y. B. Medvedkov

    2016-01-01

    Full Text Available Research and innovation aimed at creating energy-efficient processes for melon processing is a significant task. Separating the skin from the melon flesh, for subsequent use in new food products, is one of the most labour-intensive operations in this technology, and the lack of a scientific and experimental basis for this operation holds back the development of high-performance machines to carry it out. Accordingly, a technique for experiments on separating melon skins in a pilot plant is offered, together with a search for optimal operating regimes by statistical modelling methods. The late-ripening melon varieties Kalaysan, Thorlami and Gulab-sary are the objects of study. The interaction of factors influencing the melon-skin separation process is examined. A central composite rotatable design and a fractional factorial experiment were used. Using the method of experimental design, with treatment planning in the Design Expert v.10 software, regression equations were obtained that adequately describe the actual process. Rational intervals for the input factor values are established: the ratio of the rotational speed of the drum to the rotational frequency of the abrasive supply roll; the gap between the supply drum and the shearing knife; the shearing blade sharpening angle; the number of feed drum spikes; and the abrading drum orifice diameter. The mean square error does not exceed 12.4%. The regression equations are interpreted graphically by scatter plots and engineering nomograms that allow prediction of rational input factor values for three optimization criteria: minimal specific energy consumption in the cutting process, maximal specific performance by the pulp, and the pulp extraction ratio. The data obtained can be used for operational management of the process parameters, taking into account the geometrical dimensions of the melon and its inhomogeneous structure.
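
    As an illustration of the design machinery referenced here (a generic coded-unit construction, not the authors' Design Expert setup), a central composite rotatable design can be generated as cube points, axial points at alpha = (2^k)^(1/4), and centre points:

```python
import itertools
import numpy as np

def ccd_rotatable(k, n_center=6):
    """Central composite rotatable design in coded units for k factors."""
    cube = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    alpha = (2 ** k) ** 0.25          # rotatability: alpha = (n_cube)^(1/4)
    axial = np.zeros((2 * k, k))
    for j in range(k):
        axial[2 * j, j] = -alpha
        axial[2 * j + 1, j] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([cube, axial, center])

# Three coded factors, e.g. drum speed ratio, gap, blade angle (hypothetical).
design = ccd_rotatable(3)
print(design.shape)        # (2^3 + 2*3 + 6, 3) = (20, 3)
```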

  13. Optimization and evaluation of clarithromycin floating tablets using experimental mixture design.

    Science.gov (United States)

    Uğurlu, Timucin; Karaçiçek, Uğur; Rayaman, Erkan

    2014-01-01

    The purpose of the study was to prepare and evaluate clarithromycin (CLA) floating tablets using experimental mixture design for the treatment of Helicobacter pylori, providing prolonged gastric residence time and a controlled plasma level. Ten different formulations were generated based on different molecular weights of hypromellose (HPMC K100, K4M, K15M) using a simplex lattice design (a sub-class of mixture design) with Minitab 16 software. Sodium bicarbonate and anhydrous citric acid were used as gas-generating agents. Tablets were prepared by the wet granulation technique. All of the process variables were fixed. Results of cumulative drug release at the 8th hour (CDR 8th) were statistically analyzed to obtain the optimized formulation (OF). The optimized formulation, which gave a floating lag time lower than 15 s and a total floating time of more than 10 h, was analyzed and compared with the target for CDR 8th (80%). Good agreement was shown between predicted and actual values of CDR 8th, with a variation lower than 1%. The activity of clarithromycin in the optimized formula against H. pylori was quantified using a well diffusion agar assay. Diameters of inhibition zones vs. log10 clarithromycin concentrations were plotted in order to obtain a standard curve and clarithromycin activity.
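
    For illustration, a simplex lattice design over mixture components can be enumerated directly; the sketch below lists the {3,2} lattice for three components (e.g. the three HPMC grades), in coded proportions. This is a generic construction, not the Minitab 16 output used in the study.

```python
from fractions import Fraction
import itertools

def simplex_lattice(q, m):
    """All q-component mixtures with proportions i/m summing to 1."""
    pts = []
    for combo in itertools.product(range(m + 1), repeat=q):
        if sum(combo) == m:
            pts.append(tuple(Fraction(c, m) for c in combo))
    return pts

# Three mixture components (hypothetically HPMC K100, K4M, K15M), degree m=2.
for point in simplex_lattice(3, 2):
    print([str(p) for p in point])  # e.g. ['1', '0', '0'], ['1/2', '1/2', '0'], ...
```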

  14. System design overview of JAXA small supersonic experimental airplane (NEXST-1)

    OpenAIRE

    Takami, Hikaru; 高見 光

    2007-01-01

    The system of the JAXA small supersonic experimental airplane (NEXST-1: National EXperimental Supersonic Transport-1) is briefly explained. Some design problems that the designers encountered are also briefly discussed.

  15. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    Science.gov (United States)

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research-in particular for the evaluation of health care practice, programs, and policy-because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Design of an experimental model to study the behavior of unsaturated fats in the preparation of meat emulsions

    Directory of Open Access Journals (Sweden)

    Javier F. Rey

    2009-08-01

    Full Text Available The essence of good experimental planning consists in projecting an experiment so that it yields exactly the type of information sought. The present work seeks to determine the quality of meat products elaborated with unsaturated vegetable fats, their yield, and hence their cost relative to traditional products. To that end, an experimental design is proposed that controls variables such as fat type, use temperature and cutting time, taking into account the physicochemical and biochemical phenomena that occur, beginning with control of the composition of the meat and fat used as raw materials during the trial. The experimental design employed a statistical model of complete factorial planning with 3 variables and 2 levels, for 15 trials with one replicate. With the variables to control identified as fat type, fat use temperature and cutting time, the proposed experimental design was applied, yielding the equation that solves the identified problem and facilitates the use of unsaturated fats in a process of elaboration of meat emulsions.
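
    A hedged sketch of the analysis this abstract outlines: a 2-level, 3-factor full factorial in coded units, fitted with a regression model including interactions to obtain the design "equation". The response values are synthetic and the factor names merely echo the abstract.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# 2^3 factorial in coded units, run twice (original + replicate).
runs = np.array(list(itertools.product([-1, 1], repeat=3)))
runs = np.vstack([runs, runs])
rng = np.random.default_rng(3)
df = pd.DataFrame(runs, columns=["fat_type", "temperature", "cutting_time"])
# Synthetic emulsion-quality response with main effects + one interaction.
df["quality"] = (70 + 4 * df.fat_type - 3 * df.temperature
                 + 2 * df.fat_type * df.temperature + rng.normal(0, 1, len(df)))

# Fit the first-order model with interactions -> the "equation" of the design.
fit = smf.ols("quality ~ fat_type * temperature * cutting_time", data=df).fit()
print(fit.params.round(2))
```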

  17. Design and experimentation of BSFQ logic devices

    International Nuclear Information System (INIS)

    Hosoki, T.; Kodaka, H.; Kitagawa, M.; Okabe, Y.

    1999-01-01

    Rapid single flux quantum (RSFQ) logic needs synchronous pulses for each gate, so the clock-wiring problem becomes more serious when designing larger-scale circuits with this logic. We have therefore proposed a new SFQ logic which follows Boolean algebra perfectly by using set and reset pulses. With this logic, the level information of the current input is transmitted with these pulses, generated by level-to-pulse converters, and each gate computes logic using its phase level made by these pulses. Therefore, our logic needs no clock in each gate. We call this logic 'Boolean SFQ (BSFQ) logic'. In this paper, we report the design and experimental testing of an AND gate with inverting input based on BSFQ logic. Experimental results for OR and XOR gates are also reported. (author)

  18. Using IMPRINT to Guide Experimental Design with Simulated Task Environments

    Science.gov (United States)

    2015-06-18

    Using IMPRINT to Guide Experimental Design with Simulated Task Environments. Thesis (AFIT-ENG-MS-15-J-052), presented to the Faculty, by Gregory ..., Civilian, USAF, June 2015. DISTRIBUTION STATEMENT A: Approved for public release; distribution unlimited.

  19. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    Energy Technology Data Exchange (ETDEWEB)

    Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-28

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing them. PSD advances rely on improvements to the implemented algorithm, and such improvements can draw on conventional statistical-classifier and machine-learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
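
    The reporting recommendation can be made concrete with a short sketch: computing the ROC curve of a (synthetic) PSD score and reading off the neutron acceptance at a chosen gamma rejection rate. The score distributions and the GRR value are placeholders.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Synthetic PSD scores: higher score = more neutron-like pulse.
gamma_scores = rng.normal(0.0, 1.0, 5000)    # label 0
neutron_scores = rng.normal(2.0, 1.0, 5000)  # label 1
y = np.r_[np.zeros(5000), np.ones(5000)]
scores = np.r_[gamma_scores, neutron_scores]

fpr, tpr, thr = roc_curve(y, scores)
# Behavior at a gamma rejection rate (GRR) of 99.9%: gamma acceptance <= 0.1%.
grr = 0.999
idx = np.searchsorted(fpr, 1.0 - grr, side="right") - 1
print(f"at GRR={grr:.3f}: neutron acceptance = {tpr[idx]:.3f}, "
      f"threshold = {thr[idx]:.2f}")
```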

  20. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    , sample extraction, and analytical methods to be used in the INL-2 study. For each of the five test events, the specified floor of the INL building will be contaminated with BG using a point-release device located in the room specified in the experimental design. Then quality control (QC), reference material coupon (RMC), judgmental, and probabilistic samples will be collected according to the sampling plan for each test event. Judgmental samples will be selected based on professional judgment and prior information. Probabilistic samples were selected with a random aspect and in sufficient numbers to provide desired confidence for detecting contamination or clearing uncontaminated (or decontaminated) areas. Following sample collection for a given test event, the INL building will be decontaminated. For possibly contaminated areas, the numbers of probabilistic samples were chosen to provide 95% confidence of detecting contaminated areas of specified sizes. For rooms that may be uncontaminated following a contamination event, or for whole floors after decontamination, the numbers of judgmental and probabilistic samples were chosen using the CJR approach. The numbers of samples were chosen to support making X%/Y% clearance statements with X = 95% or 99% and Y = 96% or 97%. The experimental and sampling design also provides for making X%/Y% clearance statements using only probabilistic samples. For each test event, the numbers of characterization and clearance samples were selected within limits based on operational considerations while still maintaining high confidence for detection and clearance aspects. The sampling design for all five test events contains 2085 samples, with 1142 after contamination and 943 after decontamination. These numbers include QC, RMC, judgmental, and probabilistic samples. The experimental and sampling design specified in this report provides a good statistical foundation for achieving the objectives of the INL-2 study.

  1. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. Comprehensive examination of the details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (eg the Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazard model for considering competing risks were sometimes used for more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in light of the information found in some publications. Use of adaptive designs with interim analyses is increasing.

  2. Design and Experimental Study on Spinning Solid Rocket Motor

    Science.gov (United States)

    Xue, Heng; Jiang, Chunlan; Wang, Zaicheng

    A study of a spinning solid rocket motor (SRM) used as the power plant of the twice-throwing structure of an aerial submunition is introduced. This SRM, with its tangential multi-nozzle structure, consists of a combustion chamber, propellant charge, 4 tangential nozzles, an ignition device, etc. Grain design, structure design and prediction of interior ballistic performance are described, and the main problems to be considered in the design are analyzed comprehensively. Finally, to investigate the working performance of the SRM and to measure its pressure-time curve and spin speed, static and dynamic tests were conducted, respectively. Calculated values and experimental data were then compared and analyzed. The results indicate that the designed motor operates normally and that its interior ballistic performance is stable and meets demands. The experimental results provide guidance for the preliminary design of such SRMs.

  3. Effect of experimental factors on magnetic properties of nickel nanoparticles produced by chemical reduction method using a statistical design

    International Nuclear Information System (INIS)

    Vaezi, M.R.; Barzgar Vishlaghi, M.; Farzalipour Tabriz, M.; Mohammad Moradi, O.

    2015-01-01

    Highlights: • Superparamagnetic nickel nanoparticles are synthesized by wet chemical reduction. • Effects of synthesis parameters on magnetic properties are studied. • A central composite experimental design is used to build an empirical model. • The solvents ratio was more influential than the reactants mixing rate. - Abstract: Nickel nanoparticles were synthesized by the chemical reduction method in the absence of any surface capping agent. The effects of the reactants mixing rate and the volume ratio of methanol/ethanol as solvent on the morphology and magnetic properties of the nickel nanoparticles were studied by design of experiments using a central composite design. X-ray diffraction (XRD) and Transmission Electron Microscopy (TEM) were utilized to characterize the synthesized nanoparticles. The size distribution of the particles was studied by the Dynamic Light Scattering (DLS) technique, and the magnetic properties of the produced nanoparticles were investigated with a Vibrating Sample Magnetometer (VSM). The results showed that the magnetic properties of the nickel nanoparticles were more influenced by the volume ratio of methanol/ethanol than by the reactants mixing rate. Superparamagnetic nickel nanoparticles with sizes between 20 and 50 nm were achieved when the solvent was pure methanol and the reactants mixing rate was kept at 70 ml/h. However, the addition of more ethanol to the precursor solvent led to the formation of larger particles with a broader size distribution and weak ferromagnetic or superparamagnetic behavior.

  4. A course in statistics with R

    CERN Document Server

    Tattar, Prabhanjan N; Manjunath, B G

    2016-01-01

    Integrates the theory and applications of statistics using R A Course in Statistics with R has been written to bridge the gap between theory and applications and explain how mathematical expressions are converted into R programs. The book has been primarily designed as a useful companion for a Masters student during each semester of the course, but will also help applied statisticians in revisiting the underpinnings of the subject. With this dual goal in mind, the book begins with R basics and quickly covers visualization and exploratory analysis. Probability and statistical inference, inclusive of classical, nonparametric, and Bayesian schools, is developed with definitions, motivations, mathematical expression and R programs in a way which will help the reader to understand the mathematical development as well as R implementation. Linear regression models, experimental designs, multivariate analysis, and categorical data analysis are treated in a way which makes effective use of visualization techniques and...

  5. Quasi-experimental study designs series-paper 10: synthesizing evidence for effects collected from quasi-experimental studies presents surmountable challenges.

    Science.gov (United States)

    Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter

    2017-09-01

    To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.
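
    As a hedged illustration of the kind of model proposed (a fixed-effect, inverse-variance weighted meta-regression with design features of the QEs as moderators; all effect sizes and moderator codings below are synthetic):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic per-study data: effect size d, its variance, and two moderators
# coding QE design features (1 = study used a control group / matching).
d   = np.array([0.40, 0.10, 0.55, 0.22, 0.35, 0.05, 0.48, 0.18])
var = np.array([0.02, 0.05, 0.03, 0.04, 0.02, 0.06, 0.03, 0.05])
control_group = np.array([1, 0, 1, 0, 1, 0, 1, 0])
matched       = np.array([1, 0, 0, 1, 1, 0, 1, 0])

# Weighted least squares with inverse-variance weights = meta-regression.
X = sm.add_constant(np.column_stack([control_group, matched]))
fit = sm.WLS(d, X, weights=1.0 / var).fit()
print(fit.params.round(3))   # intercept + adjustments for design features
```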

  6. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load-carrying structures are subjected to dynamic time-varying loading conditions. Robust structural dynamics identification procedures impose tight constraints on the quality of modal models ... This paper aims at a systematic approach for uncertainty quantification of the parameters of the modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures ... represent different complexity levels ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.

  7. Statistical evaluation of design-error related nuclear reactor accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1981-01-01

    In this paper, a general methodology for the statistical evaluation of design-error related accidents is proposed that can be applied to a variety of systems that evolve during the development of large-scale technologies. The evaluation aims at an estimate of the combined "residual" frequency of yet unknown types of accidents "lurking" in a certain technological system. A special categorization into incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of U.S. nuclear power reactor technology, considering serious accidents (category 2 events) that involved, in the accident progression, a particular design inadequacy. 9 refs

  8. Statistical Change Detection for Diagnosis of Buoyancy Element Defects on Moored Floating Vessels

    DEFF Research Database (Denmark)

    Blanke, Mogens; Fang, Shaoji; Galeazzi, Roberto

    2012-01-01

    ... After residual generation, a statistical change detection scheme is derived from mathematical models supported by experimental data. To experimentally verify the loss of an underwater buoyancy element, an underwater line breaker is designed to create a realistic replication of abrupt faults. The paper analyses the properties of the residuals and suggests a dedicated GLRT change detector based on a vector residual. Special attention is paid to threshold selection for non-ideal (non-IID) test statistics.
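
    A minimal sketch of a GLR change detector for an abrupt jump in the mean of a residual with known noise variance, in the spirit of the detector described (scalar case with synthetic data; the real detector is vector-valued, with thresholds tuned to non-IID statistics):

```python
import numpy as np

def glrt_mean_change(residual, sigma2, window):
    """Sliding-window GLR statistic for a jump in the mean of a residual
    with known noise variance sigma2: g_k = window * mean^2 / (2 * sigma2)."""
    g = np.zeros(len(residual))
    for k in range(window, len(residual)):
        m = residual[k - window:k].mean()
        g[k] = window * m * m / (2.0 * sigma2)
    return g

rng = np.random.default_rng(5)
r = rng.normal(0, 0.5, 400)
r[250:] += 1.0                     # abrupt fault: buoyancy element lost
g = glrt_mean_change(r, sigma2=0.25, window=30)
threshold = 10.0                   # in practice set from a false-alarm spec
alarms = np.flatnonzero(g > threshold)
print("first alarm at sample", alarms[0] if alarms.size else None)
```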

  9. Application of Statistical Design to the Optimization of Culture Medium for Prodigiosin Production by Serratia marcescens SWML08

    Directory of Open Access Journals (Sweden)

    Venil, C. K.

    2009-01-01

    Full Text Available A combination of Plackett–Burman design (PBD) and Box–Behnken design (BBD) was applied to optimize different factors for prodigiosin production by Serratia marcescens SWML08. Among 11 factors, incubation temperature and supplementation of (NH4)2PO4 and trace salts into the culture medium were selected due to their significant positive effect on prodigiosin yield. The Box–Behnken design, a response surface methodology, was used for further optimization of these selected factors for better prodigiosin output. Data were analyzed stepwise and a second-order polynomial model was established to identify the relationship between the prodigiosin output and the selected factors. The optimized media formulation comprised incubation temperature 30 °C, (NH4)2PO4 6 g/L and trace salts 0.6 g/L. The maximum experimental response for prodigiosin production was 1397.96 mg/L whereas the predicted value was 1394.26 mg/L. The high correlation between the predicted and observed values indicated the validity of the statistical design.

  10. A statistical experimental design approach to evaluate the influence of various penetration enhancers on transdermal drug delivery of buprenorphine

    Directory of Open Access Journals (Sweden)

    S.Mojtaba Taghizadeh

    2015-03-01

    Full Text Available A series of drug-in-adhesive transdermal drug delivery systems (patches) with different chemical penetration enhancers were designed to deliver drug through the skin as the site of application. The objective of our effort was to study the influence of various chemical penetration enhancers on the skin permeation rate and adhesion properties of a transdermal drug delivery system using a Box–Behnken experimental design. The response surface methodology, based on a three-level, three-variable Box–Behnken design, was used to evaluate the interactive effects on the dependent variables, namely the rate of skin permeation and the adhesion properties (peel strength and tack value). Levulinic acid, lauryl alcohol, and Tween 80 were used as penetration enhancers (patch formulations containing 0–8% of each chemical penetration enhancer). Buprenorphine was used as a model penetrant drug. The results showed that incorporation of 20% chemical penetration enhancer into the mixture led to maximum skin permeation flux of buprenorphine from abdominal rat skin, while the adhesion properties decreased. The skin flux in the presence of levulinic acid (1.594 μg/cm2 h) was higher than with Tween 80 (1.473 μg/cm2 h) or lauryl alcohol (0.843 μg/cm2 h), and on mixing these enhancers together an additional effect was observed. Moreover, it was found that each enhancer increased the tack value, while levulinic acid and lauryl alcohol improved the peel strength but Tween 80 reduced it. These findings indicated that the best chemical skin penetration enhancer for the buprenorphine patch was levulinic acid. Among the designed formulations, the one which contained 12% (wt/wt) enhancers exhibited the highest efficiency.

  11. A statistical experimental design approach to evaluate the influence of various penetration enhancers on transdermal drug delivery of buprenorphine.

    Science.gov (United States)

    Taghizadeh, S Mojtaba; Moghimi-Ardakani, Ali; Mohamadnia, Fatemeh

    2015-03-01

    A series of drug-in-adhesive transdermal drug delivery systems (patches) with different chemical penetration enhancers were designed to deliver drug through the skin as the site of application. The objective of our effort was to study the influence of various chemical penetration enhancers on the skin permeation rate and adhesion properties of a transdermal drug delivery system using a Box-Behnken experimental design. The response surface methodology, based on a three-level, three-variable Box-Behnken design, was used to evaluate the interactive effects on the dependent variables, namely the rate of skin permeation and the adhesion properties (peel strength and tack value). Levulinic acid, lauryl alcohol, and Tween 80 were used as penetration enhancers (patch formulations containing 0-8% of each chemical penetration enhancer). Buprenorphine was used as a model penetrant drug. The results showed that incorporation of 20% chemical penetration enhancer into the mixture led to maximum skin permeation flux of buprenorphine from abdominal rat skin, while the adhesion properties decreased. The skin flux in the presence of levulinic acid (1.594 μg/cm2 h) was higher than with Tween 80 (1.473 μg/cm2 h) or lauryl alcohol (0.843 μg/cm2 h), and on mixing these enhancers together an additional effect was observed. Moreover, it was found that each enhancer increased the tack value, while levulinic acid and lauryl alcohol improved the peel strength but Tween 80 reduced it. These findings indicated that the best chemical skin penetration enhancer for the buprenorphine patch was levulinic acid. Among the designed formulations, the one which contained 12% (wt/wt) enhancers exhibited the highest efficiency.

  12. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as S, ISP, or Minitab, or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or Latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language.

  13. Statistics for experimentalists

    CERN Document Server

    Cooper, B E

    2014-01-01

    Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...

  14. Statistical analysis of thermal conductivity of nanofluid containing ...

    Indian Academy of Sciences (India)

    Thermal conductivity measurements of nanofluids were analysed via a two-factor completely randomized design, and comparison of data means was carried out with Duncan's multiple-range test. Statistical analysis of the experimental data shows that temperature and weight fraction have a reasonable impact on the thermal ...
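
    A sketch of a two-factor completely randomized design analysis in Python on synthetic thermal-conductivity data (levels and effect sizes are invented; Duncan's multiple-range test is not shown, only the ANOVA):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for temp in (20, 40, 60):                 # temperature, deg C
    for wf in (0.5, 1.0, 2.0):            # weight fraction, %
        for _ in range(3):                # replicates (completely randomized)
            k = 0.60 + 0.002 * temp + 0.05 * wf + rng.normal(0, 0.01)
            rows.append((temp, wf, k))
df = pd.DataFrame(rows, columns=["temp", "wf", "k"])

# Two-factor ANOVA: main effects of each factor plus their interaction.
fit = smf.ols("k ~ C(temp) * C(wf)", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))
```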

  15. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, such as STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  16. Optimizing Nuclear Reaction Analysis (NRA) using Bayesian Experimental Design

    International Nuclear Information System (INIS)

    Toussaint, Udo von; Schwarz-Selinger, Thomas; Gori, Silvio

    2008-01-01

    Nuclear Reaction Analysis with 3He holds the promise of measuring deuterium depth profiles up to large depths. However, the extraction of the depth profile from the measured data is an ill-posed inversion problem. Here we demonstrate how Bayesian Experimental Design can be used to optimize the number of measurements as well as the measurement energies to maximize the information gain. Comparison of the inversion properties of the optimized design with standard settings reveals huge possible gains. Application of the posterior sampling method makes it possible to optimize the experimental settings interactively during the measurement process.
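
    A toy illustration of the Bayesian design idea: in a linear-Gaussian stand-in for the inversion problem, the expected information gain of a candidate measurement energy has a closed form, and the most informative energy can be selected by maximizing it. The sensitivity curve and all numbers are invented placeholders, not the physics of the actual NRA setup.

```python
import numpy as np

# Toy model: measurement y = s(E) * theta + noise, where s(E) is the
# sensitivity at beam energy E. For Gaussian prior/noise the expected
# information gain (EIG, in nats) is
#   0.5 * log(1 + s(E)^2 * sigma_prior^2 / sigma_noise^2).
sigma_prior, sigma_noise = 1.0, 0.3

def sensitivity(E):
    return np.exp(-((E - 2.5) ** 2))   # hypothetical peaked sensitivity

energies = np.linspace(0.5, 4.0, 36)   # candidate beam energies (MeV)
eig = 0.5 * np.log1p((sensitivity(energies) * sigma_prior / sigma_noise) ** 2)
best = energies[np.argmax(eig)]
print(f"most informative energy: {best:.2f} MeV, EIG = {eig.max():.2f} nats")
```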

  17. Software package for analysis of completely randomized block design

    African Journals Online (AJOL)

    This study designs and develops a statistical software package, OYSP1.0, which conveniently accommodates and analyzes the large masses of data emanating from experimental designs, in particular the completely randomized block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  18. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
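
    The five components map directly onto an a priori sample size calculation; for example, for a two-group comparison of a continuous outcome (assuming a Cohen's d of 0.5, alpha of 0.05 and power of 0.80, all placeholder choices):

```python
from statsmodels.stats.power import TTestIndPower

# Components: continuous outcome -> t test; design: two independent groups;
# effect size: Cohen's d (variance folded into d); solve for sample size.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.80, alternative="two-sided")
print(f"required n per group: {n_per_group:.1f}")   # about 64 per group
```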

  19. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    Science.gov (United States)

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…

  20. Designing experiments for maximum information from cyclic oxidation tests and their statistical analysis using half Normal plots

    International Nuclear Information System (INIS)

    Coleman, S.Y.; Nicholls, J.R.

    2006-01-01

    Cyclic oxidation testing at elevated temperatures requires careful experimental design and the adoption of standard procedures to ensure reliable data. This is a major aim of the 'COTEST' research programme. Further, as such tests are both time consuming and costly, in terms of human effort, to take measurements over a large number of cycles, it is important to gain maximum information from a minimum number of tests (trials). This search for standardisation of cyclic oxidation conditions leads to a series of tests to determine the relative effects of cyclic parameters on the oxidation process. Following a review of the available literature, databases and the experience of partners to the COTEST project, the most influential parameters, upper dwell temperature (oxidation temperature) and time (hot time), lower dwell time (cold time) and environment, were investigated in partners' laboratories. It was decided to test upper dwell temperature at 3 levels, at and equidistant from a reference temperature; to test upper dwell time at a reference, a higher and a lower time; to test lower dwell time at a reference and a higher time and wet and dry environments. Thus an experiment, consisting of nine trials, was designed according to statistical criteria. The results of the trial were analysed statistically, to test the main linear and quadratic effects of upper dwell temperature and hot time and the main effects of lower dwell time (cold time) and environment. The nine trials are a quarter fraction of the 36 possible combinations of parameter levels that could have been studied. The results have been analysed by half Normal plots as there are only 2 degrees of freedom for the experimental error variance, which is rather low for a standard analysis of variance. Half Normal plots give a visual indication of which factors are statistically significant. In this experiment each trial has 3 replications, and the data are analysed in terms of mean mass change, oxidation kinetics
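
    To illustrate the analysis device, the sketch below computes factorial effects from a synthetic two-level design and pairs their absolute values with half-normal quantiles; effects that deviate markedly from the line through the small effects would be judged significant. This is a generic illustration, not the COTEST data.

```python
import itertools
import numpy as np
from scipy import stats

# 2^3 factorial (e.g. temperature, hot time, cold time) in coded units.
X = np.array(list(itertools.product([-1, 1], repeat=3)))
rng = np.random.default_rng(4)
y = 10 + 3 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 0.3, 8)  # mass change

# Effect of each factor = difference between mean responses at +1 and -1.
effects = np.array([y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
                    for j in range(3)])
abs_eff = np.sort(np.abs(effects))
# Half-normal quantiles to plot |effects| against.
q = stats.halfnorm.ppf((np.arange(1, 4) - 0.5) / 3)
for qi, ei in zip(q, abs_eff):
    print(f"half-normal quantile {qi:.2f}  |effect| {ei:.2f}")
```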

  1. Experimental Design for Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2001-01-01

    This introductory tutorial gives a survey on the use of statistical designs for what-if or sensitivity analysis in simulation. This analysis uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as ...

  2. Statistical Optimisation of Fermentation Conditions for Citric Acid ...

    African Journals Online (AJOL)

    This study investigated the optimisation of fermentation conditions during citric acid production via solid state fermentation (SSF) of pineapple peels using Aspergillus niger. A three-variable, three-level Box-Behnken design (BBD) comprising 17 experimental runs was used to develop a statistical model for the fermentation ...
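
    As a generic illustration of the design named here, a three-factor Box-Behnken design (twelve edge midpoints plus centre points; with five centre points it matches the 17 runs mentioned) can be constructed directly:

```python
import itertools
import numpy as np

def box_behnken(k, n_center=3):
    """Box-Behnken design in coded units: +/-1 on each pair of factors,
    remaining factors at 0, plus center points."""
    runs = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product([-1, 1], repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k] * n_center
    return np.array(runs)

design = box_behnken(3, n_center=5)   # 12 edge runs + 5 centers = 17 runs
print(design.shape)
```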

  3. Quasi-experimental Studies in the Fields of Infection Control and Antibiotic Resistance, Ten Years Later: A Systematic Review.

    Science.gov (United States)

    Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D

    2018-02-01

    OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P < ...); ... justified the use of the quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.
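
    Segmented regression, the interrupted-time-series method tallied above, can be sketched briefly: regress the outcome on time, a post-intervention indicator (level change) and time-since-intervention (slope change). The data below are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Monthly infection rates over 48 months, intervention at month 24 (synthetic).
t = np.arange(48)
post = (t >= 24).astype(float)
rng = np.random.default_rng(9)
rate = (10 - 0.05 * t - 1.5 * post - 0.10 * post * (t - 24)
        + rng.normal(0, 0.3, 48))
df = pd.DataFrame({"t": t, "post": post,
                   "t_since": post * (t - 24), "rate": rate})

# Segmented regression: baseline trend, level change, and slope change.
fit = smf.ols("rate ~ t + post + t_since", data=df).fit()
print(fit.params.round(3))
```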

  4. Visualizing Experimental Designs for Balanced ANOVA Models using Lisp-Stat

    Directory of Open Access Journals (Sweden)

    Philip W. Iversen

    2004-12-01

    Full Text Available The structure, or Hasse, diagram described by Taylor and Hilton (1981, American Statistician) provides a visual display of the relationships between factors for balanced complete experimental designs. Using the Hasse diagram, rules exist for determining the appropriate linear model, ANOVA table, expected mean squares, and F-tests in the case of balanced designs. This procedure has been implemented in Lisp-Stat using a software representation of the experimental design. The user can interact with the Hasse diagram to add, change, or delete factors and see the effect on the proposed analysis. The system has potential uses in teaching and consulting.

  5. Bioinspiration: applying mechanical design to experimental biology.

    Science.gov (United States)

    Flammang, Brooke E; Porter, Marianne E

    2011-07-01

    The production of bioinspired and biomimetic constructs has fostered much collaboration between biologists and engineers, although the extent of biological accuracy employed in the designs produced has not always been a priority. Even the exact definitions of "bioinspired" and "biomimetic" differ among biologists, engineers, and industrial designers, leading to confusion regarding the level of integration and replication of biological principles and physiology. By any name, biologically-inspired mechanical constructs have become an increasingly important research tool in experimental biology, offering the opportunity to focus research by creating model organisms that can be easily manipulated to fill a desired parameter space of structural and functional repertoires. Innovative researchers with both biological and engineering backgrounds have found ways to use bioinspired models to explore the biomechanics of organisms from all kingdoms to answer a variety of different questions. Bringing together these biologists and engineers will hopefully result in an open discourse of techniques and fruitful collaborations for experimental and industrial endeavors.

  6. Summary of the experimental multi-purpose very high temperature gas cooled reactor design

    International Nuclear Information System (INIS)

    1984-12-01

    The report presents the design of the Multi-purpose Very High Temperature Gas Cooled Reactor (the Experimental VHTR) based on the second stage of detailed design, which was completed in March 1984, in the form of ''An application for reactor construction permit, Appendix 8''. The Experimental VHTR is designed to satisfy the design specifications of a reactor thermal output of 50 MW and a reactor outlet temperature of 950 °C. The adequacy of the design is also checked by safety analysis. The planning of the plant system and safety is summarized, covering safety design requirements and conformance with them, seismic design and plant arrangement. For the systems of the Experimental VHTR, the design basis, design data and components are described in order. (author)

  7. Design Issues and Inference in Experimental L2 Research

    Science.gov (United States)

    Hudson, Thom; Llosa, Lorena

    2015-01-01

    Explicit attention to research design issues is essential in experimental second language (L2) research. Too often, however, such careful attention is not paid. This article examines some of the issues surrounding experimental L2 research and its relationships to causal inferences. It discusses the place of research questions and hypotheses,…

  8. Use of Experimental Design for Peuhl Cheese Process Optimization ...

    African Journals Online (AJOL)

    Use of Experimental Design for Peuhl Cheese Process Optimization. ... This work, consisting in the use of a central composite design, enables the determination of optimal process conditions concerning: leaf extract volume added (7 mL), heating temperature ...

  9. Experimentally supported control design for a direct drive robot

    NARCIS (Netherlands)

    Kostic, D.; Jager, de A.G.; Steinbuch, M.

    2002-01-01

    We promote the idea of an experimentally supported control design as a successful way to achieve accurate tracking of reference robot motions, under disturbance conditions and given the uncertainties arising from modeling errors. The Hinf robust control theory is used for design of motion

  10. Approach toward enhancement of halophilic protease production by Halobacterium sp. strain LBU50301 using statistical design response surface methodology.

    Science.gov (United States)

    Chuprom, Julalak; Bovornreungroj, Preeyanuch; Ahmad, Mehraj; Kantachote, Duangporn; Dueramae, Sawitree

    2016-06-01

    A new potent halophilic protease producer, Halobacterium sp. strain LBU50301, was isolated from salt-fermented fish samples (budu) and identified by phenotypic analysis and 16S rDNA gene sequencing. Thereafter, a sequential statistical strategy was used to optimize halophilic protease production from Halobacterium sp. strain LBU50301 by shake-flask fermentation. The classical one-factor-at-a-time (OFAT) approach determined that gelatin was the best nitrogen source. Based on the Plackett–Burman (PB) experimental design, gelatin, MgSO4·7H2O, NaCl and pH significantly influenced the halophilic protease production. A central composite design (CCD) determined the optimum level of the medium components. Subsequently, an 8.78-fold increase in the corresponding halophilic protease yield (156.22 U/mL) was obtained, compared with that produced in the original medium (17.80 U/mL). Validation experiments proved the adequacy and accuracy of the model, and the results showed the predicted values agreed well with the experimental values. An overall 13-fold increase in halophilic protease yield was achieved using a 3 L laboratory fermenter and the optimized medium (231.33 U/mL).
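
    For illustration, the standard 12-run Plackett-Burman screening design used in studies like this one can be built by cyclically shifting a generator row and appending a row of minus ones. The generator row below is the commonly tabulated one for 12 runs; treat it as an assumption and verify against a design-of-experiments reference before use.

```python
import numpy as np

# 12-run Plackett-Burman screening design (up to 11 two-level factors).
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
rows = [np.roll(gen, i) for i in range(11)]   # 11 cyclic shifts
design = np.vstack(rows + [-np.ones(11, dtype=int)])
print(design.shape)   # (12, 11): 12 runs, 11 columns
# Main effect of column j: mean(y at +1) - mean(y at -1) over the 12 runs.
```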

  11. Approach toward enhancement of halophilic protease production by Halobacterium sp. strain LBU50301 using statistical design response surface methodology

    Directory of Open Access Journals (Sweden)

    Julalak Chuprom

    2016-06-01

    Full Text Available A new potent halophilic protease producer, Halobacterium sp. strain LBU50301, was isolated from salt-fermented fish samples (budu) and identified by phenotypic analysis and 16S rDNA gene sequencing. Thereafter, a sequential statistical strategy was used to optimize halophilic protease production from Halobacterium sp. strain LBU50301 by shake-flask fermentation. The classical one-factor-at-a-time (OFAT) approach determined that gelatin was the best nitrogen source. Based on the Plackett–Burman (PB) experimental design, gelatin, MgSO4·7H2O, NaCl and pH significantly influenced the halophilic protease production. A central composite design (CCD) determined the optimum level of the medium components. Subsequently, an 8.78-fold increase in the corresponding halophilic protease yield (156.22 U/mL) was obtained, compared with that produced in the original medium (17.80 U/mL). Validation experiments proved the adequacy and accuracy of the model, and the results showed the predicted values agreed well with the experimental values. An overall 13-fold increase in halophilic protease yield was achieved using a 3 L laboratory fermenter and the optimized medium (231.33 U/mL).

  12. Development and Validation of a Rubric for Diagnosing Students’ Experimental Design Knowledge and Difficulties

    Science.gov (United States)

    Dasgupta, Annwesa P.; Anderson, Trevor R.

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658

  13. Conceptual design of fusion experimental reactor (FER)

    International Nuclear Information System (INIS)

    1984-03-01

    A conceptual design study (option C) has been carried out for the fusion experimental reactor (FER). In addition to the design of the tokamak reactor and associated systems based on the reference design specifications, the feasibility of a water-shield reactor concept was examined as a topical study. The design study for the reference tokamak reactor has produced a reactor concept for the FER, along with major R&D items for the concept, based on close examination of thermal design, electromagnetics, neutronics and remote maintenance. Particular efforts have been directed to the area of electromagnetics. Detailed analyses with close simulation models have been performed on PF coil arrangements and configurations, shell effects of the blanket on plasma position instability, feedback control, and eddy currents during disruptions. The major design specifications are as follows: peak fusion power, 437 MW; major radius, 5.5 m; minor radius, 1.1 m; plasma elongation, 1.5; plasma current, 5.3 MA; toroidal beta, 4%; field on axis, 5.7 T. (author)

  14. Testing the Developmental Origins of Health and Disease Hypothesis for Psychopathology Using Family-Based Quasi-Experimental Designs

    Science.gov (United States)

    D’Onofrio, Brian M.; Class, Quetzal A.; Lahey, Benjamin B.; Larsson, Henrik

    2014-01-01

    The Developmental Origins of Health and Disease (DOHaD) hypothesis is a broad theoretical framework that emphasizes how early risk factors have a causal influence on psychopathology. Researchers have raised concerns about the causal interpretation of statistical associations between early risk factors and later psychopathology because most existing studies have been unable to rule out the possibility of environmental and genetic confounding. In this paper we illustrate how family-based quasi-experimental designs can test the DOHaD hypothesis by ruling out alternative hypotheses. We review the logic underlying sibling-comparison, co-twin control, offspring of siblings/twins, adoption, and in vitro fertilization designs. We then present results from studies using these designs focused on broad indices of fetal development (low birth weight and gestational age) and a particular teratogen, smoking during pregnancy. The results provide mixed support for the DOHaD hypothesis for psychopathology, illustrating the critical need to use design features that rule out unmeasured confounding. PMID:25364377

  15. Conceptual design of Fusion Experimental Reactor (FER)

    International Nuclear Information System (INIS)

    Tone, T.; Fujisawa, N.

    1983-01-01

    Conceptual design studies of the Fusion Experimental Reactor (FER) have been performed. The FER has the objective of achieving self-ignition and demonstrating engineering feasibility as a next-generation tokamak to JT-60. Various concepts of the FER have been considered. The reference design is based on a double-null divertor. Optional design studies with some attractive features based on advanced concepts, such as a pumped limiter and RF current drive, have been carried out. Key design parameters are: fusion power of 440 MW, average neutron wall loading of 1 MW/m2, major radius of 5.5 m, plasma minor radius of 1.1 m, plasma elongation of 1.5, plasma current of 5.3 MA, toroidal beta of 4%, toroidal field on plasma axis of 5.7 T and tritium breeding ratio of above unity

  16. Application of a statistical design to the optimization of parameters and culture medium for alpha-amylase production by Aspergillus oryzae CBS 819.72 grown on gruel (wheat grinding by-product).

    Science.gov (United States)

    Kammoun, Radhouane; Naili, Belgacem; Bejar, Samir

    2008-09-01

    The production optimization of alpha-amylase (E.C.3.2.1.1) from the fungus Aspergillus oryzae CBS 819.72, using a by-product of wheat grinding (gruel) as the sole carbon source, was performed with a statistical methodology based on three experimental designs. The optimisation of temperature, agitation and inoculum size was carried out using a Box-Behnken design under the response surface methodology. The screening of nineteen nutrients for their influence on alpha-amylase production was achieved using a Plackett-Burman design. KH(2)PO(4), urea, glycerol, (NH(4))(2)SO(4), CoCl(2), casein hydrolysate, soybean meal hydrolysate and MgSO(4) were selected based on their positive influence on enzyme formation. The optimized nutrient concentrations were obtained using a Taguchi experimental design, and the analysis of the data predicts a theoretical increase in alpha-amylase expression of 73.2% (from 40.1 to 151.1 U/ml). These conditions were validated experimentally and revealed an enhanced alpha-amylase yield of 72.7%.
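
    The Box-Behnken step above places pairs of factors at their ±1 levels while every other factor is held at its centre point. A minimal pure-numpy sketch in coded units (the choice of three factors and of the number of centre replicates is illustrative, not taken from the paper):

        import itertools
        import numpy as np

        def box_behnken(k, center_points=3):
            """Box-Behnken design for k factors in coded units (-1, 0, +1)."""
            runs = []
            for i, j in itertools.combinations(range(k), 2):
                for a, b in itertools.product((-1, 1), repeat=2):
                    row = [0] * k
                    row[i], row[j] = a, b
                    runs.append(row)
            runs.extend([[0] * k] * center_points)    # centre replicates
            return np.array(runs)

        # 3 factors (e.g. temperature, agitation, inoculum size):
        # 12 edge-midpoint runs plus 3 centre points.
        print(box_behnken(3))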

  17. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .1. DESIGN CONSTRUCTION AND THEORETICAL EVALUATION

    NARCIS (Netherlands)

    DUINEVELD, CAA; SMILDE, AK; DOORNBOS, DA

    The combination of process variables and mixture variables in experimental design is a problem which has not yet been solved. It is examined here whether a set of designs can be found which can be used for a series of models of reasonable complexity. The proposed designs are compared with known

  18. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .1. DESIGN CONSTRUCTION AND THEORETICAL EVALUATION

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The combination of process variables and mixture variables in experimental design is a problem which has not yet been solved. It is examined here whether a set of designs can be found which can be used for a series of models of reasonable complexity. The proposed designs are compared with known

  19. An Experimental Verification of morphology of ibuprofen crystals from CAMD designed solvent

    DEFF Research Database (Denmark)

    Karunanithi, Arunprakash T.; Acquah, Charles; Achenie, Luke E.K.

    2007-01-01

    of crystals formed from solvents, necessitates additional experimental verification steps. In this work we report the experimental verification of crystal morphology for the case study, solvent design for ibuprofen crystallization, presented in Karunanithi et al. [2006. A computer-aided molecular design...

  20. Selection of a design for response surface

    Science.gov (United States)

    Ranade, Shruti Sunil; Thiagarajan, Padma

    2017-11-01

    Box-Behnken, central composite, D-optimal and I-optimal designs were compared using statistical tools. Experimental trials for all designs were generated. Random uniform responses were simulated for all models. The R-square, Akaike and Bayesian Information Criterion values for the fitted models were noted. One-way ANOVA and Tukey's multiple comparison test were performed on these parameters. The models were evaluated on the number of experimental trials generated, in addition to the results of the statistical analyses. The D-optimal design generated 12 trials in its model, fewer than both the central composite and Box-Behnken designs. The R-square values of the fitted models were found to differ significantly (P<0.0001). The D-optimal design not only had the highest mean R-square value (0.7231), but also possessed the lowest means for both the Akaike and Bayesian Information Criterion. The D-optimal design was therefore recommended for the generation of response surfaces, based on the assessment of the above parameters.
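
    The D-criterion behind a comparison like this one can be computed directly. A minimal sketch, assuming a full quadratic model in two coded factors and using a face-centred central composite design as the candidate (the design and model here are illustrative, not those of the paper):

        import numpy as np

        def quadratic_model_matrix(D):
            """Model matrix for a full quadratic model in two factors."""
            x1, x2 = D[:, 0], D[:, 1]
            return np.column_stack([np.ones(len(D)), x1, x2,
                                    x1 * x2, x1**2, x2**2])

        def d_criterion(D):
            """det(X'X); larger means lower generalized variance of the
            coefficient estimates for a fixed number of runs."""
            X = quadratic_model_matrix(D)
            return np.linalg.det(X.T @ X)

        # face-centred CCD: 4 factorial + 4 axial + 1 centre point
        ccd = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                        [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]])
        print(d_criterion(ccd))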

  1. Design and experimental characterization of an EM pump

    International Nuclear Information System (INIS)

    Kim, Hee Reyoung; Hong, Sang Hee

    1999-01-01

    Generally, an EM (electromagnetic) pump is employed to circulate electrically conducting liquids by using the Lorentz force. In particular, in a liquid metal reactor (LMR), which uses liquid sodium with high electrical conductivity as the coolant, an EM pump is needed because of its advantages over a mechanical pump, such as the absence of rotating parts, no noise, and simplicity. In this research, a pilot annular linear induction EM pump with a flow rate of 200 l/min was designed by using the electrical equivalent-circuit method. The pump was designed and manufactured by considering material and environmental (high temperature and liquid sodium) requirements. The pump performance was experimentally characterized in terms of input current, voltage, power, and frequency. The theoretical predictions were also compared with the experimental results

  2. Conceptual design of fusion experimental reactor (FER)

    International Nuclear Information System (INIS)

    1984-02-01

    This report describes the engineering conceptual design of the Fusion Experimental Reactor (FER), which is to be built as a next-generation tokamak machine. This design covers the overall reactor systems, including MHD equilibrium analysis, mechanical configuration of the reactor, divertor, pumped limiter, first wall/breeding blanket/shield, toroidal field magnet, poloidal field magnet, cryostat, electromagnetic analysis, vacuum system, power handling and conversion, NBI, RF heating device, tritium system, neutronics, maintenance, cooling system and layout of facilities. The engineering comparison of a divertor with pumped limiters and a safety analysis of the reactor systems are also conducted. (author)

  3. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to less well known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the use of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation
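
    Several of the goodness-of-fit tests named above also exist in general-purpose scientific software. A minimal scipy sketch (not the Toolkit described in this record) testing a sample against a normal hypothesis with Kolmogorov-Smirnov and Anderson-Darling:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        sample = rng.normal(loc=0.0, scale=1.0, size=200)

        ks = stats.kstest(sample, "norm")          # Kolmogorov-Smirnov
        ad = stats.anderson(sample, dist="norm")   # Anderson-Darling
        print(f"KS: D={ks.statistic:.3f}, p={ks.pvalue:.3f}")
        print(f"AD: A2={ad.statistic:.3f}, critical values={ad.critical_values}")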

  4. Fast Bayesian optimal experimental design and its applications

    KAUST Repository

    Long, Quan

    2015-01-01

    We summarize our Laplace method and multilevel method of accelerating the computation of the expected information gain in a Bayesian Optimal Experimental Design (OED). Laplace method is a widely-used method to approximate an integration

  5. Evaluated experimental database on critical heat flux in WWER FA models

    International Nuclear Information System (INIS)

    Artamonov, S.; Sergeev, V.; Volkov, S.

    2015-01-01

    The paper presents a description of the evaluated experimental database on critical heat flux in WWER FA models of new designs. This database was developed on the basis of the experimental data obtained in the years 2009-2012. In the course of its development, the database was reviewed in terms of the completeness of the information about the experiments and its compliance with the requirements of Rostekhnadzor regulatory documents. The description of the experimental FA model characteristics and experimental conditions was specified. Besides, the experimental data were statistically processed with the aim of rejecting incorrect ones, and the sets of experimental data on critical heat fluxes (CHF) were compared for different FA models. As a result, for the first time, an evaluated database on CHF in FA models of new designs was developed, complemented with analysis functions; its main purpose is to be used in the development, verification and upgrading of calculation techniques. The developed database incorporates the data of 4183 experimental conditions obtained in 53 WWER FA models of various designs. Keywords: WWER reactor, fuel assembly, CHF, evaluated experimental data, database, statistical analysis. (author)
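
    The statistical screening step mentioned above (rejecting incorrect data points) is not specified in the abstract; a generic stand-in is a z-score filter. A minimal sketch, where both the threshold and the sample values are purely illustrative:

        import numpy as np

        def reject_outliers(values, z_max=3.0):
            """Split values into (kept, rejected) by distance from the mean.
            The actual rejection rule used for the CHF database is not
            stated in the record; this sigma-based screen is only a stand-in."""
            v = np.asarray(values, dtype=float)
            z = np.abs(v - v.mean()) / v.std(ddof=1)
            return v[z <= z_max], v[z > z_max]

        # A tight threshold is used here only because the demo sample is tiny.
        kept, rejected = reject_outliers([1.02, 0.98, 1.05, 0.99, 2.40, 1.01],
                                         z_max=2.0)
        print("kept:", kept, "rejected:", rejected)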

  6. Applied statistics : an important phase in the development of experimental science (Inaugural lecture)

    NARCIS (Netherlands)

    Hamaker, H.C.

    1962-01-01

    In many fields of inquiry, especially those concerned with living beings, "exact" observations are not possible and it is necessary to investigate the effect of several factors at the same time. This has led to the design of experiments on a statistical basis, in which several factors may be varied

  7. Optimal Experimental Design of Furan Shock Tube Kinetic Experiments

    KAUST Repository

    Kim, Daesang

    2015-01-07

    A Bayesian optimal experimental design methodology has been developed and applied to refine the rate coefficients of elementary reactions in Furan combustion. Furans are considered potential renewable fuels. We focus on the Arrhenius rates of Furan + OH ↔ Furyl-2 + H2O and Furan + OH ↔ Furyl-3 + H2O, and rely on the OH consumption rate as the experimental observable. A polynomial chaos surrogate is first constructed using an adaptive pseudo-spectral projection algorithm. The PC surrogate is then exploited in conjunction with a fast estimation of the expected information gain in order to determine the optimal design in the space of initial temperatures and OH concentrations.
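
    The surrogate idea above can be illustrated with a much simpler stand-in: an ordinary polynomial least-squares fit of an Arrhenius rate over the temperature design space (not the adaptive pseudo-spectral PC construction of the study; A and Ea below are placeholder values, not the Furan + OH coefficients):

        import numpy as np

        R = 8.314             # gas constant, J/(mol K)
        A, Ea = 1e13, 1.5e5   # hypothetical pre-exponential factor / activation energy

        T = np.linspace(1200.0, 1800.0, 50)      # design space in kelvin
        log_k = np.log(A) - Ea / (R * T)         # Arrhenius: k = A exp(-Ea/RT)

        # Cheap-to-evaluate surrogate of log k(T) for use inside design loops.
        coeffs = np.polynomial.polynomial.polyfit(T, log_k, deg=3)
        surrogate = np.polynomial.polynomial.polyval(T, coeffs)
        print("max abs surrogate error in log k:", np.abs(surrogate - log_k).max())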

  8. Experimental design and analysis for piezoelectric circular actuators in flow control applications

    International Nuclear Information System (INIS)

    Mane, Poorna; Mossi, Karla; Bryant, Robert

    2008-01-01

    Flow control can lead to saving millions of dollars in fuel costs each year by making an aircraft more efficient. Synthetic jets, a device for active flow control, operate by introducing small amounts of energy locally to achieve non-local changes in the flow field with large performance gains. These devices consist of a cavity with an oscillating diaphragm that divides it into active and passive sides. The active side has a small opening where a jet is formed, while the passive side does not directly participate in the fluidic jet. Over the years, research has shown that synthetic jet behavior is dependent on the active diaphragm and the cavity design; hence the focus of this work. The performance of the synthetic jet is studied under various factors related to the diaphragm and the cavity geometry. Three diaphragms, manufactured from piezoelectric composites, were selected for this study: Bimorph, Thunder® and Lipca. The overall factors considered are the driving signal, voltage, frequency, cavity height, orifice size, and passive cavity pressure. Using the average maximum jet velocity as the response variable, these factors are individually studied for each actuator, and statistical analysis tools are used to select the factors relevant to the response variable. The factors are divided into two fractional factorial experimental design matrices, with five and four factors, respectively. Both experiments are chosen to be of resolution V, where main factors are confounded with three-factor interactions. In the first experimental design, the results show that frequency is not a significant factor, while waveform is significant for all the actuators. In addition, the magnitude of the regression coefficients suggests that a model that includes the diaphragm as a factor may be possible. These results are valid within the ranges tested, that is, low frequencies with sawtooth and sine waveforms as driving signals. In the second experimental design, cavity dimensions are

  9. Experimental design, modeling and optimization of polyplex formation between DNA oligonucleotides and branched polyethylenimine.

    Science.gov (United States)

    Clima, Lilia; Ursu, Elena L; Cojocaru, Corneliu; Rotaru, Alexandru; Barboiu, Mihail; Pinteala, Mariana

    2015-09-28

    The complexes formed by DNA and polycations have received great attention owing to their potential application in gene therapy. In this study, the binding efficiency between double-stranded oligonucleotides (dsDNA) and branched polyethylenimine (B-PEI) has been quantified by processing the images captured from gel electrophoresis assays. A central composite experimental design has been employed to investigate the effects of controllable factors on the binding efficiency. On the basis of the experimental data and the response surface methodology, a multivariate regression model has been constructed and statistically validated. The model has enabled us to predict the binding efficiency depending on experimental factors, such as the concentrations of dsDNA and B-PEI as well as the initial pH of the solution. The optimization of the binding process has been performed using simplex and gradient methods. The optimal conditions determined for polyplex formation have yielded a maximal binding efficiency close to 100%. In order to reveal the mechanism of complex formation at the atomic scale, a molecular dynamics simulation has been carried out. According to the computational results, B-PEI amine hydrogen atoms have interacted with oxygen atoms from dsDNA phosphate groups. These interactions have led to the formation of hydrogen bonds between the macromolecules, stabilizing the polyplex structure.
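
    The response-surface step described above amounts to fitting a second-order polynomial in the coded factors by least squares. A minimal sketch with three factors (standing in for the dsDNA concentration, B-PEI concentration and pH of the study) and synthetic responses:

        import numpy as np

        def quadratic_terms(X):
            """Second-order model terms for three coded factors."""
            x1, x2, x3 = X.T
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1 * x2, x1 * x3, x2 * x3,
                                    x1**2, x2**2, x3**2])

        rng = np.random.default_rng(1)
        X = rng.uniform(-1, 1, size=(20, 3))             # coded factor settings
        y = (80 - 5 * (X[:, 0] - 0.3)**2 - 4 * (X[:, 1] + 0.2)**2
             - 3 * X[:, 2]**2 + rng.normal(0, 1, 20))    # synthetic efficiency (%)

        beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
        print("fitted coefficients:", np.round(beta, 2))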

  10. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  11. An Evaluation of the Use of Statistical Procedures in Soil Science

    Directory of Open Access Journals (Sweden)

    Laene de Fátima Tavares

    2016-01-01

    Full Text Available ABSTRACT Experimental statistical procedures used in almost all scientific papers are fundamental for clearer interpretation of the results of experiments conducted in the agrarian sciences. However, incorrect use of these procedures can lead the researcher to incorrect or incomplete conclusions. Therefore, the aim of this study was to evaluate the characteristics of the experiments and the quality of the use of statistical procedures in soil science in order to promote better use of statistical procedures. For that purpose, 200 articles published between 2010 and 2014, involving only experimentation and studies by sampling in the soil areas of fertility, chemistry, physics, biology, and use and management, were randomly selected. A questionnaire containing 28 questions was used to assess the characteristics of the experiments, the statistical procedures used, and the quality of selection and use of these procedures. Most of the articles evaluated presented data from studies conducted under field conditions, and 27% of all papers involved studies by sampling. Most studies did not mention testing to verify normality and homoscedasticity, and most used the Tukey test for mean comparisons. Among studies with a factorial structure of the treatments, many ignored this structure and compared the data assuming the absence of a factorial structure, or performed the decomposition of the interaction without showing or mentioning the significance of the interaction. Almost none of the papers with split-block factorial designs considered the factorial structure, or they treated it as a split-plot design. Among the articles that performed regression analysis, only a few tested non-polynomial fit models, and none reported verification of the lack of fit in the regressions. The articles evaluated thus reflected poor generalization and, in some cases, wrong generalization in experimental design and the selection of procedures for statistical analysis.
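
    The checks the survey finds missing (normality and homoscedasticity before a mean-comparison test) are one call each in scipy. A minimal sketch over three synthetic treatment groups:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        groups = [rng.normal(m, 1.0, size=12) for m in (5.0, 5.5, 6.2)]

        # Verify the ANOVA assumptions before comparing means.
        for i, g in enumerate(groups):
            print(f"group {i}: Shapiro-Wilk p = {stats.shapiro(g).pvalue:.3f}")
        print("Levene p =", stats.levene(*groups).pvalue)    # homoscedasticity
        print("ANOVA p  =", stats.f_oneway(*groups).pvalue)  # if both checks pass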

  12. Statistical Engineering in Air Traffic Management Research

    Science.gov (United States)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  13. Intuitive web-based experimental design for high-throughput biomedical data.

    Science.gov (United States)

    Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven

    2015-01-01

    Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data are accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be of interest for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and meta-information for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.

  14. Exopolysaccharide production from Bacillus velezensis KY471306 using statistical experimental design.

    Science.gov (United States)

    Moghannem, Saad A M; Farag, Mohamed M S; Shehab, Amr M; Azab, Mohamed S

    2018-01-18

    Exopolysaccharide (EPS) biopolymers produced by microorganisms play a crucial role in the health and bio-nanotechnology sectors, as gelling agents in the food and cosmetic industries, and as bio-flocculants in the environmental sector, since they are degradable and nontoxic. This study focuses on the improvement of EPS production through manipulation of different culture and environmental conditions using response surface methodology (RSM). A Plackett-Burman design indicated that molasses, yeast extract and incubation temperature are the most effective parameters. Box-Behnken RSM indicated that the optimum settings were 12% (w/v) molasses, 6 g/L yeast extract and an incubation temperature of 30°C. The most potent bacterial isolate was identified as Bacillus velezensis KY498625. After production, the EPS was extracted, purified using DEAE-cellulose, and characterized using Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC) and gas chromatography-mass spectrometry (GC-MS). The results indicated that it has a molecular weight of 1.14×10⁵ Da and consists of glucose, mannose and galactose. Copyright © 2018 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.

  15. Cellular internalisation kinetics and cytotoxic properties of statistically designed and optimised neo-geometric copper nanocrystals.

    Science.gov (United States)

    Murugan, Karmani; Choonara, Yahya E; Kumar, Pradeep; du Toit, Lisa C; Pillay, Viness

    2017-09-01

    This study aimed to highlight a statistical design to precisely engineer homogeneous geometric copper nanoparticles (CuNPs) for enhanced intracellular drug delivery as a function of geometric structure. CuNPs with a dual functionality, comprising geometric attributes for enhanced cell uptake and cytotoxic activity against proliferating cells, were synthesized as a novel drug delivery system. This paper investigated defined concentrations of two key surfactants used in the reaction to mutually control and manipulate nano-shape and to optimise the geometric nanosystems. A statistical experimental design comprising a full factorial model served as a refining factor to achieve homogeneous geometric nanoparticles using a one-pot method for the systematic optimisation of the geometric CuNPs. The shapes of the nanoparticles were investigated to determine the effect of the surfactant variation, which was the aim of the study, and the zeta potential was studied to ensure the stability of the system and establish a nanosystem of low aggregation potential. After optimisation of the nano-shapes, extensive cellular internalisation studies were conducted to elucidate the effect of geometric CuNPs on uptake rates, in addition to the vital toxicity assays needed to further understand the cellular effect of geometric CuNPs as a drug delivery system. In addition to geometry, the volume, surface area, orientation to the cell membrane and colloidal stability are also addressed. The outcomes of the study demonstrated the success of homogeneous geometric NP formation, in addition to a stable surface charge. The findings of the study can be utilized for the development of a drug delivery system for promoted cellular internalisation and effective drug delivery. Copyright © 2017 Elsevier B.V. All rights reserved.
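
    A full factorial model over two controllable factors, as used above, simply enumerates every combination of factor levels. A minimal sketch (the surfactant names and level values are placeholders, not the study's concentrations):

        import itertools

        surfactant_a = [0.1, 0.5, 1.0]   # hypothetical levels, e.g. mM
        surfactant_b = [0.2, 0.6, 1.2]

        runs = list(itertools.product(surfactant_a, surfactant_b))
        for i, (a, b) in enumerate(runs, start=1):
            print(f"run {i}: A = {a}, B = {b}")
        print(len(runs), "runs in the 3 x 3 full factorial")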

  16. The statistics of dose/cure relationships for irradiated tumours

    International Nuclear Information System (INIS)

    Porter, E.H.

    1980-01-01

    Attention is given to the statistical analysis of dose/cure experiments. The simplest possible theory is developed in detail with special attention to experimental design and to the range of validity of the methods advocated. Explanations are aimed at the mathematics-tolerant, not at the mathematician. (author)

  17. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass appearances. In fact, statistics present a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  18. Mathematics and Statistics Research Department progress report for period ending June 30, 1975

    International Nuclear Information System (INIS)

    Coveyou, R.R.; Gosslee, D.G.; Wilson, D.G.

    1975-10-01

    Brief reports on mathematical and statistical research and consulting and collaboration are given for the following areas: statistical estimation, statistical testing, experimental design, probability, energy systems modeling, continuum mechanics, matrices and other operators, numerical analysis, biomathematics and biostatistics, analytical chemistry, biology and medicine, health physics research, management, materials research, physics research, and programming. Information on seminars, publications, etc., is also included. (10 figures, 4 tables)

  19. Experimental burn plot trial in the Kruger National Park: history, experimental design and suggestions for data analysis

    Directory of Open Access Journals (Sweden)

    R. Biggs

    2003-12-01

    Full Text Available The experimental burn plot (EBP) trial initiated in 1954 is one of few ongoing long-term fire ecology research projects in Africa. The trial aims to assess the impacts of different fire regimes in the Kruger National Park. Recent studies on the EBPs have raised questions as to the experimental design of the trial, and the appropriate model specification when analysing data. Archival documentation reveals that the original design was modified on several occasions, related to changes in the park's fire policy. These modifications include the addition of extra plots, subdivision of plots and changes in treatments over time, and have resulted in a design which is only partially randomised. The representativity of the trial plots has been questioned on account of their relatively small size, the concentration of herbivores on especially the frequently burnt plots, and soil variation between plots. It is suggested that these factors be included as covariates in explanatory models or that certain plots be excluded from data analysis based on results of independent studies of these factors. Suggestions are provided for the specification of the experimental design when analysing data using Analysis of Variance. It is concluded that there is no practical alternative to treating the trial as a fully randomised complete block design.

  20. Design study of the experimental multi-purpose high temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Tsunoda, Ryokichi

    1981-01-01

    In this paper, the design study carried out since 1973 is outlined. The basic conceptual design was performed in fiscal 1973. In this design, a concept was established for the total system of the experimental high temperature gas-cooled reactor, including the heat-utilizing system. The first conceptual design was carried out in fiscal 1974. The range of design was limited to the experimental reactor and its direct heat-removing system. Part 2 of the first conceptual design was performed in fiscal 1975, and the system design concerning the plant characteristics was made. Part 1 of the adjustment design was carried out in fiscal 1976, and its subject was the adjustment design of plant systems. Part 2 was performed in fiscal 1977, and the characteristics of the plant control system were analyzed. In fiscal 1978, the analysis of flow characteristics in the core was made. The integrated system design was carried out in fiscal 1979, and the design of the total plant system except the heat-utilizing system was started again. Part 1 of the detailed design was performed in fiscal 1980, and in addition, the possibility of increasing the power output was examined. The construction cost of the experimental reactor plant estimated in 1979 was far higher than that in 1973. (Kako, I.)

  1. Optimal Experimental Design for Large-Scale Bayesian Inverse Problems

    KAUST Repository

    Ghattas, Omar

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial-based surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP-based approach is used to estimate the expected information gain in the proposed experiments, and to select the best experimental set-ups yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and for extending the experimental design methodology to cases where the control parameters are noisy.

  2. Study of Formulation Variables Influencing Polymeric Microparticles by Experimental Design

    Directory of Open Access Journals (Sweden)

    Jitendra B. Naik

    2014-04-01

    Full Text Available The objective of this study was to prepare diclofenac sodium loaded microparticles by a single emulsion [oil-in-water (o/w)] solvent evaporation method. The 2² experimental design methodology was used to evaluate the effect of two formulation variables on the microsphere properties using the Design-Expert® software, and the microparticles were evaluated for their particle size, morphology, encapsulation efficiency and in vitro drug release. The graphical and mathematical analysis of the design showed that the independent variables had a significant effect on the encapsulation efficiency and drug release of the microparticles. The low magnitudes of error and significant values of R2 prove the high prognostic ability of the design. The microspheres showed high encapsulation efficiency with an increase in the amount of polymer and a decrease in the amount of PVA in the formulation. The particles were found to be spherical with smooth surfaces. Prolonged drug release and enhancement of the encapsulation efficiency of polymeric microparticles can be successfully obtained with the application of an experimental design technique.

  3. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and the study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
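
    Latin hypercube sampling, the technique behind STRADE, stratifies each dimension into N equal-probability intervals and draws exactly one point per interval, with independent random pairings across dimensions. A minimal numpy sketch (not the STRADE code itself):

        import numpy as np

        def latin_hypercube(n_samples, n_dims, seed=None):
            """N points on the unit hypercube, one per stratum per dimension."""
            rng = np.random.default_rng(seed)
            # A random permutation of strata 0..n-1 in each dimension,
            # plus a uniform jitter inside each stratum.
            strata = np.argsort(rng.uniform(size=(n_samples, n_dims)), axis=0)
            jitter = rng.uniform(size=(n_samples, n_dims))
            return (strata + jitter) / n_samples

        print(latin_hypercube(10, 3, seed=0))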

  4. Conceptual design of fusion experimental reactor (FER)

    International Nuclear Information System (INIS)

    1985-01-01

    The Fusion Experimental Reactor (FER) being developed at JAERI as a next generation tokamak to JT-60 has a major mission of realizing a self-ignited long-burning DT plasma and demonstrating engineering feasibility. During FY82 and FY83 a comprehensive and intensive conceptual design study has been conducted for a pulsed operation FER as a reference option which employs a conventional inductive current drive and a double-null divertor. In parallel with the reference design, studies have been carried out to evaluate advanced reactor concepts such as quasi-steady state operation and steady state operation based on RF current drive and pumped limiter, and comparative studies for single-null divertor/pumped limiter. This report presents major results obtained primarily from FY83 design studies, while the results of FY82 design studies are described in previous references (JAERI-M 83-213--216). (author)

  5. Experimental Design: Utilizing Microsoft Mathematics in Teaching and Learning Calculus

    Science.gov (United States)

    Oktaviyanthi, Rina; Supriani, Yani

    2015-01-01

    The experimental design was conducted to investigate the use of Microsoft Mathematics, free software made by Microsoft Corporation, in teaching and learning Calculus. This paper reports results from the experimental study: details of the implementation of Microsoft Mathematics in Calculus, students' achievement and the effects of the use of Microsoft…

  6. Methods for the neutronic design of a Supersara experimental loop

    International Nuclear Information System (INIS)

    Casali, F.; Cepraga, D.

    1982-01-01

    This paper describes a method for the neutronic design of experimental loops irradiated in D2O experimental reactors, like Essor. The calculation approach concerns the definition of a Wigner-Seitz cell in which the loop under examination is subjected to the same neutronic conditions as in the actual reactor

  7. Statistics of Stacked Strata on Experimental Shelf Margins

    Science.gov (United States)

    Fernandes, A. M.; Straub, K. M.

    2015-12-01

    Continental margin deposits provide the most complete record on Earth of paleo-landscapes, but these records are complex and difficult to interpret. To a seismic geomorphologist or stratigrapher, mapped surfaces often present a static diachronous record of these landscapes through time. We present data that capture the dynamics of experimental shelf-margin landscapes at high temporal resolution and define internal hierarchies within stacked channelized and weakly channelized deposits from the shelf to the slope. Motivated by observations from acoustically imaged continental margins offshore Brunei and in the Gulf of Mexico, we use physical experiments to quantify stratal patterns of sub-aqueous slope channels and lobes that are linked to delta-top channels. The data presented here are from an experiment that was run for 26 hours of experimental run time. Overhead photographs and topographic scans captured flow dynamics and surface aggradation/degradation every ten minutes. Currents rich in sediment built a delta that prograded to the shelf-edge. These currents were designed to plunge at the shoreline and travel as turbidity currents beyond the delta and onto the continental slope. Pseudo-subsidence was imposed by a slight base-level rise that generated accommodation space and promoted the construction of stratigraphy on the delta-top. Compensational stacking is a term that is frequently applied to deposits that tend to fill in topographic lows in channelized and weakly channelized systems. The compensation index, a metric used to quantify the strength of compensation, is used here to characterize deposits at different temporal scales on the experimental landscape. The compensation timescale is the characteristic time at which the accumulated deposits begin to match the shape of basin-wide subsidence rates (uniform for these experiments). We will use the compensation indices along strike transects across the delta, proximal slope and distal slope to evaluate the
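
    One published formulation of the compensation index (following work by Straub and co-workers; the abstract does not spell out the exact definition used here) is the exponent kappa in sigma(T) ~ T^-kappa, where sigma(T) is the across-transect variability of deposition rate measured over time windows of length T. A rough sketch under that assumption, on a synthetic deposition field:

        import numpy as np

        def compensation_index(deposition, window_sizes):
            """Estimate kappa from sigma(T) ~ T**(-kappa).

            deposition: 2-D array (time steps x transect positions) of
            deposition per step. This follows one published formulation;
            the definition used on the experimental margins may differ."""
            sigmas = []
            for T in window_sizes:
                n = deposition.shape[0] // T
                rates = deposition[:n * T].reshape(n, T, -1).sum(axis=1) / T
                sigmas.append(rates.std(axis=1).mean())
            slope, _ = np.polyfit(np.log(window_sizes), np.log(sigmas), 1)
            return -slope

        rng = np.random.default_rng(3)
        synthetic = rng.exponential(1.0, size=(512, 40))
        # Uncorrelated random deposition compensates like sqrt(T): kappa ~ 0.5.
        print(compensation_index(synthetic, [2, 4, 8, 16, 32, 64]))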

  8. Set membership experimental design for biological systems

    Directory of Open Access Journals (Sweden)

    Marvel Skylar W

    2012-03-01

    Full Text Available Abstract Background Experimental design approaches for biological systems are needed to help conserve the limited resources that are allocated for performing experiments. The assumptions used when assigning probability density functions to characterize uncertainty in biological systems are unwarranted when only a small number of measurements can be obtained. In these situations, the uncertainty in biological systems is more appropriately characterized in a bounded-error context. Additionally, effort must be made to improve the connection between modelers and experimentalists by relating design metrics to biologically relevant information. Bounded-error experimental design approaches that can assess the impact of additional measurements on model uncertainty are needed to identify the most appropriate balance between the collection of data and the availability of resources. Results In this work we develop a bounded-error experimental design framework for nonlinear continuous-time systems when few data measurements are available. This approach leverages many of the recent advances in bounded-error parameter and state estimation methods that use interval analysis to generate parameter sets and state bounds consistent with uncertain data measurements. We devise a novel approach using set-based uncertainty propagation to estimate measurement ranges at candidate time points. We then use these estimated measurements at the candidate time points to evaluate which candidate measurements furthest reduce model uncertainty. A method for quickly combining multiple candidate time points is presented and allows for determining the effect of adding multiple measurements. Biologically relevant metrics are developed and used to predict when new data measurements should be acquired, which system components should be measured and how many additional measurements should be obtained. Conclusions The practicability of our approach is illustrated with a case study. This
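
    The set-based propagation step described above can be illustrated with naive interval arithmetic on a toy model. A minimal sketch that bounds x(t) = x0·exp(-k·t) over an uncertain decay rate k, giving the kind of predicted measurement range evaluated at candidate time points (this relies on monotonicity in k, which holds for this model but not in general):

        import numpy as np

        def decay_bounds(x0, k_interval, t):
            """Bounds on x(t) = x0 * exp(-k * t) for k in [k_lo, k_hi].
            x(t) is monotone in k, so the interval endpoints give the
            bounds; general models need proper interval arithmetic."""
            k_lo, k_hi = k_interval
            return x0 * np.exp(-k_hi * t), x0 * np.exp(-k_lo * t)

        # Wider predicted ranges suggest candidate times whose measurement
        # would discriminate more strongly between parameter values.
        for t in (0.5, 1.0, 2.0, 4.0):
            lo, hi = decay_bounds(10.0, (0.8, 1.2), t)
            print(f"t = {t}: predicted range [{lo:.3f}, {hi:.3f}]")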

  9. Achieving optimal SERS through enhanced experimental design.

    Science.gov (United States)

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J; Goodacre, Royston

    2016-01-01

    One of the current limitations surrounding surface-enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is no single set of SERS conditions that is universal. This means that experimental optimisation for the optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going on to optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal-based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd.
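
    As a sketch of the evolutionary route the review surveys (not a specific published protocol), a (1+1) evolution strategy can search two hypothetical colloid parameters, treating each iteration as one experiment whose measured intensity either replaces or discards the current best conditions:

        import numpy as np

        def sers_response(x):
            """Hypothetical stand-in for a measured SERS intensity as a
            function of (aggregating-agent concentration, pH)."""
            return -np.sum((x - np.array([0.4, 6.5]))**2)

        rng = np.random.default_rng(4)
        x = np.array([0.1, 7.5])            # starting conditions
        best = sers_response(x)
        step = np.array([0.05, 0.25])       # mutation scale per parameter
        for _ in range(200):                # each loop = one experiment
            trial = x + rng.normal(0.0, step)
            score = sers_response(trial)
            if score > best:                # keep the better condition set
                x, best = trial, score
        print("best conditions found:", np.round(x, 3))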

  10. Development and Validation of a Rubric for Diagnosing Students' Experimental Design Knowledge and Difficulties.

    Science.gov (United States)

    Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. © 2014 A. P. Dasgupta et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  11. Experimental Methods for the Analysis of Optimization Algorithms

    DEFF Research Database (Denmark)

    In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on different ... in algorithm design, statistical design, optimization and heuristics, and most chapters provide theoretical background and are enriched with case studies. This book is written for researchers and practitioners in operations research and computer science who wish to improve the experimental assessment ...

  12. The Impact of Statistical Leakage Models on Design Yield Estimation

    Directory of Open Access Journals (Sweden)

    Rouwaida Kanj

    2011-01-01

    Full Text Available Device mismatch and process variation models play a key role in determining the functionality and yield of sub-100 nm designs. Average characteristics are often of interest, such as the average leakage current or the average read delay. However, detecting rare functional fails is critical for memory design, and designers often seek techniques that enable such events to be modeled accurately. Extremely leaky devices can inflict functionality fails. The plurality of leaky devices on a bitline increases the dimensionality of the yield estimation problem. Simplified models are possible by adopting approximations to the underlying sum of lognormals. The implications of such approximations for tail probabilities may in turn bias the yield estimate. We review different closed-form approximations and compare them against the CDF matching method, which is shown to be the most effective method for accurate statistical leakage modeling.
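
    A classic closed-form approximation of the kind reviewed above is the Fenton-Wilkinson method: match the first two moments of a sum of independent lognormals to a single lognormal. A minimal sketch (the device count and sigma below are illustrative):

        import numpy as np

        def fenton_wilkinson(mus, sigmas):
            """(mu, sigma) of the lognormal matching the first two moments
            of a sum of independent lognormals, where (mus[i], sigmas[i])
            parameterize the underlying normals."""
            mus, sigmas = np.asarray(mus), np.asarray(sigmas)
            m1 = np.sum(np.exp(mus + sigmas**2 / 2))          # E[sum]
            var = np.sum(np.exp(2 * mus + sigmas**2) * (np.exp(sigmas**2) - 1))
            sigma2 = np.log(1 + var / m1**2)                  # matched variance
            return np.log(m1) - sigma2 / 2, np.sqrt(sigma2)

        # e.g. total bitline leakage of 16 identically distributed devices
        mu, sigma = fenton_wilkinson([0.0] * 16, [0.8] * 16)
        print(f"approximating lognormal: mu = {mu:.3f}, sigma = {sigma:.3f}")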

  13. Conceptual design of fusion experimental reactor (FER)

    International Nuclear Information System (INIS)

    1984-01-01

    A conceptual design of the Fusion Experimental Reactor (FER), whose objective is to realize self-ignition with the D-T reaction, is reported. The mechanical configuration of the FER is characterized by a noncircular plasma and a double-null divertor. The primary aim of the design studies is to demonstrate the feasibility of reactor structures that are as compact and simple as possible, with removable torus sectors. The structures of each component, such as the first wall, blanket, shielding, divertor, magnet and so on, have been designed. Essential reactor plant system requirements are also discussed. In addition to the above, a brief concept of a steady-state reactor based on RF current drive is also discussed. The main aim at this stage is to examine the physics of a possible RF steady-state reactor. (author)

  14. Experimental validation of a new heterogeneous mechanical test design

    Science.gov (United States)

    Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.

    2018-05-01

    Standard material parameter identification strategies generally use an extensive number of classical tests to collect the required experimental data. However, a great effort has been made recently by the scientific and industrial communities to base this experimental database on heterogeneous tests. These tests can provide richer information on the material behavior, allowing the identification of a more complete set of material parameters. This is a result of the recent development of full-field measurement techniques, like digital image correlation (DIC), that can capture the heterogeneous deformation fields on the specimen surface during the test. Recently, new specimen geometries were designed to enhance the richness of the strain field and capture supplementary strain states. The butterfly specimen is an example of these new geometries, designed through a numerical optimization procedure based on an indicator capable of evaluating the heterogeneity and the richness of strain information. However, no experimental validation has yet been performed. The aim of this work is to experimentally validate the heterogeneous butterfly mechanical test in the parameter identification framework. To this end, the DIC technique and a Finite Element Model Updating inverse strategy are used together for the parameter identification of a DC04 steel, as well as for the calculation of the indicator. The experimental tests are carried out in a universal testing machine with the ARAMIS measuring system to provide the strain states on the specimen surface. The identification strategy is applied to the data obtained from the experimental tests, and the results are compared to a reference numerical solution.

  15. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  16. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki; Long, Quan; Scavino, Marco; Tempone, Raul

    2015-01-01

    Experimental design is very important since experiments are often resource-intensive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of this nested high-dimensional integral. The advantages are twofold. First, Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among the different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as the concentration of measures, than the Laplace method requires. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Laplace equation. We also compare the performance of Multilevel Monte Carlo, the Laplace approximation and direct double-loop Monte Carlo.

  17. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki

    2015-01-07

    Experimental design is very important since experiments are often resource-intensive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of this nested high-dimensional integral. The advantages are twofold. First, Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among the different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as the concentration of measures, than the Laplace method requires. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Laplace equation. We also compare the performance of Multilevel Monte Carlo, the Laplace approximation and direct double-loop Monte Carlo.
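
    The direct double-loop Monte Carlo estimator that serves as the baseline above can be written compactly. A minimal sketch for a scalar model y = g(theta, d) + noise with a standard-normal prior on theta (the toy linear model and the sample sizes are illustrative):

        import numpy as np

        def expected_information_gain(g, d, n_outer=500, n_inner=500,
                                      sigma=0.1, seed=None):
            """Double-loop MC estimate of EIG(d) = E_y[log p(y|theta) - log p(y)].
            The Gaussian normalizing constant cancels between the two terms."""
            rng = np.random.default_rng(seed)
            total = 0.0
            for _ in range(n_outer):
                theta = rng.standard_normal()
                y = g(theta, d) + sigma * rng.standard_normal()
                log_lik = -0.5 * ((y - g(theta, d)) / sigma) ** 2
                inner = rng.standard_normal(n_inner)          # prior samples
                lik = np.exp(-0.5 * ((y - g(inner, d)) / sigma) ** 2)
                total += log_lik - np.log(lik.mean())         # minus log evidence
            return total / n_outer

        g = lambda theta, d: d * theta        # toy linear observation model
        for d in (0.1, 0.5, 1.0, 2.0):
            print(d, round(expected_information_gain(g, d, seed=0), 3))

    The O(n_outer × n_inner) cost of this estimator is exactly what the multilevel method described above is designed to reduce.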

  18. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .2. DESIGN EVALUATION ON MEASURED DATA

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The construction of a small experimental design for a combination of process and mixture variables is a problem that has not yet been completely solved. In a previous paper we evaluated some designs with theoretical measures. This second paper evaluates the capabilities of the best of these

  20. Statistical controversies in clinical research: requiem for the 3 + 3 design for phase I trials.

    Science.gov (United States)

    Paoletti, X; Ezzalfani, M; Le Tourneau, C

    2015-09-01

    More than 95% of published phase I trials have used the 3 + 3 design to identify the dose to be recommended for phase II trials. However, the statistical community agrees on the limitations of the 3 + 3 design compared with model-based approaches. Moreover, the mechanisms of action of targeted agents strongly challenge the hypothesis that the maximum tolerated dose constitutes the optimal dose, and additional outcomes, including clinical and biological activity, increasingly need to be taken into account to identify the optimal dose. We review key elements from clinical publications and from the statistical literature to show that the 3 + 3 design lacks the necessary flexibility to address the challenges of targeted agents. The design issues raised by expansion cohorts, new definitions of dose-limiting toxicity and trials of combinations are not easily addressed by the 3 + 3 design or its extensions. Alternative statistical proposals have been developed to make better use of the complex data generated by phase I trials. Their application requires close collaboration between all actors of early phase clinical trials. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  1. Which statistics should tropical biologists learn?

    Science.gov (United States)

    Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián

    2011-09-01

    Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good-quality research is more pressing than in the past. However, the statistical component of research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, over one year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well-designed one-semester course should be enough for their basic requirements.

  2. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used as methods to evaluate zooplankton density data collected during five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, (4) procedures for conducting the field monitoring program, and (5) the consequences of violating statistical assumptions. Details for estimating sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.

  3. MYRRHA/XT-ADS primary system design and experimental devices

    International Nuclear Information System (INIS)

    Maes, D.

    2009-01-01

    The EUROTRANS project is an integrated project in the Sixth European Framework Program in the context of Partitioning and Transmutation. The objective of this project is to work towards an ETD (European Transmutation Demonstration) in a step-wise manner. The first step is to carry out an advanced design of a small-scale XT-ADS (eXperimental Transmutation in an Accelerator Driven System) for realisation in the short term (about 10 years), as well as to accomplish a generic conceptual design of EFIT (European Facility for Industrial Transmutation) for realisation in the long term. The MYRRHA-2005 design served as a starting basis for the XT-ADS. Many options have been revisited and the framework is now set up. While the MYRRHA-2005 design was still a conceptual design, the intention is to arrive, by the end of the EUROTRANS project (March 2009), at an advanced design of the XT-ADS, albeit a first one. While the design work performed during the first years of the project (2005-2006) was mainly devoted to optimising and enhancing the primary and secondary system configuration according to the suggestions and contributions of our industrial partners (Ansaldo Nucleare, Areva, Suez-Tractebel) within DM1 (Domain 1, DESIGN), the last year's work objectives mainly consisted of (1) the release of the Remote Handling Design Catalogue for XT-ADS, (2) the formulation of the specification of the experimental devices according to the XT-ADS objectives and adapted to the actual XT-ADS core and core support structure design, and (3) the detailed calculations of the main XT-ADS primary and secondary system components

  4. White Noise Assumptions Revisited : Regression Models and Statistical Designs for Simulation Practice

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

    Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these

  5. Design, construction and testing of a radon experimental chamber

    International Nuclear Information System (INIS)

    Chavez B, A.; Balcazar G, M.

    1991-10-01

    To study radon behavior under controlled and stable conditions, a system was designed and constructed consisting of two parts: a container of uranium-rich mineral and a radon experimental chamber, connected to each other by a valve. The container holds approximately 800 g of uranium mineral with a grade of 0.28%; the radon gas emanated by the mineral is tightly confined within the container. When the valve is opened, the radon gas diffuses into the experimental chamber, which has three access ports allowing different types of detectors to be installed. The versatility of the system is exemplified with two experiments: 1. With the radon experimental chamber and an associated spectroscopic system, radon and two of its decay products are identified. 2. The design of the system allows the mineral container to be coupled to other experimental geometries; to demonstrate this, a new automatic exchanger system for passive radon detectors was coupled and tested. The results of the new automatic exchanger system when radon is allowed to flow freely between the container and the exchanger through a plastic membrane of 15 m are shown. (Author)

  6. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the possible improvements to uniformity versus the effect of increased capacitance due to additional metal.

  7. Applied statistical methods in agriculture, health and life sciences

    CERN Document Server

    Lawal, Bayo

    2014-01-01

    This textbook teaches crucial statistical methods for answering research questions using a unique range of statistical software programs, including MINITAB and R. It is developed for undergraduate students in agriculture, nursing, biology and biomedical research. Graduate students will also find it a useful way to refresh their statistics skills and to reference software options. The examples are approached using MINITAB and R, drawing on each package's individual strengths. Subjects covered include, among others, data description, probability distributions, experimental design, regression analysis, randomized design and biological assay. Unlike other biostatistics textbooks, this text also covers outliers, influential observations in regression and an introduction to survival analysis. Material is taken from the author's extensive teaching and research in Africa, the USA and the UK. Sample problems, references and electronic supplementary material accompany each chapter.

  8. Evaluation of undergraduate nursing students' attitudes towards statistics courses, before and after a course in applied statistics.

    Science.gov (United States)

    Hagen, Brad; Awosoga, Olu; Kellett, Peter; Dei, Samuel Ofori

    2013-09-01

    Undergraduate nursing students must often take a course in statistics, yet there is scant research to inform teaching pedagogy. The objectives of this study were to assess nursing students' overall attitudes towards statistics courses - including (among other things) overall fear and anxiety, preferred learning and teaching styles, and the perceived utility and benefit of taking a statistics course - before and after taking a mandatory course in applied statistics. The authors used a pre-experimental research design (a one-group pre-test/post-test research design), by administering a survey to nursing students at the beginning and end of the course. The study was conducted at a University in Western Canada that offers an undergraduate Bachelor of Nursing degree. Participants included 104 nursing students, in the third year of a four-year nursing program, taking a course in statistics. Although students only reported moderate anxiety towards statistics, student anxiety about statistics had dropped by approximately 40% by the end of the course. Students also reported a considerable and positive change in their attitudes towards learning in groups by the end of the course, a potential reflection of the team-based learning that was used. Students identified preferred learning and teaching approaches, including the use of real-life examples, visual teaching aids, clear explanations, timely feedback, and a well-paced course. Students also identified preferred instructor characteristics, such as patience, approachability, in-depth knowledge of statistics, and a sense of humor. Unfortunately, students only indicated moderate agreement with the idea that statistics would be useful and relevant to their careers, even by the end of the course. Our findings validate anecdotal reports on statistics teaching pedagogy, although more research is clearly needed, particularly on how to increase students' perceptions of the benefit and utility of statistics courses for their nursing

  9. Solar-cell interconnect design for terrestrial photovoltaic modules

    Science.gov (United States)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-01-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain-cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost-reliability design algorithms are also derived, which may allow the minimization of energy cost over the design life of the array field.

  10. Directions for new developments on statistical design and analysis of small population group trials.

    Science.gov (United States)

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated where at least several hundred patients could be recruited. These methods may not be suitable to evaluate therapies if the sample size is unavoidably small, a situation usually termed small populations. The specific sample size cut-off where the standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small

  11. Review of research designs and statistical methods employed in dental postgraduate dissertations.

    Science.gov (United States)

    Shirahatti, Ravi V; Hegde-Shetiya, Sahana

    2015-01-01

    There is a need to evaluate the quality of postgraduate dentistry dissertations submitted to the university in light of international reporting standards. We conducted the review with the objective of documenting the use of sampling methods, measurement standardization, blinding, methods to eliminate bias, appropriate use of statistical tests and appropriate use of data presentation in postgraduate dental research, and to suggest and recommend modifications. The public-access database of dissertations from Rajiv Gandhi University of Health Sciences was reviewed. Three hundred and thirty-three eligible dissertations underwent preliminary evaluation, followed by detailed evaluation of 10% of randomly selected dissertations. The dissertations were assessed against international reporting guidelines such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE), Consolidated Standards of Reporting Trials (CONSORT), and other scholarly resources. The data were compiled using MS Excel and SPSS 10.0. Numbers and percentages were used to describe the data. "In vitro" studies were the most common type of research (39%), followed by observational (32%) and experimental studies (29%). The disciplines of conservative dentistry (92%) and prosthodontics (75%) reported high numbers of in vitro research. The disciplines of oral surgery (80%) and periodontics (67%) had conducted experimental studies as a major share of their research. Lacunae in the studies included observational studies not following random sampling (70%), experimental studies not following random allocation (75%), not mentioning blinding, confounding variables and calibration of measurements, misrepresenting the data by inappropriate data presentation, errors in reporting probability values and not reporting confidence intervals. A few studies showed grossly inappropriate choices of statistical tests and many studies needed additional tests. Overall observations indicated the need to

  12. Optimizing laboratory animal stress paradigms: The H-H* experimental design.

    Science.gov (United States)

    McCarty, Richard

    2017-01-01

    Major advances in behavioral neuroscience have been facilitated by the development of consistent and highly reproducible experimental paradigms that have been widely adopted. In contrast, many different experimental approaches have been employed to expose laboratory mice and rats to acute versus chronic intermittent stress. An argument is advanced in this review that more consistent approaches to the design of chronic intermittent stress experiments would provide greater reproducibility of results across laboratories and greater reliability relating to various neural, endocrine, immune, genetic, and behavioral adaptations. As an example, the H-H* experimental design incorporates control, homotypic (H), and heterotypic (H*) groups and allows for comparisons across groups, where each animal is exposed to the same stressor, but that stressor has vastly different biological and behavioral effects depending upon each animal's prior stress history. Implementation of the H-H* experimental paradigm makes possible a delineation of transcriptional changes and neural, endocrine, and immune pathways that are activated in precisely defined stressor contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2017-03-01

    This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.
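
    The selection step the abstract describes can be reproduced with a standard k-means implementation: cluster the buildings on their demand features, then instrument the building nearest each cluster centre. The sketch below assumes hypothetical feature columns; neither the data nor the feature names come from the article.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per building; columns could be floor area, yearly heat demand and
# hot-water demand (assumed features, with random placeholder values).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))

Xs = StandardScaler().fit_transform(X)   # scale features before clustering
km = KMeans(n_clusters=5, n_init=10, random_state=1).fit(Xs)

# In each cluster, pick the real building closest to the centroid: these
# become the representative buildings to measure experimentally.
representatives = []
for k in range(km.n_clusters):
    members = np.where(km.labels_ == k)[0]
    dists = np.linalg.norm(Xs[members] - km.cluster_centers_[k], axis=1)
    representatives.append(int(members[np.argmin(dists)]))
print("buildings to instrument:", sorted(representatives))
```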

  14. The Effects of Design Strength, Fly Ash Content and Curing Method on Compressive Strength of High Volume Fly Ash Concrete: A Design of Experimental

    Directory of Open Access Journals (Sweden)

    Solikin Mochamad

    2017-01-01

    High volume fly ash concrete is one alternative for producing green concrete, as it uses waste material and significantly reduces the use of Portland cement in concrete production. Although it uses less cement, its compressive strength is comparable to that of ordinary Portland cement (hereafter OPC) concrete and its durability increases significantly. This paper reports an investigation of the effect of design strength, fly ash content and curing method on the compressive strength of high volume fly ash concrete. The experiment and data analysis were prepared using Minitab, a statistical software package for design of experiments. The specimens were concrete cylinders with a diameter of 15 cm and a height of 30 cm, tested for compressive strength at 56 days. The results demonstrate that high volume fly ash concrete can achieve a compressive strength that meets the OPC design strength, especially for high-strength concrete. In addition, the best mix proportion for achieving the design strength is the combination of high-strength concrete and 50% fly ash content. Moreover, the spraying method for on-site curing of concrete is still recommended, as it does not significantly reduce the resulting compressive strength.
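
    A factorial analysis of this kind can be run outside Minitab as well. The sketch below fits a three-factor model with interactions and prints the ANOVA table using statsmodels; the factor levels mirror the abstract, but the response values are synthetic placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(2)
cells = [(ds, fa, cm) for ds in ("normal", "high")
                      for fa in (50, 60, 70)
                      for cm in ("water", "spray")]
rows = [{"design_strength": ds, "fly_ash_pct": fa, "curing": cm,
         # synthetic response: a strength effect, a fly ash effect, noise
         "strength_mpa": 40 + 10 * (ds == "high") - 0.1 * fa + rng.normal(0, 2)}
        for ds, fa, cm in cells for _ in range(3)]   # 3 replicates per cell
df = pd.DataFrame(rows)

model = ols("strength_mpa ~ C(design_strength) * C(fly_ash_pct) * C(curing)",
            data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interactions
```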

  15. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

    Contents include: Introductory Statistical Inference and Regression Analysis; Elementary Statistical Inference; Regression Analysis; Experiments, the Completely Randomized Design (CRD): Classical and Regression Approaches; Experiments; Experiments to Compare Treatments; Some Basic Ideas; Requirements of a Good Experiment; One-Way Experimental Layout or the CRD: Design and Analysis; Analysis of Experimental Data (Fixed Effects Model); Expected Values for the Sums of Squares; The Analysis of Variance (ANOVA) Table; Follow-Up Analysis to Check fo

  16. SCRAED - Simple and Complex Random Assignment in Experimental Designs

    OpenAIRE

    Alferes, Valentim R.

    2009-01-01

    SCRAED is a package of 37 self-contained SPSS syntax files that performs simple and complex random assignment in experimental designs. For between-subjects designs, SCRAED includes simple random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities), block random assignment (simple and generalized blocks), and stratified random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities). For within-subject...

  17. CFAssay: statistical analysis of the colony formation assay

    International Nuclear Information System (INIS)

    Braselmann, Herbert; Michna, Agata; Heß, Julia; Unger, Kristian

    2015-01-01

    Colony formation assay is the gold standard for determining cell reproductive death after treatment with ionizing radiation, applied to different cell lines or in combination with other treatment modalities. The associated linear-quadratic cell survival curves can be calculated with different methods. For easy code exchange and methodological standardisation among collaborating laboratories, a software package, CFAssay for R (R Core Team, R: A Language and Environment for Statistical Computing, 2014), was established to perform thorough statistical analysis of linear-quadratic cell survival curves after treatment with ionizing radiation and of two-way designs of experiments with chemical treatments only. CFAssay offers maximum likelihood and related methods by default; the least squares or weighted least squares method can optionally be chosen. A test for comparison of cell survival curves and an ANOVA test for experimental two-way designs are provided. For the two presented examples, estimated parameters do not differ much between maximum likelihood and least squares. However, the dispersion parameter of the quasi-likelihood method is much more sensitive to statistical variation in the data than the multiple R2 coefficient of determination from the least squares method. The dispersion parameter for goodness of fit and different plot functions in CFAssay help to evaluate experimental data quality. As open-source software, CFAssay facilitates interlaboratory code sharing between users.
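
    CFAssay itself is an R package; as a language-neutral illustration of the underlying model, the snippet below fits the linear-quadratic survival curve S(D) = exp(-(alpha*D + beta*D^2)) by ordinary least squares in Python. The dose-survival points are made up for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])        # Gy (illustrative)
surv = np.array([1.0, 0.80, 0.55, 0.20, 0.05, 0.01])   # surviving fractions

def lq(D, alpha, beta):
    # linear-quadratic cell survival model
    return np.exp(-(alpha * D + beta * D**2))

(alpha, beta), _ = curve_fit(lq, dose, surv, p0=(0.1, 0.01))
print(f"alpha = {alpha:.3f} / Gy, beta = {beta:.4f} / Gy^2")
```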

  18. Providing guidance in virtual lab experimentation : the case of an experiment design tool

    NARCIS (Netherlands)

    Efstathiou, Charalampos; Hovardas, Tasos; Xenofontos, Nikoletta A.; Zacharia, Zacharias C.; de Jong, Ton A.J.M.; Anjewierden, Anjo; van Riesen, Siswa A.N.

    2018-01-01

    The present study employed a quasi-experimental design to assess a computer-based tool, which was intended to scaffold the task of designing experiments when using a virtual lab for the process of experimentation. In particular, we assessed the impact of this tool on primary school students’

  19. An Empirical Study of Parameter Estimation for Stated Preference Experimental Design

    Directory of Open Access Journals (Sweden)

    Fei Yang

    2014-01-01

    The stated preference experimental design can affect the reliability of parameter estimation in discrete choice models. Some scholars have proposed new experimental designs, such as D-efficient and Bayesian D-efficient designs, but insufficient empirical research has been conducted on their effectiveness, and there has been little comparative analysis of the new designs against traditional ones. In this paper, a new metro connecting Chengdu and its satellite cities is taken as the research subject to demonstrate the validity of the D-efficient and Bayesian D-efficient designs. Comparisons between these new designs and an orthogonal design were made in terms of model fit and the standard deviation of parameter estimates; the best model was then used to analyze travel choice behavior. The results indicate that the Bayesian D-efficient design works better than the D-efficient design. Some variables significantly affect people's choice behavior, including waiting time and arrival time. The D-efficient and Bayesian D-efficient designs constructed for the MNL model can still yield reliable results in the ML model, but the ML model cannot exploit the theoretical advantages of these two designs. Finally, the metro is expected to handle over 40% of the passenger flow once it is in operation.
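
    The D-efficiency criterion at the heart of this comparison can be illustrated with a linear-model approximation: eff_D = |X'X / N|^(1/K) for an N-run design with K coded attributes (the proper MNL D-error additionally weights the design by choice probabilities). The designs below are hypothetical 8-run examples, not the Chengdu survey design.

```python
import numpy as np

def d_efficiency(X):
    # |X'X / N|^(1/K): 1.0 for an orthogonal +/-1 design, lower otherwise
    n, k = X.shape
    return np.linalg.det(X.T @ X / n) ** (1.0 / k)

orthogonal = np.array([[ 1,  1,  1], [ 1,  1, -1], [ 1, -1,  1], [ 1, -1, -1],
                       [-1,  1,  1], [-1,  1, -1], [-1, -1,  1], [-1, -1, -1]])
degraded = orthogonal.copy()
degraded[1] = [1, 1, 1]          # duplicate a run, breaking the balance

print(f"orthogonal: {d_efficiency(orthogonal):.3f}")   # 1.000
print(f"degraded:   {d_efficiency(degraded):.3f}")     # < 1
```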

  20. Comparison of Tsallis statistics with the Tsallis-factorized statistics in the ultrarelativistic pp collisions

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2016-01-01

    The Tsallis statistics was applied to describe the experimental data on the transverse momentum distributions of hadrons. We considered the energy dependence of the parameters of the Tsallis-factorized statistics, which is now widely used for the description of the experimental transverse momentum distributions of hadrons, and the Tsallis statistics for the charged pions produced in pp collisions at high energies. We found that the results of the Tsallis-factorized statistics deviate from the results of the Tsallis statistics only at low NA61/SHINE energies when the value of the entropic parameter is close to unity. At higher energies, when the value of the entropic parameter deviates essentially from unity, the Tsallis-factorized statistics satisfactorily recovers the results of the Tsallis statistics. (orig.)
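
    The Tsallis-factorized momentum spectrum the abstract refers to is commonly written as dN/dpT ~ C * pT * (1 + (mT - m)/(n*T))**(-n), with mT = sqrt(pT^2 + m^2) and n = 1/(q - 1) linking n to the entropic parameter q. The sketch below fits that form to synthetic pion data; it illustrates the functional form only, not the NA61/SHINE analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
m_pi = 0.140   # charged pion mass, GeV

def tsallis(pt, C, T, n):
    # Tsallis-factorized transverse-momentum distribution
    mt = np.sqrt(pt**2 + m_pi**2)
    return C * pt * (1.0 + (mt - m_pi) / (n * T)) ** (-n)

pt = np.linspace(0.1, 2.0, 20)                                # GeV/c
data = tsallis(pt, 100.0, 0.12, 8.0) * rng.normal(1.0, 0.03, pt.size)

(C, T, n), _ = curve_fit(tsallis, pt, data, p0=(50.0, 0.10, 10.0))
print(f"T = {1000 * T:.0f} MeV, n = {n:.1f}, q = {1 + 1 / n:.3f}")
```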

  1. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    Science.gov (United States)

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.

  2. Formulation and optimization of chronomodulated press-coated tablet of carvedilol by Box–Behnken statistical design

    Directory of Open Access Journals (Sweden)

    Satwara RS

    2012-08-01

    Rohan S Satwara, Parul K Patel; Department of Pharmaceutics, Babaria Institute of Pharmacy, Vadodara, Gujarat, India. Objective: The primary objective of the present investigation was to formulate and optimize chronomodulated press-coated tablets to deliver an effective quantity of the antihypertensive carvedilol predawn, when a blood pressure spike is typically observed in most hypertensive patients. Experimental work: Preformulation studies and drug-excipient compatibility studies were carried out for carvedilol and excipients. Core tablets (6 mm) containing carvedilol and 10-mm press-coated tablets were prepared by direct compression. The Box–Behnken experimental design was applied to these press-coated tablets (formulations F1–F15) with differing concentrations of rate-controlling polymers. Hydroxypropyl methyl cellulose K4M, ethyl cellulose, and K-carrageenan were used as rate-controlling polymers in the outer layer. The tablets were subjected to various precompression and postcompression tests. The optimized batch was derived both statistically (using the desirability function) and graphically (using Design Expert® 8; Stat-Ease Inc.). Tablets formulated using the optimized formula were then evaluated for lag time and in vitro dissolution. Results and discussion: The results of the preformulation studies were satisfactory. No interaction was observed between carvedilol and excipients by ultraviolet, Fourier transform infrared spectroscopy, and dynamic light scattering analysis. The results of the precompression and postcompression studies were within limits. The lag time and percent cumulative carvedilol release were optimized to obtain a formulation with a 6 h lag time followed by complete carvedilol release after 8 h. The results showed no significant bias between the predicted and actual responses for the optimized formula. Conclusion: Bedtime dosing of chronomodulated press-coated tablets may offer a
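
    A three-factor Box–Behnken design like the one used for F1–F15 can be constructed directly: all +/-1 combinations of each pair of factors with the third held at its centre level, plus centre runs. In the sketch below the coded factors would map to the three rate-controlling polymer concentrations; that mapping is an assumption for illustration.

```python
import itertools
import numpy as np

def box_behnken(n_factors=3, n_center=3):
    # Edge midpoints: +/-1 on each pair of factors, 0 on the rest,
    # followed by replicated centre points.
    runs = []
    for i, j in itertools.combinations(range(n_factors), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * n_factors] * n_center)
    return np.array(runs)

design = box_behnken()
print(design.shape)   # (15, 3): fifteen runs, matching formulations F1-F15
```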

  3. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  5. Scaffolded Instruction Improves Student Understanding of the Scientific Method & Experimental Design

    Science.gov (United States)

    D'Costa, Allison R.; Schlueter, Mark A.

    2013-01-01

    Implementation of a guided-inquiry lab in introductory biology classes, along with scaffolded instruction, improved students' understanding of the scientific method, their ability to design an experiment, and their identification of experimental variables. Pre- and postassessments from experimental versus control sections over three semesters…

  6. Optimal Experimental Design for Large-Scale Bayesian Inverse Problems

    KAUST Repository

    Ghattas, Omar

    2014-01-01

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation

  7. Designing Tasks to Examine Mathematical Knowledge for Teaching Statistics for Primary Teachers

    Science.gov (United States)

    Siswono, T. Y. E.; Kohar, A. W.; Hartono, S.

    2018-01-01

    Mathematical knowledge for teaching (MKT) is viewed as the fuel for conducting an orchestra in a teaching and learning process. By understanding MKT, especially for primary teachers, one can predict the success of an instructional goal and analyze its weaknesses and possible improvements. To explore what teachers think about subject matter, pedagogical terms, and appropriate curriculum, a task is needed that can identify the teachers' MKT, including subject matter knowledge (SMK) and pedagogical content knowledge (PCK). This study aims to design an appropriate task for exploring primary teachers' MKT for statistics in primary school. We designed six tasks to examine 40 primary teachers' MKT, each respectively representing the categories of SMK (common content knowledge (CCK) and specialised content knowledge (SCK)) and PCK (knowledge of content and students (KCS), knowledge of content and teaching (KCT), and knowledge of content and curriculum (KCC)). While MKT has received much attention from scholars, we hypothesize knowledge of content and culture (KCCl) as an additional domain of MKT. Thus, we added one more task examining how the primary teachers used their knowledge of content (KC) related to MKT in statistics. Some examples of the teachers' responses to the tasks are discussed and some refinements of the MKT task in statistics for primary teachers are suggested.

  8. Design of nuclear fuel cells by means of a statistical analysis and a sensibility study

    International Nuclear Information System (INIS)

    Jauregui C, V.; Castillo M, J. A.; Ortiz S, J. J.; Montes T, J. L.; Perusquia del C, R.

    2013-10-01

    This work presents the results of a statistical analysis carried out to study nuclear fuel cell performance, considering the selection frequencies of the fuel rods used in the design of the cells. The rods used for the cell designs were selected in three ways: in the first, the plotted selection frequencies resemble a normal distribution; in the second, the frequency graph is of inverted chi-square (X2) type; and in the third, the rods are chosen at random. The heuristic techniques used for the cell designs were neural networks, ant colonies and a hybrid of scatter search and path relinking. The statistical analysis of the cell designs considered the local power peaking factor and the neutron infinite multiplication factor (k∞) of the cell. In addition, the performance of the designed cells was analyzed by verifying the positions of the rods containing gadolinium. The results show that it is possible to design nuclear fuel cells with good performance when the frequencies of the rods used in their design are taken into account. (Author)

  9. Computational design and experimental validation of new thermal barrier systems

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Shengmin [Louisiana State Univ., Baton Rouge, LA (United States)

    2015-03-31

    The focus of this project is the development of a reliable and efficient ab initio based computational high-temperature material design method that can be used to assist thermal barrier coating (TBC) bond-coat and top-coat design. Experimental evaluations of the new TBCs are conducted to confirm their properties. Southern University is the subcontractor on this project, focusing on development of the computational simulation method. We have applied ab initio density functional theory (DFT) and molecular dynamics simulations to screen top coats and bond coats for gas turbine thermal barrier coating design and validation applications. For experimental validation, our focus is on the hot corrosion performance of different TBC systems. For example, for one of the top coatings studied, we examined the thermal stability of TaZr2.75O8 and confirmed its hot corrosion performance.

  10. A statistical design for testing apomictic diversification through linkage analysis.

    Science.gov (United States)

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels, from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between two plant populations or species as the parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.
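
    In the simplest version of such a design, each offspring either is a maternal clone (with probability p, the apomixis rate) or arises sexually and matches the maternal marker genotype with a known Mendelian probability q. A toy EM iteration for that two-component model is sketched below; the counts and q = 0.5 are hypothetical, and the paper's full multinomial likelihood over reciprocal crosses is not reproduced.

```python
# Toy EM estimate of an apomixis rate p from marker data:
# offspring matching the mother are a mix of clones and sexual offspring,
# non-matching offspring are necessarily sexual.
n_match, n_other = 180, 20   # hypothetical offspring counts
q = 0.5                      # P(matches mother | sexual), e.g. an Aa x aa cross

p = 0.5                      # initial guess
for _ in range(100):
    w = p / (p + (1 - p) * q)                 # E-step: P(clone | match)
    p = w * n_match / (n_match + n_other)     # M-step: expected clone fraction
print(f"estimated apomixis rate: {p:.3f}")    # converges to 0.8 here
```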

  11. Quasi-experimental designs in practice-based research settings: design and implementation considerations.

    Science.gov (United States)

    Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen

    2011-01-01

    Although randomized controlled trials are often a gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that can retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations for two quasi-experimental design approaches that have considerable promise in PBR settings--the stepped-wedge design and a variant of it, the wait-list cross-over design--are presented, along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include: randomization versus stratification; training run-in phases; and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements that improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.

  12. Optimum design of automobile seat using statistical design support system; Tokeiteki sekkei shien system no jidoshayo seat eno tekiyo

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwamura, T [NHK Spring Co. Ltd., Yokohama (Japan); Shiratori, M; Yu, Q; Koda, I [Yokohama National University, Yokohama (Japan)

    1997-10-01

    The authors proposed a new practical optimum design method called the statistical design support system, which consists of five steps: effectivity analysis, reanalysis, evaluation of dispersion, optimization and evaluation of structural reliability. In this study, the authors applied the system to the analysis and optimum design of an automobile seat frame subjected to crushing. The study showed that the method can be applied to complex nonlinear problems such as large deformation and material nonlinearity, as well as impact problems. It was shown that the optimum design of the seat frame could be solved easily using the present system. 6 refs., 5 figs., 5 tabs.

  13. Factorial experimental design intended for the optimization of the alumina purification conditions

    Science.gov (United States)

    Brahmi, Mounaouer; Ba, Mohamedou; Hidri, Yassine; Hassen, Abdennaceur

    2018-04-01

    The objective of this study was to use experimental design methodology to determine the optimal conditions for the removal of some impurities associated with alumina. Three alumina qualities of different origins were investigated under the same conditions. Full-factorial designs applied to the samples of different alumina qualities tracked the removal rates of sodium oxide, and a factorial experimental design was developed to describe the elimination of sodium oxide associated with the alumina. The experimental results showed that chemical analysis by XRF prior to treatment of the samples provided a preliminary picture of the prevailing impurities; sodium oxide constituted the largest share among them. After applying the experimental design, analysis of the effects of the different factors and their interactions showed that better results require reducing the quantity of alumina investigated and increasing the stirring time for the first two samples, whereas the quantity of alumina had to be increased for the third sample. To expand and improve this research, all existing impurities should be taken into account, since we found during this investigation that the levels of some impurities increased after the treatment.

  14. Inference of missing data and chemical model parameters using experimental statistics

    Science.gov (United States)

    Casey, Tiernan; Najm, Habib

    2017-11-01

    A method for determining the joint parameter density of Arrhenius rate expressions through the inference of missing experimental data is presented. This approach proposes noisy hypothetical data sets from target experiments and accepts those which agree with the reported statistics, in the form of nominal parameter values and their associated uncertainties. The data exploration procedure is formalized using Bayesian inference, employing maximum entropy and approximate Bayesian computation methods to arrive at a joint density on data and parameters. The method is demonstrated in the context of reactions in the H2-O2 system for predictive modeling of combustion systems of interest. Work supported by the US DOE BES CSGB. Sandia National Labs is a multimission lab managed and operated by Nat. Technology and Eng'g Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell Intl, for the US DOE NCSA under contract DE-NA-0003525.
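
    A rejection-ABC version of this idea fits in a few lines: propose Arrhenius parameters from a prior, generate a noisy pseudo-datum, and keep proposals whose summary agrees with the "reported" statistic. Everything numeric below (priors, reference temperature, reported value) is a made-up stand-in, not the H2-O2 data of the work.

```python
import numpy as np

rng = np.random.default_rng(3)
R, T_ref = 8.314, 1000.0            # gas constant J/(mol K), reference T in K
lnk_reported, lnk_sd = -4.0, 0.2    # hypothetical reported ln k and uncertainty

accepted = []
for _ in range(200_000):
    lnA = rng.uniform(10.0, 30.0)    # prior on ln(pre-exponential factor)
    Ea = rng.uniform(1.0e5, 3.0e5)   # prior on activation energy, J/mol
    # Arrhenius: ln k = ln A - Ea / (R T); add reporting noise
    lnk_sim = lnA - Ea / (R * T_ref) + rng.normal(0.0, lnk_sd)
    if abs(lnk_sim - lnk_reported) < 0.5 * lnk_sd:   # ABC acceptance rule
        accepted.append((lnA, Ea))

joint = np.array(accepted)
print(f"accepted {len(joint)} joint (lnA, Ea) draws")
print("posterior corr(lnA, Ea):", np.corrcoef(joint.T)[0, 1].round(3))
```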

  15. Overview of design development of FCC-hh Experimental Interaction Regions

    CERN Document Server

    Abelleira, Jose; Cruz Alaniz, Emilia; Van Riesen-Haupt, Leon; Benedikt, Michael; Besana, Maria Ilaria; Buffat, Xavier; Burkhardt, Helmut; Cerutti, Francesco; Langner, Andy Sven; Martin, Roman; Riegler, Werner; Schulte, Daniel; Tomas Garcia, Rogelio; Appleby, Robert Barrie; Rafique, Haroon; Barranco Garcia, Javier; Pieloni, Tatiana; Boscolo, Manuela; Collamati, Francesco; Nevay, Laurence James; Hofer, Michael

    2017-01-01

    The experimental interaction region (EIR) is one of the key areas that define the performance of the Future Circular Collider. In this overview we describe the status and evolution of the design of the FCC-hh EIR, focusing on the design of the optics, energy deposition in EIR elements, beam-beam effects and machine-detector interface issues.

  16. Second preliminary design of JAERI experimental fusion reactor (JXFR)

    International Nuclear Information System (INIS)

    Sako, Kiyoshi; Tone, Tatsuzo; Seki, Yasushi; Iida, Hiromasa; Yamato, Harumi

    1979-06-01

    Second preliminary design of a tokamak experimental fusion reactor to be built in the near future has been performed. This design covers the overall reactor system, including plasma characteristics, reactor structure, blanket neutronics, radiation shielding, superconducting magnets, the neutral beam injector, the electric power supply system, the fuel recirculating system, the reactor cooling and tritium recovery systems, and the maintenance scheme. Safety analyses of the reactor system have also been performed. This paper gives a brief description of the design as of January 1979. The feasibility of raising the power density has also been studied and is presented in an appendix. (author)

  17. Conceptual design study of Fusion Experimental Reactor (FY87FER)

    International Nuclear Information System (INIS)

    1988-05-01

    The design study of the Fusion Experimental Reactor (FER), which has been proposed as the next-step fusion device, has been conducted by the JAERI Reactor System Laboratory since 1982 and by the FER design team since 1984. This is the final report of the FER design team program and describes the results obtained in the FY1987 activities (partially in FY1986). The contents of this report consist of the reference design, which is based on the FY1986 guideline of the Subcommittees set up in the Nuclear Fusion Council of the Atomic Energy Commission of Japan, the Low-Physics-Risk reactor design for achieving the physics mission more reliably, and the system study of FER design candidates including the above two designs. (author)

  18. Statistical aspects of quantitative real-time PCR experiment design.

    Science.gov (United States)

    Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales

    2010-04-01

    Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect, the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, which gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and the optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
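
    The core calculation is small enough to sketch: with pilot estimates of the biological and technical variance components, the variance of a group mean under n_bio biological x n_tech technical replicates yields a prospective power via a normal approximation. The component values and effect size below are assumptions for illustration, not powerNest itself.

```python
import numpy as np
from scipy import stats

sigma2_bio, sigma2_tech = 0.30, 0.05   # pilot variance components (Cq^2, assumed)
effect, alpha = 1.0, 0.05              # true group difference (Cq) and test level

def power(n_bio, n_tech):
    # variance of one group mean with nested replication
    var_mean = sigma2_bio / n_bio + sigma2_tech / (n_bio * n_tech)
    se = np.sqrt(2 * var_mean)         # difference of two independent group means
    z = stats.norm.ppf(1 - alpha / 2)
    return (1 - stats.norm.cdf(z - effect / se)
            + stats.norm.cdf(-z - effect / se))

for n_bio, n_tech in [(3, 3), (6, 2), (12, 1)]:
    print(f"{n_bio} animals x {n_tech} replicates -> power {power(n_bio, n_tech):.2f}")
```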

  19. First preliminary design of an experimental fusion reactor

    International Nuclear Information System (INIS)

    1977-09-01

    A preliminary design of a tokamak experimental fusion reactor to be built in the near future is under way. The goals of the reactor are to achieve reactor-level plasma conditions for a sufficiently long operation period and to obtain design, construction and operational experience for the main components of full-scale power reactors. This design covers the overall reactor system, including plasma characteristics, reactor structure, blanket neutronics, shielding, superconducting magnets, the neutral beam injector, the electric power supply system, the fuel circulating system, the reactor cooling system, the tritium recovery system and the maintenance scheme. The main design parameters are as follows: reactor fusion power 100 MW, torus radius 6.75 m, plasma radius 1.5 m, first wall radius 1.75 m, toroidal magnetic field on axis 6 T, blanket fertile material Li2O, coolant He, structural material 316SS and tritium breeding ratio 0.9. (auth.)

  20. Combined data preprocessing and multivariate statistical analysis characterizes fed-batch culture of mouse hybridoma cells for rational medium design.

    Science.gov (United States)

    Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup

    2010-10-01

    We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis techniques. Initially, specific rates of cell growth, glucose/amino acid consumption and mAb/metabolite production were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine; (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine; and (iii) lysine, valine and isoleucine. Further analysis using partial least squares (PLS) regression identified key amino acids that were positively or negatively correlated with cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
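
    The PCA-then-PLS workflow described here maps directly onto scikit-learn. The sketch below runs both steps on random placeholder data standing in for time-course consumption rates; the dimensions and the response are assumptions, not the study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(20, 12))   # rows: time points; columns: 12 amino acid rates
y = X[:, :3].sum(axis=1) + rng.normal(0, 0.5, size=20)   # mock mAb productivity

Xs = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)   # cluster structure of consumption profiles
print("variance explained by 2 PCs:", pca.explained_variance_ratio_.sum().round(2))

pls = PLSRegression(n_components=2).fit(Xs, y)
# Large |weights| flag amino acids positively/negatively tied to productivity.
print("PLS x-weights, component 1:", pls.x_weights_[:, 0].round(2))
```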

  1. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue.

    Science.gov (United States)

    Diestelkamp, Wiebke S; Krane, Carissa M; Pinnell, Margaret F

    2011-05-20

    Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types), resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized, and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. The design and implementation of a split-plot experimental test matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance.
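
    A rough Python analogue of the PROC MIXED analysis is sketched below: a random intercept per whole plot stands in for the randomization restriction. The data generation, factor names and effect sizes are invented for illustration and do not reproduce the paper's design exactly:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for plot in range(24):                        # whole plots (randomization units)
    temp = "high" if plot % 2 else "low"      # whole-plot factor
    plot_effect = rng.normal(0, 50)           # random whole-plot error
    for solution in ["saline", "heparin", "albumin"]:   # sub-plot factors
        for vessel in ["carotid", "renal"]:
            burst = 300 + 40 * (temp == "high") + plot_effect + rng.normal(0, 30)
            rows.append(dict(plot=plot, temp=temp, solution=solution,
                             vessel=vessel, burst=burst))
df = pd.DataFrame(rows)

# Random intercept per whole plot, analogous to a RANDOM statement in PROC MIXED.
model = smf.mixedlm("burst ~ temp * solution * vessel", df, groups="plot")
print(model.fit().summary())
```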

  2. Design of JT-60SA magnets and associated experimental validations

    International Nuclear Information System (INIS)

    Zani, L.; Barabaschi, P.; Peyrot, M.; Meunier, L.; Tomarchio, V.; Duglue, D.; Decool, P.; Torre, A.; Marechal, J.L.; Della Corte, A.; Di Zenobio, A.; Muzzi, L.; Cucchiaro, A.; Turtu, S.; Ishida, S.; Yoshida, K.; Tsuchiya, K.; Kizu, K.; Murakami, H.

    2011-01-01

    In the framework of the JT-60SA project, aiming at upgrading the present JT-60U tokamak to a fully superconducting configuration, the detailed design phase led to the adoption of a brand new design for the three main magnet systems. Europe (EU) is expected to provide Japan (JA) with the totality of the toroidal field (TF) magnet system, while JA will provide both the Equilibrium Field (EF) and Central Solenoid (CS) systems. All magnet designs were optimized over the past years and entered in parallel into extensive experimentally-based phases of concept validation, which came to maturation in 2009 and 2010. For this, all magnet systems were investigated by means of dedicated samples, e.g. conductor and joint samples designed, manufactured and tested at full scale in ad hoc facilities either in EU or in JA. The present paper, after an overall description of the magnet system layouts, presents in a general approach the different experimental campaigns dedicated to qualifying the design and manufacturing processes of the coils, conductors and electrical joints. The main results with the associated analyses are shown and the main conclusions presented, especially regarding their contribution to supporting the start of magnet mass production. The status of the respective manufacturing stages in EU and in JA is also outlined. (authors)

  3. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Science.gov (United States)

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism, but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for the networks of Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and non-linear approach (S-criterion). Both approaches generate almost the same input mixture; however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position one labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective design helps reveal cost-effective compromise experiments.
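
    The D-criterion used for the linear screening step is straightforward to compute. The sketch below ranks candidate tracer mixtures by the log-determinant of the Fisher information; the sensitivity matrices here are random placeholders, whereas in practice they would come from a 13C-MFA model evaluated at a reference flux distribution:

```python
import numpy as np

rng = np.random.default_rng(2)
n_measurements, n_free_fluxes = 40, 8
candidates = {f"mixture_{i}": rng.normal(size=(n_measurements, n_free_fluxes))
              for i in range(5)}               # placeholder sensitivity matrices

def log_d_criterion(J, meas_var=1e-4):
    """log-determinant of the Fisher information J^T J / var; larger is better."""
    sign, logdet = np.linalg.slogdet(J.T @ J / meas_var)
    return logdet if sign > 0 else -np.inf

for name in sorted(candidates, key=lambda k: log_d_criterion(candidates[k]),
                   reverse=True):
    print(name, round(log_d_criterion(candidates[name]), 1))
```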

  4. Factorial experimental design for recovering heavy metals from sludge with ion-exchange resin

    International Nuclear Information System (INIS)

    Lee, I.H.; Kuan, Y.-C.; Chern, J.-M.

    2006-01-01

    Wastewaters containing heavy metals are usually treated by the chemical precipitation method in Taiwan. This method can remove heavy metals from wastewaters efficiently, but the resultant heavy metal sludge is classified as hazardous solid waste and becomes another environmental problem. If we can remove heavy metals from the sludge, it becomes non-hazardous waste and the treatment cost can be greatly reduced. This study aims at using ion-exchange resin to remove heavy metals such as copper, zinc, cadmium, and chromium from sludge generated by a PCB manufacturing plant. Factorial experimental design methodology was used to study the heavy metal removal efficiency. The total metal concentrations in the sludge, resin, and solution phases were measured after 30 min of reaction with varying leaching agents (citric acid and nitric acid), ion-exchange resins (Amberlite IRC-718 and IR-120), and temperatures (50 and 70 °C). The experimental results and statistical analysis show that a stronger leaching acid and a higher temperature both favor lower heavy metal residues in the sludge. Two-factor and even three-factor interaction effects on the heavy metal sorption in the resin phase are not negligible. The ion-exchange resin plays an important role in the sludge extraction or metal recovery. Empirical regression models were also obtained and used to predict the heavy metal profiles with satisfactory results.
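
    A minimal sketch of this kind of factorial analysis in Python is given below: a 2x2x2 design (acid, resin, temperature) fitted with all main effects and interactions, using synthetic removal efficiencies in place of the measured data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
rows = [dict(acid=a, resin=r, temp=t,
             removal=rng.normal(60 + 15 * (a == "nitric") + 10 * (t == 70), 3))
        for a in ["citric", "nitric"]
        for r in ["IRC-718", "IR-120"]
        for t in [50, 70]
        for _ in range(3)]                     # three replicates per cell
df = pd.DataFrame(rows)

fit = smf.ols("removal ~ C(acid) * C(resin) * C(temp)", data=df).fit()
print(anova_lm(fit, typ=2))                    # main effects and interactions
```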

  5. Statistical reconstruction for cosmic ray muon tomography.

    Science.gov (United States)

    Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J

    2007-08-01

    Highly penetrating cosmic ray muons constantly shower the earth at a rate of about 1 muon per cm² per minute. We have developed a technique which exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we have described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum likelihood/expectation maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictates differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and events departing from the statistical model.
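
    For orientation, the sketch below shows the iterative structure of a generic maximum-likelihood/expectation-maximization (MLEM) reconstruction with a Poisson likelihood; the paper's algorithm replaces this with a likelihood tailored to the Gaussian statistics of muon multiple scattering, so this is only an illustration of the update scheme:

```python
import numpy as np

rng = np.random.default_rng(4)
n_rays, n_voxels = 200, 50
A = rng.uniform(size=(n_rays, n_voxels))       # system matrix (e.g. path lengths)
x_true = rng.exponential(1.0, n_voxels)
y = rng.poisson(A @ x_true)                    # simulated measurements

x = np.ones(n_voxels)                          # flat initial image
sensitivity = A.sum(axis=0)                    # A^T 1
for _ in range(100):
    ratio = y / np.maximum(A @ x, 1e-12)       # guard against division by zero
    x *= (A.T @ ratio) / sensitivity           # multiplicative MLEM update

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```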

  6. ITER [International Thermonuclear Experimental Reactor] reactor building design study

    International Nuclear Information System (INIS)

    Thomson, S.L.; Blevins, J.D.; Delisle, M.W.

    1989-01-01

    The International Thermonuclear Experimental Reactor (ITER) is at the midpoint of a two-year conceptual design. The ITER reactor building is a reinforced concrete structure that houses the tokamak and associated equipment and systems and forms a barrier between the tokamak and the external environment. It provides radiation shielding and controls the release of radioactive materials to the environment during both routine operations and accidents. The building protects the tokamak from external events, such as earthquakes or aircraft strikes. The reactor building requirements have been developed from the component designs and the preliminary safety analysis. The equipment requirements, tritium confinement, and biological shielding have been studied. The building design in progress requires continuous interaction with the component and system designs and with the safety analysis. 8 figs

  7. Effect of carboxymethylcellulose on the rheological and filtration properties of bentonite clay samples determined by experimental planning and statistical analysis

    Directory of Open Access Journals (Sweden)

    B. M. A. Brito

    Full Text Available Abstract Over the past few years, considerable research has been conducted using the techniques of mixture design and statistical modeling. Through this methodology, applications in various technological fields have been developed and optimized, especially in clay technology, leading to greater efficiency and reliability. This work studied the influence of carboxymethylcellulose on the rheological and filtration properties of bentonite dispersions to be applied in water-based drilling fluids, using experimental planning and statistical analysis for clay mixtures. The dispersions were prepared according to Petrobras standard EP-1EP-00011-A, which deals with the testing of water-based drilling fluid viscosifiers for oil prospecting. The clay mixtures were transformed into sodic compounds, and carboxymethylcellulose additives of high and low molar mass were added in order to improve their rheology and filtrate volume. Experimental planning and statistical analysis were used to verify the effect. The regression models were calculated for the relation between the compositions and the following rheological properties: apparent viscosity, plastic viscosity, and filtrate volume. The significance and validity of the models were confirmed. The results showed that the 3D response surfaces of the compositions with high molecular weight carboxymethylcellulose added were the ones that most contributed to the rise in apparent viscosity and plastic viscosity, and that those with low molecular weight were the ones that most helped in the reduction of the filtrate volume. Another important observation is that experimental planning and statistical analysis can be used as an important auxiliary tool to optimize the rheological properties and filtrate volume of bentonite clay dispersions for use in drilling fluids when carboxymethylcellulose is added.

  8. Statistical examination of particles in a turbulent, non-dilute particle suspension flow: experimental measurements

    International Nuclear Information System (INIS)

    Souza, R.C.; Jones, B.G.

    1986-01-01

    An experimental study of particles suspended in fully developed turbulent water flow in a vertical pipe was done. Three series of experiments were conducted to investigate the statistical behaviour of particles in nondilute turbulent suspension flow, for two particle densities and particle sizes, and for several particle volume loadings ranging from 0 to 1 percent. The mean free fall velocity of the particles was determined at these various particle volume loadings, and the phenomenon of cluster formation was observed. The precise volume loading which gives the maximum relative settling velocity was observed to depend on particle density and size. (E.G.) [pt

  9. Conceptual design study of fusion experimental reactor (FER)

    International Nuclear Information System (INIS)

    1986-11-01

    Since 1980 a design study has been conducted at JAERI for the Fusion Experimental Reactor (FER), which has been proposed as the successor to JT-60 in the Japanese long-term program of fusion reactor development. During the two years from 1984 to 1985 the FER concept was reviewed and redesigned. This report is the summary of the results obtained in the review and redesign activities in 1984 and 1985. In the first year the FER concept was discussed again and its framework was reestablished. According to the new framework the major reactor components of FER were designed. In the second year the whole plant system design including the plant layout plan was conducted, as well as more detailed design analysis of the reactor components. The newly established framework for the FER design is as follows: 1) Plasma : Self-ignition. 2) Operation scenario : Quasi-steady state operation with long burn pulse. 3) Neutron fluence on the first wall : 0.3 MW·y/m². 4) Blanket : Non-tritium-breeding blanket with test modules for breeding blanket development. 5) Magnets : Superconducting magnets. (author)

  10. Catalytic Cracking of Palm Oil Over Zeolite Catalysts: Statistical Approach

    Directory of Open Access Journals (Sweden)

    F. A. A. Twaiq and S. Bhatia

    2012-08-01

    Full Text Available The catalytic cracking of palm oil was conducted in a fixed bed micro-reactor over HZSM-5, zeolite β and ultrastable Y (USY) zeolite catalysts. The objective of the present investigation was to study the effect of cracking reaction variables such as temperature, weight hourly space velocity, catalyst pore size and type of palm oil feed of different molecular weight on the conversion, the yield of hydrocarbons in the gasoline boiling range, and BTX aromatics in the organic liquid product. Statistical Design of Experiment (DOE) with a 2⁴ full factorial design was used in the first stage of experimentation. A nonlinear model and Response Surface Methodology (RSM) were utilized in the second stage of experimentation to obtain the optimum values of the variables for maximum yields of hydrocarbons in the gasoline boiling range and aromatics. HZSM-5 showed the best performance amongst the three catalysts tested. At 623 K and a WHSV of 1 h⁻¹, the highest experimental yields of gasoline and aromatics were 28.3 wt.% and 27 wt.%, respectively, over the HZSM-5 catalyst. For the same catalyst, the statistical model predicted an optimum gasoline yield of 28.1 wt.% at a WHSV of 1.75 h⁻¹ and 623 K, and a predicted optimum aromatics yield of 25.5 wt.% at 623 K and a WHSV of 1 h⁻¹. KEY WORDS: Catalytic Cracking, Palm Oil, Zeolite, Design Of Experiment, Response Surface Methodology.
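
    The second-stage RSM step can be sketched as follows: fit a quadratic model of yield in temperature and WHSV, then locate its optimum on a grid. The data points and coefficients below are synthetic placeholders, not the study's measurements:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(5)
T = rng.uniform(573, 723, 20)                  # temperature, K
W = rng.uniform(1, 4, 20)                      # WHSV, 1/h
y = 28 - 0.001 * (T - 623) ** 2 - 2 * (W - 1.75) ** 2 + rng.normal(0, 0.3, 20)

# Quadratic response surface in the two factors.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(np.column_stack([T, W])), y)

# Evaluate the fitted surface on a grid and report the predicted optimum.
Tg, Wg = np.meshgrid(np.linspace(573, 723, 60), np.linspace(1, 4, 60))
grid = np.column_stack([Tg.ravel(), Wg.ravel()])
pred = model.predict(poly.transform(grid))
Topt, Wopt = grid[pred.argmax()]
print(f"predicted optimum: T={Topt:.0f} K, WHSV={Wopt:.2f} 1/h, "
      f"yield={pred.max():.1f} wt.%")
```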

  11. Statistical method for the determination of the ignition energy of dust cloud - experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, S.; Lebecki, K.; Gillard, P.; Youinou, L.; Baudry, G. [University of Orleans, Bourges (France)

    2010-05-15

    Powdery materials such as metallic or polymer powders play a considerable role in many industrial processes. Their use requires the introduction of preventive safeguards to control plant safety. The mitigation of an explosion hazard, according to the ATEX 137 Directive (1999/92/EC), requires the assessment of the dust ignition sensitivity. The PRISME laboratory (University of Orleans) has developed an experimental set-up and methodology, using the Langlie test, for the quick determination of the explosion sensitivity of dusts. This method requires only 20 shots, and the ignition sensitivity is evaluated through E₅₀, the energy with an ignition probability of 0.5. A Hartmann tube with a volume of 1.3 L was designed and built. Many results on the energy ignition thresholds of partially oxidised aluminium were obtained using this experimental device and compared to the literature. The evolution of E₅₀ follows that of the minimum ignition energy (MIE), but their respective values differ: the MIE is lower than E₅₀, and the link between E₅₀ and the MIE has not been elucidated. In this paper, the Langlie method is explained in detail for the determination of the parameters (mean value E₅₀ and standard deviation σ) of the associated statistical law. The ignition probability versus applied energy was first measured for lycopodium in order to validate the method. A comparison between the normal and the lognormal laws was carried out, and the best fit was obtained with the lognormal law. In the second part, the Langlie test was performed on different dusts such as aluminium, cornstarch, lycopodium, coal and PA12 in order to determine E₅₀ and σ for each dust. The energies E₀₅ and E₁₀, corresponding respectively to ignition probabilities of 0.05 and 0.1, were determined with the lognormal law and compared to MIE values found in the literature. The E₀₅ and E₁₀ values were found to be very close to, and in good agreement with, the MIE values reported in the literature.
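
    The estimation step lends itself to a short sketch: given the (energy, ignition) pairs recorded during a Langlie sequence, fit the lognormal ignition-probability law by maximum likelihood and read off E₅₀, E₁₀ and E₀₅. The shot data below are invented, and the sequential level-selection logic of the Langlie procedure itself is omitted:

```python
import numpy as np
from scipy import optimize, stats

energy = np.array([50, 25, 37, 31, 34, 45, 40, 37, 39, 38.0])  # applied energy, mJ
ignited = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 1])             # 1 = ignition

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                  # keeps sigma positive
    p = stats.norm.cdf((np.log(energy) - mu) / sigma)   # lognormal ignition law
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(ignited * np.log(p) + (1 - ignited) * np.log(1 - p))

res = optimize.minimize(neg_log_lik, x0=[np.log(37.0), 0.0], method="Nelder-Mead")
mu, sigma = res.x[0], np.exp(res.x[1])
for q, name in [(0.50, "E50"), (0.10, "E10"), (0.05, "E05")]:
    print(f"{name} = {np.exp(mu + sigma * stats.norm.ppf(q)):.1f} mJ")
```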

  12. Measuring and Advancing Experimental Design Ability in an Introductory Course without Altering Existing Lab Curriculum

    Directory of Open Access Journals (Sweden)

    Ryan A. Shanks

    2017-05-01

    Full Text Available Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT and uncovered two latent factors which provide valid means to assess this overlay model’s ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.

  13. Design and experimental study of a solar system for heating water ...

    African Journals Online (AJOL)

    This work presents the design and an experimental study of a linear Fresnel solar reflector with a trapezoidal cavity. This prototype is used for heating tap water. The reflector was designed, constructed and tested in the mechanical engineering department, University of Blida 1, Algeria. Various combinations of reflecting mirrors ...

  14. Application of D-optimal experimental design method to optimize the formulation of O/W cosmetic emulsions.

    Science.gov (United States)

    Djuris, J; Vasiljevic, D; Jokic, S; Ibric, S

    2014-02-01

    This study investigates the application of D-optimal mixture experimental design in the optimization of O/W cosmetic emulsions. Cetearyl glucoside was used as a natural, biodegradable non-ionic emulsifier at a relatively low concentration (1%), and a mixture of co-emulsifiers (stearic acid, cetyl alcohol, stearyl alcohol and glyceryl stearate) was used to stabilize the formulations. To determine the optimal composition of the co-emulsifier mixture, D-optimal mixture experimental design was used. Prepared emulsions were characterized by rheological measurements, a centrifugation test, and specific conductivity and pH measurements. All prepared samples appeared as white and homogeneous creams, except for one homogeneous and viscous lotion co-stabilized by stearic acid alone. Centrifugation testing revealed some phase separation only in the case of the sample co-stabilized using glyceryl stearate alone. The obtained pH values indicated that all samples had mildly acidic values acceptable for cosmetic preparations. The specific conductivity values are attributed to multiple-phase O/W emulsions with high percentages of fixed water. The rheological measurements showed that the investigated samples exhibited non-Newtonian thixotropic behaviour. To determine the influence of each of the co-emulsifiers on emulsion properties, the obtained results were evaluated by means of statistical analysis (ANOVA test). On the basis of a comparison of statistical parameters for each of the studied responses, the reduced quadratic mixture model was selected over the linear model, implying that interactions between co-emulsifiers play a significant role in the overall influence of co-emulsifiers on emulsion properties. Glyceryl stearate was found to be the dominant co-emulsifier affecting emulsion properties. Interactions between glyceryl stearate and the other co-emulsifiers were also found to significantly influence emulsion properties. These findings are especially important

  15. A comparative study of two statistical approaches for the analysis of real seismicity sequences and synthetic seismicity generated by a stick-slip experimental model

    Science.gov (United States)

    Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano

    2015-04-01

    Two statistical approaches are analyzed for two different types of data sets: the seismicity generated by the subduction processes that occurred at the southern Pacific coast of Mexico between 2005 and 2012, and the synthetic seismic data generated by a stick-slip experimental model. The statistical methods used in the present study are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The purpose of this comparison is to show the similarities between the dynamical behaviors of both types of data sets from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental set-up globally reproduces the behavior observed when the same statistical approaches are applied to the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
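
    The natural visibility graph used for the time-dynamics analysis is simple to construct: two events are linked if the straight line between them clears every intermediate sample. A minimal sketch, with an invented magnitude series:

```python
import numpy as np

def visibility_graph(y, t=None):
    """Natural visibility graph: edge (a, b) if every sample strictly between
    a and b lies below the straight line joining (t_a, y_a) and (t_b, y_b)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y), dtype=float) if t is None else np.asarray(t, dtype=float)
    edges = []
    for a in range(len(y)):
        for b in range(a + 1, len(y)):
            line = y[a] + (y[b] - y[a]) * (t[a + 1:b] - t[a]) / (t[b] - t[a])
            if np.all(y[a + 1:b] < line):
                edges.append((a, b))
    return edges

magnitudes = [3.1, 4.0, 3.5, 5.2, 3.3, 3.9, 4.4]   # invented event series
edges = visibility_graph(magnitudes)
print("edges:", edges)
print("node degrees:", np.bincount(np.ravel(edges), minlength=len(magnitudes)))
```

    The degree distribution of the resulting graph is what characterizes the time dynamics of the sequence.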

  16. Citizen Data and Official Statistics: Background Document to a Collaborative Workshop

    DEFF Research Database (Denmark)

    Grommé, Francisca; Ustek, Funda; Ruppert, Evelyn

    2017-01-01

    This working paper was written in preparation for a collaborative workshop organised for statisticians, social scientists, information and app designers and other participants inside and outside academia. The autumn 2017 workshop aimed to develop the main principles for a citizen data app...... for official statistics. Through this work we sought to conceive of a new regime of data collection in official statistics through different devices. How can we capture citizens’ meanings and intentions when they produce data? Can we develop ‘smart’ methods that do not rely on cooperating with, and data...... generated by, large tech companies, but by developing methods and data co-produced with citizens? Towards addressing these issues we developed four key concepts outlined in this document: experimentalism, citizen data, smart statistics and privacy by design. We introduced these concepts to facilitate shared...

  17. Targeting Change: Assessing a Faculty Learning Community Focused on Increasing Statistics Content in Life Science Curricula

    Science.gov (United States)

    Parker, Loran Carleton; Gleichsner, Alyssa M.; Adedokun, Omolola A.; Forney, James

    2016-01-01

    Transformation of research in all biological fields necessitates the design, analysis, and interpretation of large data sets. Preparing students with the requisite skills in experimental design, statistical analysis and interpretation, and mathematical reasoning will require both curricular reform and faculty who are willing and able to integrate…

  18. TIBER: Tokamak Ignition/Burn Experimental Research. Final design report

    International Nuclear Information System (INIS)

    Henning, C.D.; Logan, B.G.; Barr, W.L.

    1985-01-01

    The Tokamak Ignition/Burn Experimental Research (TIBER) device is the smallest superconducting tokamak designed to date. In the design, plasma shaping is used to achieve a high plasma beta. Neutron shielding is minimized to achieve the desired small device size, but the superconducting magnets must be shielded sufficiently to reduce the neutron heat load and the gamma-ray dose to various components of the device. Specifications of the plasma-shaping coil, the shielding, the cooling requirements, and the heating modes are given. 61 refs., 92 figs., 30 tabs

  19. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and the analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  20. D-Optimal mixture experimental design for stealth biodegradable crosslinked docetaxel-loaded poly-ε-caprolactone nanoparticles manufactured by dispersion polymerization.

    Science.gov (United States)

    Ogunwuyi, O; Adesina, S; Akala, E O

    2015-03-01

    We report here our efforts on the development of stealth biodegradable crosslinked poly-ε-caprolactone nanoparticles by free radical dispersion polymerization suitable for the delivery of bioactive agents. The uniqueness of the dispersion polymerization technique is that it is surfactant free, thereby obviating the problems known to be associated with the use of surfactants in the fabrication of nanoparticles for biomedical applications. Aided by statistical software for experimental design and analysis, we used D-optimal mixture statistical experimental design to generate thirty batches of nanoparticles prepared by varying the proportions of the components (poly-ε-caprolactone macromonomer, crosslinker, initiators and stabilizer) in an acetone/water system. Morphology of the nanoparticles was examined using scanning electron microscopy (SEM). Particle size and zeta potential were measured by dynamic light scattering (DLS). Scheffé polynomial models were generated to predict particle size (nm) and particle surface zeta potential (mV) as functions of the proportions of the components. Solutions were returned from simultaneous optimization of the response variables for component combinations to (a) minimize nanoparticle size (small nanoparticles are internalized into diseased organs easily, avoid reticuloendothelial clearance and lung filtration) and (b) maximize the negative zeta potential values, as it is known that, following injection into the blood stream, nanoparticles with a positive zeta potential pose a threat of causing transient embolism and rapid clearance compared to negatively charged particles. In vitro availability isotherms show that the nanoparticles sustained the release of docetaxel for 72 to 120 hours depending on the formulation. The data show that nanotechnology platforms for controlled delivery of bioactive agents can be developed based on these nanoparticles.
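
    For orientation, a Scheffé quadratic mixture model of the kind named above can be fitted by ordinary least squares over the component proportions (no intercept; terms x_i and x_i·x_j). The compositions and responses below are placeholders, not the study's data:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(6)
X = rng.dirichlet(np.ones(4), size=30)         # 30 runs, 4 proportions summing to 1
size = 80 + X @ np.array([40, 10, -20, 5]) + rng.normal(0, 3, 30)  # placeholder

def scheffe_terms(X):
    """Columns x_i followed by all pairwise products x_i * x_j (no intercept)."""
    pairs = [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack([X] + pairs)

beta, *_ = np.linalg.lstsq(scheffe_terms(X), size, rcond=None)
new_mix = np.array([[0.55, 0.25, 0.15, 0.05]])
print("predicted size (nm):", (scheffe_terms(new_mix) @ beta).item())
```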

  1. A new mathematical approach for the estimation of the AUC and its variability under different experimental designs in preclinical studies.

    Science.gov (United States)

    Navarro-Fontestad, Carmen; González-Álvarez, Isabel; Fernández-Teruel, Carlos; Bermejo, Marival; Casabó, Vicente Germán

    2012-01-01

    The aim of the present work was to develop a new mathematical method for estimating the area under the curve (AUC) and its variability that could be applied in different preclinical experimental designs and that is amenable to implementation in standard calculation worksheets. In order to assess the usefulness of the new approach, different experimental scenarios were studied and the results were compared with those obtained with commonly used software: WinNonlin® and Phoenix WinNonlin®. The results do not show statistical differences between the AUC values obtained by the two procedures, but the new method appears to be a better estimator of the AUC standard error, measured as the coverage of the 95% confidence interval. In this way, the new proposed method proves to be as useful as the WinNonlin® software when the latter is applicable. Copyright © 2011 John Wiley & Sons, Ltd.
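
    The paper's own estimator is not reproduced here, but a classical baseline for the AUC and its standard error under destructive (one-sample-per-animal) designs, Bailer's method, illustrates the kind of worksheet-friendly calculation involved; the concentration data are invented:

```python
import numpy as np

t = np.array([0.5, 1, 2, 4, 8, 24.0])                 # sampling times, h
mean_c = np.array([12.0, 18.5, 15.2, 9.8, 4.1, 0.6])  # mean concentration per time
sd_c = np.array([2.1, 3.0, 2.5, 1.8, 0.9, 0.2])       # SD per time
n = np.array([4, 4, 4, 4, 4, 4])                      # animals per time point

# Trapezoidal weights: w_1=(t_2-t_1)/2, w_i=(t_{i+1}-t_{i-1})/2, w_m=(t_m-t_{m-1})/2
w = np.empty_like(t)
w[0], w[-1] = (t[1] - t[0]) / 2, (t[-1] - t[-2]) / 2
w[1:-1] = (t[2:] - t[:-2]) / 2

auc = np.sum(w * mean_c)                              # AUC = sum_i w_i * mean_i
se_auc = np.sqrt(np.sum(w**2 * sd_c**2 / n))          # Var = sum_i w_i^2 s_i^2 / n_i
print(f"AUC = {auc:.1f} +/- {se_auc:.1f}")
```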

  2. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  3. Interconnect fatigue design for terrestrial photovoltaic modules

    Science.gov (United States)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue, which has led to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous, i.e. quantitative, interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
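
    The statistical fatigue curves described above amount to wrapping a lognormal scatter term around the mean strain-cycle curve. A minimal sketch with illustrative constants (not the paper's fitted values):

```python
import numpy as np
from scipy import stats

C, b = 0.3, 0.5            # mean curve: N50 = (C / strain_range) ** (1 / b)
sigma_lnN = 0.8            # lognormal scatter of cycles-to-failure

def cum_failures(strain_range, n_cycles):
    """Fraction of interconnects failed after n_cycles at a given strain range."""
    n50 = (C / strain_range) ** (1.0 / b)
    z = (np.log(n_cycles) - np.log(n50)) / sigma_lnN
    return stats.norm.cdf(z)

# e.g. a 20-year design life at one thermal cycle per day (~7300 cycles):
for strain in [0.002, 0.005, 0.010]:
    print(f"strain {strain:.3f}: cumulative failure fraction = "
          f"{cum_failures(strain, 7300):.4f}")
```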

  4. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)

    Science.gov (United States)

    Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. This paper highlights BETA's final design specifications, including electromagnetic, thermal, and stress analyses. We discuss the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results demonstrate the stable operation of BETA at 1 T. These results are compared to both analytical design calculations and finite element calculations. The experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are also explored in this paper.

  5. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    Science.gov (United States)

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution-processed electrochromic (EC) films is outlined as a combination of experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiments (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as for the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) approach and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated in order to test and trial this optically active material system together with a solid-state electrolyte for the prospective application in EC displays. Coupling DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.

  6. Towards evidence-based computational statistics: lessons from clinical research on the role and design of real-data benchmark studies.

    Science.gov (United States)

    Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander

    2017-09-09

    The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies, a method of assessment of statistical methods using real-world datasets, might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.

  7. Experimental Device for Learning of Logical Circuit Design using Integrated Circuits

    OpenAIRE

    石橋, 孝昭

    2012-01-01

    This paper presents an experimental device for learning logical circuit design using integrated circuits and breadboards. The experimental device can be made at low cost and can be used for many subjects such as logical circuits, computer engineering, basic electricity, electrical circuits and electronic circuits. The proposed device is more effective for learning logical circuits than the usual lecture format.

  8. Design study of blanket structure for tokamak experimental fusion reactor

    International Nuclear Information System (INIS)

    1979-11-01

    A design study of the blanket structure for the JAERI Experimental Fusion Reactor (JXFR) has been carried out. Studied here were the fabrication and testing of the blanket structure (blanket cells, blanket rings, piping and blanket modules), assembly and disassembly of the blanket module, and monitoring and testing techniques. Problems in the design and fabrication of the blanket structure were identified. Research and development topics for the future were also identified. (author)

  9. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
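
    One of the simplest results reviewed in the report can be shown in a few lines: for statistics that are monotone in every observation, such as the mean and the median, the interval-data version is obtained exactly from the endpoint values. A minimal sketch with invented intervals (harder statistics, such as the variance, generally need the specialized algorithms the report describes):

```python
import numpy as np

intervals = np.array([[1.0, 1.4], [0.8, 1.1], [1.2, 1.9], [0.9, 1.2]])
lower, upper = intervals[:, 0], intervals[:, 1]

# The mean is monotone in every argument, so its range over the data box is
# attained at the endpoints; the same argument applies to the median.
print(f"interval sample mean:   [{lower.mean():.3f}, {upper.mean():.3f}]")
print(f"interval sample median: [{np.median(lower):.3f}, {np.median(upper):.3f}]")
```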

  10. Inactivation of Staphylococcus aureus in raw salmon with supercritical CO2 using experimental design

    Directory of Open Access Journals (Sweden)

    Mônica CUPPINI

    2016-01-01

    Full Text Available Abstract Considering the microbial safety of the consumption of raw foods (Asian food), this study aimed to explore the inactivation of S. aureus in raw salmon by supercritical CO2 treatment (SC-CO2). For this purpose, experimental design methodology was employed as a tool to evaluate the effects of pressure (120-220 bar), depressurization rate (10 to 100 bar·min⁻¹) and salmon:CO2 mass ratio (1:0.2 to 1:1.0). It was observed that the pressure and the depressurization rate were statistically significant, i.e. the higher the system pressure and depressurization rate, the greater the microbial inactivation. The salmon:CO2 mass ratio did not influence S. aureus inactivation in raw salmon. There was a total reduction in S. aureus at 225 bar, a depressurization rate of 100 bar·min⁻¹ and a salmon:CO2 mass ratio of 1:0.6, for 2 hours at 33 °C.

  11. Measurements of experimental precision for trials with cowpea (Vigna unguiculata L. Walp.) genotypes.

    Science.gov (United States)

    Teodoro, P E; Torres, F E; Santos, A D; Corrêa, A M; Nascimento, M; Barroso, L M A; Ceccon, G

    2016-05-09

    The aim of this study was to evaluate the suitability of several statistics as measures of the degree of experimental precision in trials with cowpea (Vigna unguiculata L. Walp.) genotypes. Cowpea genotype yields were evaluated in 29 trials conducted in Brazil between 2005 and 2012. The genotypes were evaluated in a randomized block design with four replications. Ten statistics estimated for each trial were compared using descriptive statistics, Pearson correlations, and path analysis. According to the class limits established, selective accuracy, the F-test value for genotype, heritability, and the coefficient of determination adequately estimated the degree of experimental precision. By these statistics, 86.21% of the trials had adequate experimental precision. Selective accuracy, the F-test value for genotype, heritability, and the coefficient of determination were directly related to each other, and were more suitable than the coefficient of variation and the least significant difference (by the Tukey test) for evaluating experimental precision in trials with cowpea genotypes.
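
    The precision statistics named above are directly related: with F = MSg/MSe from the trial ANOVA, entry-mean heritability is 1 - 1/F and selective accuracy is its square root, as commonly defined in the cultivar-trial precision literature. A minimal sketch with placeholder mean squares:

```python
import numpy as np

ms_genotype, ms_error = 1.80, 0.60     # ANOVA mean squares (placeholders)
f_calc = ms_genotype / ms_error        # F-test value for genotype

heritability = 1 - 1 / f_calc          # entry-mean basis: (MSg - MSe) / MSg
selective_accuracy = np.sqrt(max(heritability, 0.0))

print(f"F = {f_calc:.2f}")
print(f"entry-mean heritability = {heritability:.2f}")
print(f"selective accuracy = {selective_accuracy:.2f}")
```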

  12. Matching of experimental and statistical-model thermonuclear reaction rates at high temperatures

    International Nuclear Information System (INIS)

    Newton, J. R.; Longland, R.; Iliadis, C.

    2008-01-01

    We address the problem of extrapolating experimental thermonuclear reaction rates toward high stellar temperatures (T>1 GK) by using statistical model (Hauser-Feshbach) results. Reliable reaction rates at such temperatures are required for studies of advanced stellar burning stages, supernovae, and x-ray bursts. Generally accepted methods are based on the concept of a Gamow peak. We follow recent ideas that emphasized the fundamental shortcomings of the Gamow peak concept for narrow resonances at high stellar temperatures. Our new method defines the effective thermonuclear energy range (ETER) by using the 8th, 50th, and 92nd percentiles of the cumulative distribution of fractional resonant reaction rate contributions. This definition is unambiguous and has a straightforward probability interpretation. The ETER is used to define a temperature at which Hauser-Feshbach rates can be matched to experimental rates. This matching temperature is usually much higher compared to previous estimates that employed the Gamow peak concept. We suggest that an increased matching temperature provides more reliable extrapolated reaction rates since Hauser-Feshbach results are more trustworthy the higher the temperature. Our ideas are applied to 21 (p,γ), (p,α), and (α,γ) reactions on A=20-40 target nuclei. For many of the cases studied here, our extrapolated reaction rates at high temperatures differ significantly from those obtained using the Gamow peak concept

  13. Human in vitro 3D co-culture model to engineer vascularized bone-mimicking tissues combining computational tools and statistical experimental approach.

    Science.gov (United States)

    Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo

    2016-01-01

    The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model bridging the gap between microfluidic and macro-scale studies, and to high-throughput screen the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100% fibrin, 60% fibrin + 40% collagen), and tissue geometry (2 × 2 × 2 and 2 × 2 × 5 mm³). We optimized the geometry and oxygen gradient inside hydrogels through computational simulations and we analyzed microvascular network features including total network length/area and vascular branch number/length. Particularly, we employed the "Design of Experiment" statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed us to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Statistical analysis of experimental multifragmentation events in 64Zn+112Sn at 40 MeV/nucleon

    Science.gov (United States)

    Lin, W.; Zheng, H.; Ren, P.; Liu, X.; Huang, M.; Wada, R.; Chen, Z.; Wang, J.; Xiao, G. Q.; Qu, G.

    2018-04-01

    A statistical multifragmentation model (SMM) is applied to the experimentally observed multifragmentation events in an intermediate-energy heavy-ion reaction. Using the temperature and symmetry energy extracted with the isobaric yield ratio (IYR) method based on the modified Fisher model (MFM), SMM is applied to the reaction 64Zn+112Sn at 40 MeV/nucleon. The experimental isotope and mass distributions of the primary reconstructed fragments are compared without an afterburner, and they are well reproduced. The temperature T and symmetry energy coefficient asym extracted from the SMM-simulated events, using the IYR method, are also consistent with those from the experiment. These results strongly suggest that in the multifragmentation process there is a freeze-out volume, in which thermal and chemical equilibrium is established before or at the time of intermediate-mass fragment emission.

  15. Design of Passive Acoustic Wave Shaping Devices and Their Experimental Validation

    DEFF Research Database (Denmark)

    Christiansen, Rasmus Ellebæk; Sigmund, Ole; Fernandez Grande, Efren

    We discuss a topology optimization based approach for designing passive acoustic wave shaping devices and demonstrate its application to directional sound emission [1], sound focusing and wave splitting. Optimized devices, numerical and experimental results are presented and benchmarked against other designs proposed in the literature. We focus on design problems where the size of the device is on the order of the wavelength, a problematic region for traditional design methods such as ray tracing. The acoustic optimization problem is formulated in the frequency domain and modeled...

  16. Experimental concept and design of DarkLight, a search for a heavy photon

    International Nuclear Information System (INIS)

    Cowan, Ray F.

    2013-01-01

    This talk gives an overview of the DarkLight experimental concept: a search for a heavy photon A′ in the 10-90 MeV/c² mass range. After briefly describing the theoretical motivation, the talk focuses on the experimental concept and design. Topics include operation using a half-megawatt, 100 MeV electron beam at the Jefferson Lab FEL, detector design and performance, and expected backgrounds estimated from beam tests and Monte Carlo simulations.

  17. Training reactor deployment. Advanced experimental course on designing new reactor cores

    International Nuclear Information System (INIS)

    Skoda, Radek

    2009-01-01

    Czech Technical University in Prague (CTU), operating its training nuclear reactor VR1, in cooperation with the North West University of South Africa (NWU), is applying for accreditation of the experimental training course ''Advanced experimental course on designing the new reactor core'', which will guide the students, young nuclear engineering professionals, through the design, calculation, approval and assembly of a new nuclear reactor core. The students, young professionals from the South African nuclear industry, face the situation in which a new nuclear reactor core is to be built from scratch. Several reactor core design options are pre-calculated. The selected design is re-calculated by the students, the result is then scrutinized by the regulator and, once all the analysis is approved, physical dismantling of the current core and assembly of the new core is done by the students under the close supervision of the CTU staff. Finally the reactor is made critical with the new core. The presentation focuses on practical issues of such a course, desired reactor features and, namely, pedagogical and safety aspects. (orig.)

  18. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative. PMID:19588106

  19. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  20. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-01-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in 208 Pb using the experimental levels of 207 Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that the decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  1. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-02-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in 208 Pb using the experimental levels of 207 Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  2. Ti film deposition process of a plasma focus: Study by an experimental design

    Directory of Open Access Journals (Sweden)

    M. J. Inestrosa-Izurieta

    2017-10-01

    Full Text Available The plasma generated by plasma focus (PF) devices has substantially different physical characteristics (energetic ions and electrons) compared with conventional plasma devices used for plasma nanofabrication, offering new and unique opportunities in the processing and synthesis of nanomaterials. This article presents the use of a plasma focus of tens of joules, PF-50J, for the deposition of materials sprayed from the anode by the plasma dynamics in the axial direction. This work focuses on determining the most significant effects of the technological parameters of the system on the obtained depositions through the use of a statistical experimental design. The results allow us to give a qualitative understanding of the Ti film deposition process in our PF device in terms of four different events provoked by the plasma dynamics: (i) an electric erosion of the outer material of the anode; (ii) substrate ablation generating an interlayer; (iii) electron beam deposition of material from the center of the anode; (iv) heat load provoking clustering or even melting of the deposition surface.

  3. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a new pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  4. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin

    2014-01-01

    The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive

  5. An experimental study of noise in mid-infrared quantum cascade lasers of different designs

    Science.gov (United States)

    Schilt, Stéphane; Tombez, Lionel; Tardy, Camille; Bismuto, Alfredo; Blaser, Stéphane; Maulini, Richard; Terazzi, Romain; Rochat, Michel; Südmeyer, Thomas

    2015-04-01

    We present an experimental study of noise in mid-infrared quantum cascade lasers (QCLs) of different designs. By quantifying the high degree of correlation occurring between fluctuations of the optical frequency and the voltage between the QCL terminals, we show that electrical noise is a powerful and simple means to study noise in QCLs. Based on this outcome, we investigated the electrical noise in a large set of 22 QCLs emitting in the range of 7.6-8 μm and consisting of both ridge-waveguide and buried-heterostructure (BH) lasers with different geometrical designs and operation parameters. From statistical data processing based on an analysis of variance, we assessed that ridge-waveguide lasers have lower noise than BH lasers. Our physical interpretation is that additional current leakages or spare injection channels occur at the interface between the active region and the lateral insulator in the BH geometry, which induces some extra noise. In addition, Schottky-type contacts occurring at the interface between the n-doped regions and the lateral insulator, i.e., iron-doped InP, are also believed to be a potential source of additional noise in some BH lasers, as observed from the slight reduction in the integrated voltage noise observed at the laser threshold in several BH-QCLs.
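
    As an illustration of the kind of analysis of variance used here to compare laser geometries, the sketch below runs a one-way ANOVA on two invented sets of noise readings (all values are placeholders, not data from the study):

```python
# Minimal sketch (not the study's code): one-way ANOVA comparing
# integrated voltage-noise readings (arbitrary units) for the two
# laser geometries. All values are invented.
from scipy import stats

ridge = [1.2, 1.1, 1.3, 1.0, 1.2, 1.1]    # ridge-waveguide lasers
buried = [1.6, 1.8, 1.5, 1.7, 1.9, 1.6]   # buried-heterostructure lasers

f_stat, p_value = stats.f_oneway(ridge, buried)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value is consistent with a systematic noise difference
# between the two designs.
```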

  6. Design and Implementation of an Experimental Cataloging Advisor--Mapper.

    Science.gov (United States)

    Ercegovac, Zorana; Borko, Harold

    1992-01-01

    Describes the design of an experimental computer-aided cataloging advisor, Mapper, that was developed to help novice users with the descriptive cataloging of single-sheet maps from U.S. publishers. The human-computer interface is considered, the use of HyperCard is discussed, the knowledge base is explained, and assistance screens are described.…

  7. Towards evidence-based computational statistics: lessons from clinical research on the role and design of real-data benchmark studies

    Directory of Open Access Journals (Sweden)

    Anne-Laure Boulesteix

    2017-09-01

    Full Text Available Abstract Background The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly “evidence-based”. Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. Main message In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of “evidence-based” statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. Conclusion We suggest that benchmark studies—a method of assessment of statistical methods using real-world datasets—might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.

  8. Experimental study of elementary collection efficiency of aerosols by spray: Design of the experimental device

    Energy Technology Data Exchange (ETDEWEB)

    Ducret, D.; Vendel, J.; Garrec, S.L.

    1995-02-01

    The safety of a nuclear power plant containment building, in which pressure and temperature could increase because of an overheating-reactor accident, can be ensured by spraying water drops. The spray reduces the pressure and temperature levels by condensation of steam on the cold water drops. The most stringent thermodynamic conditions are a pressure of 5×10^5 Pa (due to steam emission) and a temperature of 413 K. Besides its energy-dissipation function, the spray leads to the washout of fission-product particles emitted into the reactor building atmosphere. The present study is part of a large program devoted to the evaluation of realistic washout rates. The aim of this work is to develop experiments to determine the collection efficiency of aerosols by a single drop. To do this, the experimental device has to be designed against fundamental criteria: thermodynamic conditions have to be representative of the post-accident atmosphere; thermodynamic equilibrium has to be attained between the water drops and the gaseous phase; thermophoretic, diffusiophoretic and mechanical effects have to be studied independently; and operating conditions have to be homogeneous and constant during each experiment. This paper presents the design of the experimental device. In practice, the consequences of each of these criteria on the design and the necessity of being representative of the real conditions will be described.

  9. Investigation on gas medium parameters for an ArF excimer laser through orthogonal experimental design

    Science.gov (United States)

    Song, Xingliang; Sha, Pengfei; Fan, Yuanyuan; Jiang, R.; Zhao, Jiangshan; Zhou, Yi; Yang, Junhong; Xiong, Guangliang; Wang, Yu

    2018-02-01

    Due to the complex kinetics of formation and loss mechanisms, such as the ion-ion recombination reaction, the neutral-species harpoon reaction, excited-state quenching and photon absorption, as well as their interactions, the performance of an excimer laser varies greatly with the gas medium parameters. Therefore, the effects of gas composition and total gas pressure on excimer laser performance attract continual research studies. In this work, orthogonal experimental design (OED) is used to investigate quantitative and qualitative correlations between output laser energy characteristics and gas medium parameters for an ArF excimer laser operated with a plano-plano optical resonator. Optimized output laser energy with good pulse-to-pulse stability can be obtained effectively by proper selection of the gas medium parameters, which makes the most of the ArF excimer laser device. A simple and efficient method for gas medium optimization is proposed and demonstrated experimentally, which provides a global and systematic solution. By detailed statistical analysis, the significance sequence of the relevant parameter factors and the optimized composition of the gas medium parameters are obtained. Compared with the conventional route of varying a single gas parameter factor sequentially, this paper presents a more comprehensive way of considering multiple variables simultaneously, which seems promising in striking an appropriate balance among various complicated parameters for power scaling studies of an excimer laser.
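
    For readers unfamiliar with orthogonal experimental design, the following minimal sketch builds a standard L9(3^4) orthogonal array and performs a Taguchi-style range analysis; the response values are invented placeholders, not measurements from this laser study:

```python
# Minimal sketch of Taguchi-style range analysis on an L9(3^4)
# orthogonal array (9 runs, 4 factors, 3 levels each). The response
# values are illustrative placeholders.
import numpy as np

L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
               [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
               [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
y = np.array([52., 60., 57., 63., 66., 58., 55., 62., 64.])  # e.g. pulse energy

for factor in range(L9.shape[1]):
    level_means = [y[L9[:, factor] == lv].mean() for lv in range(3)]
    spread = max(level_means) - min(level_means)
    print(f"factor {factor}: level means {np.round(level_means, 1)}, range {spread:.1f}")
# The factor with the largest range is the most influential; the best
# level for each factor is the one with the most favourable mean.
```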

  10. An experimental design approach for optimization of spectrophotometric method for estimation of cefixime trihydrate using ninhydrin as derivatizing reagent in bulk and pharmaceutical formulation

    Directory of Open Access Journals (Sweden)

    Yogita B. Wani

    2017-01-01

    Full Text Available The aim of the present work is to use experimental design to screen and optimize experimental variables for developing a spectrophotometric method for determining cefixime trihydrate content using ninhydrin as a derivatizing reagent. The method is based on the reaction of the amino group of cefixime with ninhydrin in an alkaline medium to form a yellow-colored derivative (λmax 436 nm). A two-level full factorial design was utilized to screen the effect of ninhydrin reagent concentration (X1), volume of ninhydrin reagent (X2), heating temperature (X3) and heating time (X4) on the formation of the cefixime–ninhydrin complex Y (absorbance). One-way ANOVA and Pareto ranking analyses showed that the ninhydrin reagent concentration (X1), volume of ninhydrin reagent (X2) and heating temperature (X3) were statistically significant factors (P < 0.05) affecting the formation of the cefixime–ninhydrin complex Y (absorbance). A Box-Behnken experimental design with response surface methodology was then utilized to evaluate the main, interaction and quadratic effects of these three factors on the selected response. With the help of a response surface plot and contour plot the optimum values of the selected factors were determined and used for further experiments. These values were a ninhydrin reagent concentration (X1) of 0.2% w/v, a volume of ninhydrin reagent (X2) of 1 mL and a heating temperature (X3) of 80 °C. The proposed method was validated according to the ICH Q2 (R1) method validation guidelines. The results of the present study have clearly shown that an experimental design concept may be effectively applied to the optimization of a spectrophotometric method for estimating the cefixime trihydrate content in bulk and pharmaceutical formulation with the least number of experimental runs possible.
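
    A minimal sketch of the two-level full factorial screening step described above, with invented factor effects (the factor names follow the abstract; the response model is a placeholder):

```python
# Minimal sketch: main-effect screening from a two-level full factorial
# design. Responses are simulated, not the study's data.
import itertools
import numpy as np

factors = ["conc", "volume", "temperature", "time"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 16 coded runs
rng = np.random.default_rng(0)
absorbance = (0.5 + 0.10 * design[:, 0] + 0.08 * design[:, 1]
              + 0.06 * design[:, 2] + 0.005 * design[:, 3]
              + rng.normal(0, 0.01, 16))

for name, column in zip(factors, design.T):
    effect = absorbance[column == 1].mean() - absorbance[column == -1].mean()
    print(f"{name:12s} main effect = {effect:+.3f}")
# Factors with effects near zero (here, heating time) can be dropped
# before the follow-up Box-Behnken optimization.
```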

  11. Spent Fuel Transportation Package Performance Study - Experimental Design Challenges

    International Nuclear Information System (INIS)

    Snyder, A. M.; Murphy, A. J.; Sprung, J. L.; Ammerman, D. J.; Lopez, C.

    2003-01-01

    Numerous studies of spent nuclear fuel transportation accident risks have been performed since the late seventies that considered shipping container design and performance. Based in part on these studies, NRC has concluded that the level of protection provided by spent nuclear fuel transportation package designs under accident conditions is adequate. [1] Furthermore, actual spent nuclear fuel transport experience showcases a safety record that is exceptional and unparalleled when compared to other hazardous materials transportation shipments. There has never been a known or suspected release of the radioactive contents from an NRC-certified spent nuclear fuel cask as a result of a transportation accident. In 1999 the United States Nuclear Regulatory Commission (NRC) initiated a study, the Package Performance Study, to demonstrate the performance of spent fuel and spent fuel packages during severe transportation accidents. NRC is not studying or testing its current regulations, as the rigorous regulatory accident conditions specified in 10 CFR Part 71 are adequate to ensure safe packaging and use. As part of this study, NRC currently plans on using detailed modeling followed by experimental testing to increase public confidence in the safety of spent nuclear fuel shipments. One of the aspects of this confirmatory research study is the commitment to solicit and consider public comment during the scoping phase and experimental design planning phase of this research

  12. Publishing Single-Case Research Design Studies That Do Not Demonstrate Experimental Control

    Science.gov (United States)

    Tincani, Matt; Travers, Jason

    2018-01-01

    Demonstration of experimental control is considered a hallmark of high-quality single-case research design (SCRD). Studies that fail to demonstrate experimental control may not be published because researchers are unwilling to submit these papers for publication and journals are unlikely to publish negative results (i.e., the file drawer effect).…

  13. Design standards for experimental and field studies to evaluate diagnostic accuracy of tests for infectious diseases in aquatic animals.

    Science.gov (United States)

    Laurin, E; Thakur, K K; Gardner, I A; Hick, P; Moody, N J G; Crane, M S J; Ernst, I

    2018-05-01

    Design and reporting quality of diagnostic accuracy studies (DAS) are important metrics for assessing utility of tests used in animal and human health. Following standards for designing DAS will assist in appropriate test selection for specific testing purposes and minimize the risk of reporting biased sensitivity and specificity estimates. To examine the benefits of recommending standards, design information from published DAS literature was assessed for 10 finfish, seven mollusc, nine crustacean and two amphibian diseases listed in the 2017 OIE Manual of Diagnostic Tests for Aquatic Animals. Of the 56 DAS identified, 41 were based on field testing, eight on experimental challenge studies and seven on both. Also, we adapted human and terrestrial-animal standards and guidelines for DAS structure for use in aquatic animal diagnostic research. Through this process, we identified and addressed important metrics for consideration at the design phase: study purpose, targeted disease state, selection of appropriate samples and specimens, laboratory analytical methods, statistical methods and data interpretation. These recommended design standards for DAS are presented as a checklist including risk-of-failure points and actions to mitigate bias at each critical step. Adherence to standards when designing DAS will also facilitate future systematic review and meta-analyses of DAS research literature. © 2018 John Wiley & Sons Ltd.

  14. Experimental use of iteratively designed rotation invariant correlation filters

    International Nuclear Information System (INIS)

    Sweeney, D.W.; Ochoa, E.; Schils, G.F.

    1987-01-01

    Iteratively designed filters are incorporated into an optical correlator for position, rotation, and intensity invariant recognition of target images. The filters exhibit excellent discrimination because they are designed to contain full information about the target image. Numerical simulations and experiments demonstrate detection of targets that are corrupted with random noise (SNR≅0.5) and also partially obscured by other objects. The complex valued filters are encoded in a computer generated hologram and fabricated directly using an electron-beam system. Experimental results using a liquid crystal spatial light modulator for real-time input show excellent agreement with analytical and numerical computations

  15. Reference design (MK-I and MK-II) for experimental multi-purpose VHTR

    International Nuclear Information System (INIS)

    Miyamoto, Yoshiaki; Suzuki, Kunihiko; Sato, Sadao

    1975-10-01

    This report summarizes the results of a study on the thermal and mechanical performance of the core, obtained in the course of the reference design (Mk-I and Mk-II) for the experimental multi-purpose VHTR: (1) Design criteria, design methods and design data. These bases are also discussed so that they can be referred to when the next design work proceeds. (2) The results of performance analysis, such as for the initial core, and predictions for the irradiated core. (auth.)

  16. Variability aware compact model characterization for statistical circuit design optimization

    Science.gov (United States)

    Qiao, Ying; Qian, Kun; Spanos, Costas J.

    2012-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose an efficient variability-aware compact model characterization methodology based on the linear propagation of variance. Hierarchical spatial variability patterns of selected compact model parameters are directly calculated from transistor array test structures. This methodology has been implemented and tested using transistor I-V measurements and the EKV-EPFL compact model. Calculation results compare well to full-wafer direct model parameter extractions. Further studies are done on the proper selection of both compact model parameters and electrical measurement metrics used in the method.
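
    The linear propagation of variance mentioned above can be sketched in a few lines; the toy model and covariance below are assumptions for illustration, not the EKV compact model used in the paper:

```python
# Minimal sketch of first-order (linear) propagation of variance:
# Var(y) ≈ g·Σ·g, where g is the gradient of the model output with
# respect to the parameters at the nominal point. The quadratic
# drain-current model and covariance here are illustrative assumptions.
import numpy as np

def model(p):
    k, vth = p
    vgs = 1.0                        # fixed bias point
    return k * (vgs - vth) ** 2      # toy drain current

p0 = np.array([2e-4, 0.4])           # nominal parameter values (k, Vth)
Sigma = np.diag([1e-11, 4e-4])       # assumed parameter covariance

# central-difference gradient of the model at the nominal point
steps = np.diag(1e-6 * np.abs(p0))
grad = np.array([(model(p0 + s) - model(p0 - s)) / (2 * s.sum()) for s in steps])

var_y = grad @ Sigma @ grad          # first-order output variance
print(f"sigma(I) ~ {np.sqrt(var_y):.3e} A")
```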

  17. Experimental study of liquid-metal target designs of accelerating-controlled systems

    International Nuclear Information System (INIS)

    Iarmonov, Mikhail; Makhov, Kirill; Novozhilova, Olga; Meluzov, A.G.; Beznosov, A.V.

    2011-01-01

    Models of a liquid-metal target of an accelerator-controlled system have been experimentally studied at the Nizhny Novgorod State Technical University to develop an optimal design of the flow part of the target. The main explored variants of liquid-metal targets are: a design with a diaphragm (firm-and-impervious plug) mounted on the pipe tap of particle transport from the accelerator cavity to the working cavity of the liquid-metal target, and a design without a diaphragm on the pipe tap of particle transport from the accelerator. The study was carried out in a high-temperature liquid-metal test bench under conditions close to full-scale ones: the temperature of the eutectic lead-bismuth alloy was 260 °C to 400 °C, the coolant mass flow was 5-80 t/h, the rarefaction in the gas cavity was 10^5 Pa, and the coefficient of geometric similarity was equal to 1. The experimental studies of the hydrodynamic characteristics of the flow parts in the target designs under full-scale conditions indicated high efficiency of the target in triggering, operating, and deactivating modes. Research and technology instructions for the design of the flow part of the liquid-metal target, the target design as a whole, and the target circuit of accelerator-controlled systems were formulated as a result of the studies. (author)

  18. Application of Iterative Robust Model-based Optimal Experimental Design for the Calibration of Biocatalytic Models

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Gernaey, Krist V.; Ringborg, Rolf Hoffmeyer

    2017-01-01

    The aim of model calibration is to estimate unique parameter values from available experimental data, here applied to a biocatalytic process. The traditional approach of first gathering data followed by performing a model calibration is inefficient, since the information gathered during experimentation is not actively used to optimise the experimental design. By applying an iterative robust model-based optimal experimental design, the limited amount of data collected is used to design additional informative experiments. The algorithm is used here to calibrate the initial reaction rate of an ω-transaminase catalysed reaction in a more accurate way. The parameter confidence region estimated from the Fisher Information Matrix is compared with the likelihood confidence region, which is a more accurate, but also a computationally more expensive method. As a result, an important deviation between both approaches

  19. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
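
    The article computes d and its meta-analysis using SPSS macros and the free program R; as a language-neutral illustration of the random-effects pooling step, here is a minimal DerSimonian-Laird sketch in Python with invented effect sizes:

```python
# Minimal sketch: random-effects (DerSimonian-Laird) pooling of d-type
# effect sizes. The effect sizes and variances are illustrative, not
# values from the article.
import numpy as np

d = np.array([0.42, 0.61, 0.30, 0.75, 0.51])   # per-study effect sizes
v = np.array([0.04, 0.06, 0.03, 0.08, 0.05])   # their sampling variances

w = 1 / v                                       # fixed-effect weights
d_fixed = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - d_fixed) ** 2)              # heterogeneity statistic
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(d) - 1)) / c)         # between-study variance

w_re = 1 / (v + tau2)                           # random-effects weights
d_re = np.sum(w_re * d) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled d = {d_re:.2f} (95% CI {d_re - 1.96*se:.2f} to {d_re + 1.96*se:.2f})")
```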

  20. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    Science.gov (United States)

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with focus on reproducibility. A statistical design of 32 experiments was performed and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams'E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 degrees C was better than 22 degrees C. After thawing, 1h was better than 3h of preculture. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient to quickly determine optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  1. Statistical Shape Analysis of the Human Ear Canal with Application to In-the-Ear Hearing Aid Design

    DEFF Research Database (Denmark)

    Paulsen, Rasmus Reinhold

    2004-01-01

    This thesis is about the statistical shape analysis of the human ear canal with application to the mechanical design of in-the-ear hearing aids. Initially, it is described how a statistical shape model of the human ear canal is built based on a training set of laser-scanned ear impressions. A thin...

  2. Experimental design of natural and accelerated bone and wood ageing

    DEFF Research Database (Denmark)

    Facorellis, Y.; Pournou, A.; Richter, Jane

    2015-01-01

    This paper presents the experimental design for natural and accelerated ageing of bone and wood samples found in museum conditions that was conceived as part of the INVENVORG (Thales Research Funding Program – NRSF) investigating the effects of the environmental factors on natural organic materials....

  3. Optimization of fast disintegration tablets using pullulan as diluent by central composite experimental design

    OpenAIRE

    Patel, Dipil; Chauhan, Musharraf; Patel, Ravi; Patel, Jayvadan

    2012-01-01

    The objective of this work was to apply central composite experimental design to investigate the main and interaction effects of formulation parameters in optimizing a novel fast disintegration tablet formulation using pullulan as diluent. A face-centered central composite experimental design was employed to optimize the fast disintegration tablet formulation. The variables studied were the concentration of diluent (pullulan, X1), superdisintegrant (sodium starch glycolate, X2), and direct compression aid ...

  4. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    Science.gov (United States)

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  5. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    Directory of Open Access Journals (Sweden)

    Oberg Ann L

    2012-11-01

    Full Text Available Abstract Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.
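
    As a small illustration of the normalization check described above, the sketch below median-centres log2 reporter-ion intensities across four hypothetical iTRAQ channels (synthetic data, not from the case studies):

```python
# Minimal sketch: median normalization of log2 reporter-ion intensities
# across iTRAQ 4-plex channels, with a before/after check. The matrix is
# a random placeholder (rows = peptides, columns = channels).
import numpy as np

rng = np.random.default_rng(1)
log2_intensity = rng.normal(14.0, 1.0, size=(500, 4))
log2_intensity[:, 2] += 0.8     # simulate a channel-loading bias

def medians(x):
    return np.round(np.median(x, axis=0), 2)

print("medians before:", medians(log2_intensity))
normalized = (log2_intensity - np.median(log2_intensity, axis=0)
              + np.median(log2_intensity))
print("medians after: ", medians(normalized))
# After normalization all channel medians coincide, which is one simple
# check that the correction worked.
```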

  6. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    Science.gov (United States)

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.

  7. Statistical properties of SASE FEL radiation: experimental results from the VUV FEL at the TESLA test facility at DESY

    International Nuclear Information System (INIS)

    Yurkov, M.V.

    2002-01-01

    This paper presents an experimental study of the statistical properties of the radiation from a SASE FEL. The experiments were performed at the TESLA Test Facility VUV SASE FEL at DESY operating in a high-gain linear regime with a gain of about 10^6. It is shown that the fluctuations of the output radiation energy follow a gamma distribution. We also measured for the first time the probability distribution of the SASE radiation energy after a narrow-band monochromator. The experimental results are in good agreement with theoretical predictions; the energy fluctuations after the monochromator follow a negative exponential distribution
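
    The gamma-distributed energy fluctuations can be checked numerically; in this sketch, synthetic pulse energies are drawn and the number of modes M is recovered from M = <E>^2 / var(E) (all numbers are illustrative, not DESY data):

```python
# Minimal sketch: checking that simulated SASE pulse energies follow a
# gamma distribution with shape parameter M = <E>^2 / var(E) (the
# effective number of modes). The sample is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
M_true, mean_E = 6.0, 1.0
energies = rng.gamma(shape=M_true, scale=mean_E / M_true, size=10_000)

M_est = energies.mean() ** 2 / energies.var()        # moment estimate of M
shape, loc, scale = stats.gamma.fit(energies, floc=0)
print(f"M from moments: {M_est:.2f}, shape from ML fit: {shape:.2f}")
# sigma^2/<E>^2 = 1/M, so larger fluctuations mean fewer modes; after a
# narrow-band monochromator M -> 1 and the distribution becomes a
# negative exponential.
```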

  8. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    Science.gov (United States)

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

    The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows possible errors in the dataset to be detected and the validity of assumptions required for more complex analyses to be checked. Basic issues dealing with the statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
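
    As an illustration of the Poisson model recommended above for MN counts, a minimal sketch with fabricated covariates (age, smoking, exposure) follows:

```python
# Minimal sketch of a Poisson regression for micronucleus counts using
# statsmodels. All data are fabricated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
age = rng.uniform(20, 70, n)
smoker = rng.integers(0, 2, n)
exposed = rng.integers(0, 2, n)
true_rate = np.exp(0.5 + 0.01 * age + 0.2 * smoker + 0.4 * exposed)
mn_count = rng.poisson(true_rate)            # MN per 1000 binucleated cells

X = sm.add_constant(np.column_stack([age, smoker, exposed]))
fit = sm.GLM(mn_count, X, family=sm.families.Poisson()).fit()
# exp(coef) gives the MN frequency ratio per unit of each covariate,
# adjusted for the others (the usual confounders).
print(np.exp(fit.params))
```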

  9. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  10. Overview of International Thermonuclear Experimental Reactor (ITER) engineering design activities*

    Science.gov (United States)

    Shimomura, Y.

    1994-05-01

    The International Thermonuclear Experimental Reactor (ITER) [International Thermonuclear Experimental Reactor (ITER) (International Atomic Energy Agency, Vienna, 1988), ITER Documentation Series, No. 1] project is a multiphased project, presently proceeding under the auspices of the International Atomic Energy Agency according to the terms of a four-party agreement among the European Atomic Energy Community (EC), the Government of Japan (JA), the Government of the Russian Federation (RF), and the Government of the United States (US), "the Parties." The ITER project is based on the tokamak, a Russian invention that has since been brought to a high level of development in all major fusion programs in the world. The objective of ITER is to demonstrate the scientific and technological feasibility of fusion energy for peaceful purposes. The ITER design is being developed with support from the Parties' four Home Teams and is in progress by the Joint Central Team. An overview of ITER design activities is presented.

  11. STATISTICAL ANALYSIS OF THE SCFE OF A BRAZILIAN MINERAL COAL

    Directory of Open Access Journals (Sweden)

    DARIVA Cláudio

    1997-01-01

    Full Text Available The influence of some process variables on the productivity of the fractions (liquid yield times fraction percent) obtained from SCFE of a Brazilian mineral coal using isopropanol and ethanol as primary solvents is analyzed using statistical techniques. A full factorial 2^3 experimental design was adopted to investigate the effects of the process variables (temperature, pressure and cosolvent concentration) on the extraction products. The extracts were analyzed by the Preparative Liquid Chromatography-8 fractions method (PLC-8), a reliable, non-destructive solvent fractionation method, especially developed for coal-derived liquids. Empirical statistical modeling was carried out in order to reproduce the experimental data. Correlations obtained were always greater than 0.98. Four specific process criteria were used to allow process optimization. Results obtained show that it is not possible to maximize both extract productivity and purity (through the minimization of the heavy fraction content) simultaneously by manipulating the mentioned process variables.

  12. Experimental application of design principles in corrosion research

    International Nuclear Information System (INIS)

    Smyrl, W.H.; Pohlman, S.L.

    1977-01-01

    Experimental design criteria for corrosion investigations are based on established principles for systems that have uniform, or nearly uniform, corrosive attack. Scale-up or scale-down may be accomplished by proper use of dimensionless groups that measure the relative importance of interfacial kinetics, solution conductivity, and mass transfer. These principles have been applied to different fields of corrosion which include materials selection testing and protection; and to a specific corrosion problem involving attack of a substrate through holes in a protective overplate

  13. The effect on prospective teachers of the learning environment supported by dynamic statistics software

    Science.gov (United States)

    Koparan, Timur

    2016-02-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test/post-test control group design of the quasi-experimental research method, was carried out with a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four-hour classes about descriptive statistics. The classes with the control group were carried out through traditional methods while dynamic statistics software was used with the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the application for a deeper examination of their views about the application. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, and that the prospective teachers have an affirmative approach to the use of dynamic software and see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.
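
    A minimal sketch of the kind of pre-test/post-test comparison such a quasi-experimental design calls for, using invented gain scores for two groups of 40:

```python
# Minimal sketch: comparing achievement gains (post - pre) between an
# experimental group and a control group. All scores are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
gain_control = rng.normal(8.0, 4.0, 40)        # control group (n = 40)
gain_experimental = rng.normal(12.0, 4.0, 40)  # experimental group (n = 40)

t, p = stats.ttest_ind(gain_experimental, gain_control)
d = (gain_experimental.mean() - gain_control.mean()) / np.sqrt(
    (gain_experimental.var(ddof=1) + gain_control.var(ddof=1)) / 2)
print(f"t = {t:.2f}, p = {p:.4f}, Cohen's d = {d:.2f}")
```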

  14. Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.

    Science.gov (United States)

    Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten

    2018-01-01

    Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence interval indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0], randomization, 0.91 [0.85, 0.98], and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence
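
    The Cochran-Armitage trend test used above can be reproduced directly from the reported power-analysis counts (27/516, 59/485, 77/465 for 2005/2010/2015); the sketch below implements the usual trend statistic:

```python
# Minimal sketch of the Cochran-Armitage test for linear trend, applied
# to the reported power-analysis proportions across the three years.
import numpy as np
from scipy import stats

successes = np.array([27, 59, 77])      # studies reporting a power analysis
totals = np.array([516, 485, 465])      # studies published per year
scores = np.array([0, 1, 2])            # equally spaced year scores

p_bar = successes.sum() / totals.sum()
T = np.sum(scores * (successes - totals * p_bar))
var_T = p_bar * (1 - p_bar) * (
    np.sum(totals * scores**2) - np.sum(totals * scores)**2 / totals.sum())
z = T / np.sqrt(var_T)
print(f"z = {z:.2f}, two-sided p = {2 * stats.norm.sf(abs(z)):.2e}")
```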

  15. An Experimental Design of Bypass Magneto-Rheological (MR) damper

    Science.gov (United States)

    Rashid, MM; Aziz, Mohammad Abdul; Raisuddin Khan, Md.

    2017-11-01

    In a magnetorheological (MR) fluid bypass damper, the MR fluid flows through an external bypass channel, which allows the controllability of the MR fluid in the channel. The bypass MR damper (BMRD) contains a rectangular bypass flow channel, a current-controlled movable piston shaft arrangement, and MR fluid. The static piston coil case is wound with a coil and is used inside the piston head arrangement. The current-controlled coil case provides a magnetic flux through the BMRD cylinder for controllability. High-strength alloy steel is used for the piston shaft, which allows magnetic flux propagation throughout the BMRD cylinder. Using the above design materials, a bypass MR damper is designed and tested. An excitation current is applied during the experiment, which characterizes the BMRD controllability. It is shown that the BMRD with an external flow channel allows a highly controllable damping force under an excitation current. The experimental damping force-displacement characteristics with and without current excitation are compared in this research. The BMRD model is validated by the experimental results at various frequencies and applied excitation currents.

  16. Flow cytometry: design, development and experimental validation

    International Nuclear Information System (INIS)

    Seigneur, Alain

    1987-01-01

    The flow cytometry techniques allow the analysis and sorting of living biological cells at rates above five to ten thousand events per second. After a short review, we present in this report the design and development of a 'high-tech' apparatus intended for research laboratories, and the experimental results. The first part deals with the physical principles allowing morphologic and functional analysis of cells or cellular components. The measured parameters are as follows: electrical resistance pulse sizing, light scattering and fluorescence. Hydrodynamic centering is used, as is the division of a water stream into droplets leading to electrostatic sorting of particles. The second part deals with the apparatus designed by the 'Commissariat a l'Energie Atomique' (C.E.A.) and industrialised by 'ODAM' (ATC 3000). The last part of this thesis work is the performance evaluation of this cytometer. The difference between the two size measurement methods is analyzed: electrical resistance pulse sizing versus small-angle light scattering. With an original optics design, high sensitivity has been reached in the fluorescence measurement: the equivalent noise corresponds to six hundred fluorescein isothiocyanate (FITC) molecules. The sorting performance has also been analyzed and cell viability proven. (author) [fr

  17. Accessibility and replacement as prime constraints in the design of large experimental tokamaks

    International Nuclear Information System (INIS)

    Challender, R.S.; Reynolds, P.

    1976-01-01

    An attempt is made to bring together those design features of large, experimental Tokamaks which would lead to better accessibility during non-active operation and, in particular, permit replacement and repair after activation, thereby making possible an extended period of experimental operation into the ignition phase.

  18. Experimental design approach to the process parameter optimization for laser welding of martensitic stainless steels in a constrained overlap configuration

    Science.gov (United States)

    Khan, M. M. A.; Romoli, L.; Fiaschi, M.; Dini, G.; Sarri, F.

    2011-02-01

    This paper presents an experimental design approach to process parameter optimization for the laser welding of martensitic AISI 416 and AISI 440FSe stainless steels in a constrained overlap configuration in which outer shell was 0.55 mm thick. To determine the optimal laser-welding parameters, a set of mathematical models were developed relating welding parameters to each of the weld characteristics. These were validated both statistically and experimentally. The quality criteria set for the weld to determine optimal parameters were the minimization of weld width and the maximization of weld penetration depth, resistance length and shearing force. Laser power and welding speed in the range 855-930 W and 4.50-4.65 m/min, respectively, with a fiber diameter of 300 μm were identified as the optimal set of process parameters. However, the laser power and welding speed can be reduced to 800-840 W and increased to 4.75-5.37 m/min, respectively, to obtain stronger and better welds.
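
    A minimal sketch of the response-surface step behind such an optimization: fit a quadratic model to coded (power, speed) runs and maximize it over the design region (all data invented, not the paper's measurements):

```python
# Minimal sketch: fit a full quadratic response-surface model for one
# weld characteristic (e.g. shearing force) and maximize it over the
# coded factor space. Runs and responses are placeholders.
import numpy as np
from scipy.optimize import minimize

X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0],
              [0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]], float)
y = np.array([3.1, 2.8, 3.6, 3.4, 3.9, 3.8, 3.7, 3.0, 3.5, 3.6])

def features(p):                 # quadratic model in two coded factors
    a, b = p
    return np.array([1, a, b, a * b, a * a, b * b])

beta, *_ = np.linalg.lstsq(np.array([features(p) for p in X]), y, rcond=None)
res = minimize(lambda p: -features(p) @ beta, x0=[0, 0],
               bounds=[(-1, 1), (-1, 1)])
print("optimal coded settings:", np.round(res.x, 2),
      "predicted response:", round(float(-res.fun), 2))
```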

  19. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    Science.gov (United States)

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  20. Improved microbial conversion of de-oiled Jatropha waste into biohydrogen via inoculum pretreatment: process optimization by experimental design approach

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Kumar

    2015-03-01

    Full Text Available In this study various pretreatment methods of sewage sludge inoculum and the statistical process optimization of de-oiled jatropha waste have been reported. A peak hydrogen production rate (HPR) and hydrogen yield (HY) of 0.36 L H2/L-d and 20 mL H2/g volatile solid (VS) were obtained when heat shock pretreatment (95 °C, 30 min) was employed. Afterwards, an experimental design was applied to find the optimal conditions for H2 production using the heat-pretreated seed culture. The optimal substrate concentration, pH and temperature were determined using response surface methodology as 205 g/L, 6.53 and 55.1 °C, respectively. Under these circumstances, the highest HPR of 1.36 L H2/L-d was predicted. Verification tests proved the reliability of the statistical approach. As a result of the heat pretreatment and fermentation optimization, a significant (~4-fold) increase in HPR was achieved. PCR-DGGE results revealed that Clostridium sp. were predominantly present under the optimal conditions.

  1. Mechanical design of the small-scale experimental ADS: MYRRHA

    Energy Technology Data Exchange (ETDEWEB)

    Maes, Dirk [SCKCEN, Reactor Physics and MYRRHA Department, Boeretang 200, B-2400 Mol (Belgium)

    2006-10-15

    Since 1998, SCK·CEN, in partnership with IBA s.a. and many European research laboratories, has been designing a multipurpose Accelerator Driven System (ADS) - MYRRHA - and conducting an associated R and D support programme. MYRRHA aims to serve as a basis for the European experimental ADS to provide protons and neutrons for various R and D applications. Besides an overall configuration of the MYRRHA reactor internals, the description in this paper is limited to the mechanical design of the main components of the Primary System and Associated Equipment (vessel and cover, diaphragm, spallation loop, sub-critical core, primary cooling system, emergency cooling system, in-vessel fuel storage and fuel transfer machine), and the conceptual design of the robotics for In-Service Inspection and Repair (ISIR), together with the remote handling for operation and maintenance (O and M). (author)

  2. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
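
    Of the three modelling approaches compared above, the Gaussian-process model is sketched below on synthetic (task load, WM capacity) data; all names and numbers are placeholders:

```python
# Minimal sketch: Gaussian-process regression predicting task
# performance from task load and working-memory score, one of the three
# approaches the study compares. Data are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
X = rng.uniform([1, 40], [8, 90], size=(60, 2))     # (task load, WM score)
y = 100 - 4 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 3, 60)

gp = GaussianProcessRegressor(kernel=RBF([1.0, 10.0]) + WhiteKernel(),
                              normalize_y=True).fit(X, y)
mean, std = gp.predict([[6, 75]], return_std=True)
print(f"predicted performance: {mean[0]:.1f} ± {std[0]:.1f}")
# Unlike plain linear regression, the GP also reports predictive
# uncertainty, which grows away from the experimental conditions.
```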

  3. Learning physics: A comparative analysis between instructional design methods

    Science.gov (United States)

    Mathew, Easow

    The purpose of this research was to determine if there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), who participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20), who participated in the traditional lecture teaching methodology. Both courses were taught by experienced professors with qualifications at the doctoral level. The results indicated statistically significant differences between the traditional approach (i.e., lower physics posttest scores and lower differences between pre- and posttest scores) and the collaborative (PBL) approach (i.e., higher physics posttest scores, and higher differences between pre- and posttest scores) to physics curricula. Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there were statistically significant (p = .04) differences in female average academic improvement, which was much higher than male average academic improvement (~63%) in the control group, which may indicate that traditional teaching methods

  4. Design, construction and testing of a radon experimental chamber; Diseno, construccion y pruebas de una camara experimental de radon

    Energy Technology Data Exchange (ETDEWEB)

    Chavez B, A; Balcazar G, M

    1991-10-15

    To carry out studies on radon behavior under controlled and stable conditions, a system was designed and constructed that consists of two parts: a container of mineral rich in uranium and an experimentation chamber for radon, joined to each other by a step valve. The container holds approximately 800 g of uranium mineral with a grade of 0.28%; the radon gas emanated by the mineral is contained tightly by the container. When the valve opens, the radon gas diffuses to the radon experimental chamber; this has 3 accesses that allow different types of detectors to be installed. The versatility of the system is exemplified with two experiments: 1. With the radon experimental chamber and an associated spectroscopic system, radon and two of its decay products are identified. 2. The design of the system allows the mineral container to be coupled to other experimental geometries; to demonstrate this fact a new automatic exchanger system of passive radon detectors was coupled and tested. The results of the new automatic exchanger system when radon is left to flow freely between the container and the automatic exchanger through a plastic membrane of 15 m are shown. (Author)

  5. Statistically designed optimisation of enzyme catalysed starch removal from potato pulp

    DEFF Research Database (Denmark)

    Thomassen, Lise Vestergaard; Meyer, Anne S.

    2010-01-01

    to obtain dietary fibers is usually accomplished via a three-step, sequential enzymatic treatment procedure using a heat-stable alpha-amylase, protease, and amyloglucosidase. Statistically designed experiments were performed to investigate the influence of enzyme dose, amount of dry matter, incubation time and temperature on the amount of starch released from the potato pulp. The data demonstrated that all the starch could be released from potato pulp in one step when 8% (w/w) dry potato pulp was treated with 0.2% (v/w) (enzyme/substrate (E/S)) of a thermostable Bacillus licheniformis alpha-amylase (Termamyl® SC...

  6. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    Science.gov (United States)

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs.

  7. Selecting and Improving Quasi-Experimental Designs in Effectiveness and Implementation Research.

    Science.gov (United States)

    Handley, Margaret A; Lyles, Courtney R; McCulloch, Charles; Cattamanchi, Adithya

    2018-04-01

    Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast on internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (prepost designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.

  8. Experimental design and optimization of raloxifene hydrochloride loaded nanotransfersomes for transdermal application

    Directory of Open Access Journals (Sweden)

    Mahmood S

    2014-09-01

    Syed Mahmood, Muhammad Taher, Uttam Kumar Mandal; Department of Pharmaceutical Technology, Kulliyyah of Pharmacy, International Islamic University Malaysia (IIUM), Pahang Darul Makmur, Malaysia. Abstract: Raloxifene hydrochloride, a highly effective drug for the treatment of invasive breast cancer and osteoporosis in post-menopausal women, shows a poor oral bioavailability of 2%. The aim of this study was to develop, statistically optimize, and characterize raloxifene hydrochloride-loaded transfersomes for transdermal delivery, in order to overcome the poor bioavailability issue with the drug. A response surface methodology experimental design was applied for the optimization of the transfersomes, using a Box-Behnken experimental design. Phospholipon® 90G, sodium deoxycholate, and sonication time, each at three levels, were selected as independent variables, while entrapment efficiency, vesicle size, and transdermal flux were identified as dependent variables. The formulation was characterized by surface morphology and shape, particle size, and zeta potential. Ex vivo transdermal flux was determined using a Hanson diffusion cell assembly, with rat skin as the barrier medium. Transfersomes from the optimized formulation were found to have spherical, unilamellar structures, with a homogeneous distribution and low polydispersity index (0.08). They had a particle size of 134±9 nm, with an entrapment efficiency of 91.00%±4.90%, and a transdermal flux of 6.5±1.1 µg/cm2/hour. Raloxifene hydrochloride-loaded transfersomes proved significantly superior in terms of the amount of drug permeated and deposited in the skin, with enhancement ratios of 6.25±1.50 and 9.25±2.40, respectively, when compared with drug-loaded conventional liposomes and an ethanolic phosphate buffered saline. Differential scanning calorimetry revealed a greater change in skin structure, compared with a control sample, during the ex vivo drug diffusion study. Further, confocal laser…

  9. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms.

    Science.gov (United States)

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-08-01

    To fulfill existing guidelines, applicants who aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTOs). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTOs. This review examines existing practices in GM plant field testing, such as the approaches to randomization, replication, and pseudoreplication. Emphasis is placed on the importance of the design features used for the field trials in which effects on NTOs are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed in deciding on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLMs) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will…
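The ANOVA-versus-GLM distinction for count data can be made concrete with a short sketch. The example below is entirely synthetic and assumes the Python statsmodels package; the plot layout, block structure, and effect sizes are hypothetical, not drawn from any actual trial.

```python
# A minimal sketch (synthetic data): Gaussian OLS vs. Poisson GLM for
# NTO abundance counts. Variable names and trial layout are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_plots = 40
gm = np.repeat([0, 1], n_plots // 2)         # 0 = conventional, 1 = GM crop
block = np.tile([0, 1, 2, 3], n_plots // 4)  # a simple block structure
counts = rng.poisson(lam=np.exp(2.0 - 0.1 * gm + 0.05 * block))

X = sm.add_constant(np.column_stack([gm, block]))

ols_fit = sm.OLS(counts, X).fit()                                # ANOVA-style
glm_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()  # count model

print("OLS coefficients:", ols_fit.params.round(3))
print("Poisson GLM coefficients:", glm_fit.params.round(3))
```

For overdispersed counts, swapping in a negative binomial family would be the natural next step.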

  10. Superconducting coil design for a tokamak experimental power reactor

    International Nuclear Information System (INIS)

    Turner, L.R.; Wang, S.T.; Smelser, P.

    1977-01-01

    Superconducting toroidal-field (TF) and poloidal-field (PF) coils have been designed for the proposed Argonne National Laboratory experimental power reactor (EPR). Features of the design include: (1) peak field of 8 T at 4.2 K or 10 T at 3.0 K; (2) a constant-tension shape for the TF coils, corrected for the finite number (16) of coils; (3) analysis of errors in coil alignment; (4) comparison of the safety aspects of series-connected and parallel-connected coils; (5) a 60 kA sheet conductor of NbTi with copper stabilizer and stainless steel for support; (6) superconducting PF coils outside the TF coils; (7) the TF coils shielded from pulsed fields by high-purity aluminum.

  11. Tasks related to increase of RA reactor exploitation and experimental potential, 03. Crane for handling the vertical experimental channels of the RA reactor - design project

    International Nuclear Information System (INIS)

    Pavicevic, M.

    1963-07-01

    As part of the work related to improving the experimental potential of the RA reactor, this document describes the design project for the new crane for handling the vertical experimental channels of the RA reactor, including engineering drawings of the crane's main elements, the mechanical part, the design of the electrical part of the crane, and a cost estimate.

  12. Conceptual design study of fusion experimental reactor (FY86 FER)

    International Nuclear Information System (INIS)

    Saito, Ryusei; Kashihara, Shin-ichiro; Itoh, Shin-ichi

    1987-08-01

    This report describes the results of a conceptual design study on plant systems for the Fusion Experimental Reactor (FY86 FER). Design studies for the FER plant systems have continued from FY85, especially design modifications made in accordance with revisions of the plasma scaling parameters and system improvements. This report describes 1) system construction, 2) site and reactor building plan, 3) repair and maintenance system, 4) tritium circulation system, 5) heating, ventilation and air conditioning system, 6) tritium clean-up system, 7) cooling and baking system, 8) waste treatment and storage system, 9) control system, 10) electric power system, and 11) site factory plan, all of which are part of the FY86 design work. The plant systems described in this report are generally based on the FY86 FER (ACS Reactor), which is one of the six candidates for FER. (author)

  13. Ethanol Production from Kitchen Garbage Using Zymomonas mobilis: Optimization of Parameters through Statistical Experimental Designs

    OpenAIRE

    Ma, H.; Wang, Q.; Gong, L.; Wang, X.; Yin, W.

    2008-01-01

    A Plackett-Burman design was employed to screen 8 parameters for ethanol production from kitchen garbage by Zymomonas mobilis in simultaneous saccharification and fermentation. The parameters were divided into two groups: four kinds of enzymes and supplementary nutrients. The results indicated that the nutrients inside kitchen garbage could meet the requirements of ethanol production without supplementation; only protease and glucoamylase were needed to accelerate ethanol production. The opti…
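As a concrete illustration of the screening step, the sketch below constructs the standard 12-run Plackett-Burman design from its known generator row and assigns the first 8 columns to 8 factors; the factor assignment is illustrative, not the authors' actual layout, and only numpy is assumed.

```python
# A sketch (numpy only) of the standard 12-run Plackett-Burman design.
import numpy as np

# Standard +1/-1 generator row for the 12-run Plackett-Burman design
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])

rows = [np.roll(gen, i) for i in range(11)]  # 11 cyclic shifts of the generator
rows.append(-np.ones(11, dtype=int))         # closing row of all -1
design = np.array(rows)                      # 12 runs x 11 orthogonal columns

factors = design[:, :8]  # 8 screened factors (e.g., 4 enzymes + 4 nutrients)
print(factors)
```

Each column is orthogonal to every other, which is what lets main effects of many factors be screened in only 12 runs.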

  14. Inferential statistics, power estimates, and study design formalities continue to suppress biomedical innovation

    OpenAIRE

    Kern, Scott E.

    2014-01-01

    Innovation is the direct intended product of certain styles in research, but not of others. Fundamental conflicts between descriptive vs inferential statistics, deductive vs inductive hypothesis testing, and exploratory vs pre-planned confirmatory research designs have been played out over decades, with winners and losers and consequences. Longstanding warnings from both academics and research-funding interests have failed to influence effectively the course of these battles. The NIH publicly...

  15. General description of preliminary design of an experimental fusion reactor and the future problems

    International Nuclear Information System (INIS)

    Sako, Kiyoshi

    1976-01-01

    Recently, studies on plasma physics have progressed rapidly, and promising experimental data have emerged in succession. In particular, expectations are high that tokamaks will develop into power reactors. In Japan, the construction of large plasma devices such as JT-60 of JAERI is about to start, and after several years the studies on plasma physics will reach the end of their first stage; the main research and development will then be directed toward power reactors. Studies on the design of practical fusion reactors have been in progress at JAERI since 1973, and the preliminary design is being carried out. The purposes of the preliminary design are the clarification of the concept of the experimental reactor and of the requirements for studies on core plasma, the examination of the problems in developing the main components and systems of the reactor, and the development of design technology. The experimental reactor is a quasi-steady reactor with 100 MW fusion reaction output, and the conditions set for the design and the basis for setting them are explained. The outline of the design, namely core plasma, blankets, superconducting magnets and their shielding, vacuum wall, neutral particle injection heating device, core fuel supply and exhaust system, and others, is described. For scale-up, reactor structural materials that can withstand neutron damage must be developed. (Kako, I.)

  16. Experimental Methods for the Analysis of Optimization Algorithms

    DEFF Research Database (Denmark)

    …computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on different… of solution quality, runtime and other measures; and the third part collects advanced methods from experimental design for configuring and tuning algorithms on a specific class of instances with the goal of using the least amount of experimentation. The contributor list includes leading scientists…

  17. Mirrors design, analysis and manufacturing of the 550mm Korsch telescope experimental model

    Science.gov (United States)

    Huang, Po-Hsuan; Huang, Yi-Kai; Ling, Jer

    2017-08-01

    In 2015, NSPO (National Space Organization) began to develop the sub-meter-resolution optical remote sensing instrument of the next-generation optical remote sensing satellite, the follow-on to FORMOSAT-5. Upgraded from the Ritchey-Chrétien Cassegrain telescope optical system of FORMOSAT-5, the experimental optical system of the advanced optical remote sensing instrument was enhanced to an off-axis Korsch telescope optical system consisting of five mirrors. It contains: (1) M1: a 550 mm diameter aperture primary mirror, (2) M2: a secondary mirror, (3) M3: an off-axis tertiary mirror, and (4) FM1 and FM2: two folding flat mirrors, for the purposes of limiting the overall volume, reducing the mass, and providing a long focal length and excellent optical performance. By the end of 2015, we had implemented several important techniques, including optical system design, opto-mechanical design, FEM and multi-physics analysis, and an optimization system, in order to carry out a preliminary study and begin to develop and design these large-size lightweight aspheric mirrors and flat mirrors. The lightweight mirror design and opto-mechanical interface design were completed in August 2016. We then manufactured and polished these experimental model mirrors in Taiwan; all five mirrors were completed as spherical surfaces by the end of 2016. Aspheric figuring, assembling tests and optical alignment verification of these mirrors will be done with a Korsch telescope experimental structure model in 2018.

  18. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki

    2015-05-12

    Experimental design can be vital when experiments are resource-exhaustive and time-consuming. In this work, we carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data about the model parameters. One of the major difficulties in evaluating the expected information gain is that it naturally involves nested integration over a possibly high dimensional domain. We use the Multilevel Monte Carlo (MLMC) method to accelerate the computation of the nested high dimensional integral. The advantages are twofold. First, MLMC can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among different sample averages of the inner integrals. Second, the MLMC method imposes fewer assumptions, such as the asymptotic concentration of posterior measures, required for instance by the Laplace approximation (LA). We test the MLMC method using two numerical examples. The first example is the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Poisson equation. We place the sensors in the locations where the pressure is measured, and we model the conductivity field as a piecewise constant random vector with two parameters. The second is a chemical Enhanced Oil Recovery (EOR) core-flooding experiment assuming homogeneous permeability. We measure the cumulative oil recovery, from a horizontal core flooded by water, surfactant and polymer, for different injection rates. The model parameters consist of the endpoint relative permeabilities, the residual saturations and the relative permeability exponents for the three phases: water, oil and…
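The nested integral at the heart of the method can be made concrete with a minimal sketch. The code below implements the plain double-loop Monte Carlo estimator of expected information gain that MLMC is designed to accelerate; it assumes only numpy, and the one-parameter forward model, prior, and noise level are deliberately simple placeholders rather than the paper's Darcy or EOR examples.

```python
# A sketch (numpy only) of the nested, double-loop Monte Carlo estimator
# of expected information gain for a toy one-parameter model.
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1                    # assumed observation noise
g = lambda theta: theta ** 2   # hypothetical forward model

def log_like(y, theta):
    return -0.5 * ((y - g(theta)) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

N, M = 2000, 2000                              # outer / inner sample sizes
theta_out = rng.uniform(0.0, 1.0, size=N)      # prior draws (outer loop)
y = g(theta_out) + sigma * rng.normal(size=N)  # synthetic observations
theta_in = rng.uniform(0.0, 1.0, size=M)       # prior draws (inner loop)

# EIG ~ mean over outer samples of log p(y|theta) - log (inner evidence)
inner = np.array([np.exp(log_like(yi, theta_in)).mean() for yi in y])
eig = np.mean(log_like(y, theta_out) - np.log(inner))
print(f"estimated expected information gain: {eig:.3f} nats")
```

The cost of this estimator is N times M likelihood evaluations; MLMC reduces it by distributing samples optimally across levels of inner-loop accuracy.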

  19. Experimental observations of Lagrangian sand grain kinematics under bedload transport: statistical description of the step and rest regimes

    Science.gov (United States)

    Guala, M.; Liu, M.

    2017-12-01

    The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics that are inferred to be responsible for anomalous diffusion and have so far remained elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, and waiting time, and thus distinguish which quantities exhibit well-converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.

  20. Design studies of back up cores for the experimental multi-purpose VHTR, (1)

    International Nuclear Information System (INIS)

    Yasuno, Takehiko; Miyamoto, Yoshiaki; Mitake, Susumu

    1982-09-01

    For the Experimental Multi-Purpose Very High Temperature Reactor, design studies have been made of two backup cores loaded with new types of fuel element. The purpose is to improve the core operational characteristics of the standard design core (Mark-III core), which consists of pin-in-block type fuel elements having externally cooled hollow fuel rods. The first backup core (semi-pin fuel core) is composed of fuel elements with internally cooled fuel pins, and the second core (multihole fuel core) is composed of multihole fuel elements, which can be adopted for the experimental VHTR as a substitute for the standard Mark-III fuel element. Each core has 73 fuel columns and a height of 4 m. The arrangement of the active core and reactor internal structure is the same as in the standard design core. These backup cores meet almost all design requirements of the VHTR and increase the margins for some important design items in comparison with the standard core (Mark-III core). This report describes the overall nuclear, thermal-hydraulic, fuel and safety characteristics of these cores, together with structural considerations. (author)

  1. Pre-design stage of the intermediate heat exchanger for experimental fast reactor

    International Nuclear Information System (INIS)

    Luz, M.; Borges, E.M.; Braz Filho, F.A.; Hirdes, V.R.

    1986-09-01

    This report presents the outline of a thermal-hydraulic calculation procedure for the pre-design stage of the Intermediate Heat Exchanger for a 5 MW Experimental Fast Reactor (EFR), which can be used in other similar projects at the same stage of evolution. Heat transfer and heat loss computations for the preliminary design of the heat exchanger are presented. (author) [pt

  2. Exploring multi-metal biosorption by indigenous metal-hyperresistant Enterobacter sp. J1 using experimental design methodologies

    International Nuclear Information System (INIS)

    Lu, W.-B.; Kao, W.-C.; Shi, J.-J.; Chang, J.-S.

    2008-01-01

    A novel experimental design, combining mixture design and response surface methodology (RSM), was developed to investigate the competitive adsorption behavior of lead, copper and cadmium by an indigenous isolate Enterobacter sp. J1 able to tolerate high concentrations of a variety of heavy metals. Using the proposed combinative experimental design, two different experiment designs in a ternary metal biosorption system can be integrated to a succinct experiment and the number of experimental trials was markedly reduced from 38 to 26 by reusing the mutual experimental data. Triangular contour diagrams and triangular three-dimensional surface plots were generated to describe the ternary metal biosorption equilibrium data in mixture design systems. The results show that the preference of metal sorption of Enterobacter sp. J1 decreased in the order of Pb²⁺ > Cu²⁺ > Cd²⁺. The presence of other metals resulted in a competitive effect. The influence of the other two metals in ternary metal biosorption system can be easily determined by comparing the stray distance from the single metal biosorption. The behavior of competitive biosorption was successfully described and predicted using a combined Langmuir-Freundlich model along with new three-dimensional contour-surface plots.

  3. Exploring multi-metal biosorption by indigenous metal-hyperresistant Enterobacter sp. J1 using experimental design methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lu, W.-B. [Department of Cosmetic Science, Chung Hwa University of Medical Technology, Tainan, Taiwan (China); Kao, W.-C.; Shi, J.-J. [Department of Chemical Engineering, National Cheng Kung University, Tainan, Taiwan (China); Chang, J.-S. [Department of Chemical Engineering, National Cheng Kung University, Tainan, Taiwan (China)], E-mail: changjs@mail.ncku.edu.tw

    2008-05-01

    A novel experimental design, combining mixture design and response surface methodology (RSM), was developed to investigate the competitive adsorption behavior of lead, copper and cadmium by an indigenous isolate Enterobacter sp. J1 able to tolerate high concentrations of a variety of heavy metals. Using the proposed combinative experimental design, two different experiment designs in a ternary metal biosorption system can be integrated to a succinct experiment and the number of experimental trials was markedly reduced from 38 to 26 by reusing the mutual experimental data. Triangular contour diagrams and triangular three-dimensional surface plots were generated to describe the ternary metal biosorption equilibrium data in mixture design systems. The results show that the preference of metal sorption of Enterobacter sp. J1 decreased in the order of Pb²⁺ > Cu²⁺ > Cd²⁺. The presence of other metals resulted in a competitive effect. The influence of the other two metals in ternary metal biosorption system can be easily determined by comparing the stray distance from the single metal biosorption. The behavior of competitive biosorption was successfully described and predicted using a combined Langmuir-Freundlich model along with new three-dimensional contour-surface plots.
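A minimal sketch of the model-fitting step in these two records is given below. It fits the single-component Langmuir-Freundlich (Sips) isotherm, the building block of the combined competitive model the authors use; scipy is assumed, and all concentrations, uptakes, and starting values are illustrative placeholders.

```python
# A sketch (scipy assumed, synthetic numbers) of fitting a
# single-component Langmuir-Freundlich (Sips) isotherm.
import numpy as np
from scipy.optimize import curve_fit

def sips(C, qm, K, n):
    # q = qm * (K*C)^n / (1 + (K*C)^n)
    return qm * (K * C) ** n / (1.0 + (K * C) ** n)

C = np.array([5.0, 10, 25, 50, 100, 200, 400])  # mg/L, hypothetical
q = np.array([12.0, 21, 38, 52, 63, 70, 74])    # mg/g, hypothetical

(qm, K, n), _ = curve_fit(sips, C, q, p0=[80.0, 0.01, 1.0])
print(f"qm = {qm:.1f} mg/g, K = {K:.4f} L/mg, n = {n:.2f}")
```

The competitive version replaces the denominator with a sum of (K_j C_j)^n_j terms over all metals, which is what lets one metal suppress the uptake of another.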

  4. Experimental design optimisation: theory and application to estimation of receptor model parameters using dynamic positron emission tomography

    International Nuclear Information System (INIS)

    Delforge, J.; Syrota, A.; Mazoyer, B.M.

    1989-01-01

    General framework and various criteria for experimental design optimisation are presented. The methodology is applied to the estimation of receptor-ligand reaction model parameters with dynamic positron emission tomography data. The possibility of improving parameter estimation using a new experimental design combining an injection of the β⁺-labelled ligand and an injection of the cold ligand is investigated. Numerical simulations predict remarkable improvement in the accuracy of parameter estimates with this new experimental design, and in particular the possibility of separate estimation of the association constant (k+1) and of the receptor density (B'max) in a single experiment. Simulation predictions are validated using experimental PET data, in which parameter uncertainties are reduced by factors ranging from 17 to 1000. (author)

  5. Tritium system design studies of fusion experimental breeder

    International Nuclear Information System (INIS)

    Deng Baiquan; Huang Jinhua

    2003-01-01

    A summary of the tritium system design studies for the engineering outline design of a fusion experimental breeder (FEB-E) is presented. This paper is divided into three sections. In the first section, the geometry, loading features and tritium concentrations in the liquid lithium of the tritium breeding zones of the blanket are described. The tritium flow chart corresponding to the tritium fuel cycle system is constructed, and the inventories in ten subsystems are calculated using the SWITRIM code in the second section. Results show that the initial tritium storage necessary to start up FEB-E with a fusion power of 143 MW is about 319 g. In the final section, the tritium leakage issues under different operating circumstances are analyzed. It was found that the potential danger of tritium leakage could arise from the exhaust gas of the divertor system. It is important to raise the tritium burnup fraction and reduce the tritium throughput. (authors)

  6. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is introduced at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical needs, which are likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  7. A survey of Wien bridge-based chaotic oscillators: Design and experimental issues

    International Nuclear Information System (INIS)

    Kilic, Recai; Yildirim, Fatma

    2008-01-01

    This paper presents a comparative study of the design and implementation of Wien-type chaotic oscillators. By making a collection of almost all Wien bridge-based chaotic circuits, we have investigated these oscillators in terms of chaotic dynamics, circuit structures, active building blocks, nonlinear element structures and operating frequency, using PSpice simulations and laboratory experiments. In addition to this comparative investigation, we present two basic experimental contributions to the referenced implementations. The first of our experimental contributions is the experimental implementation of a CFOA-based Chua's circuit modified for very high chaotic oscillations; the aim of the second is to implement experimentally a Wien-type high-frequency chaos generator, which has a diode-inductor composite, in inductorless form by using a CFOA-based synthetic inductor.

  8. Developing Statistical Knowledge for Teaching during Design-Based Research

    Science.gov (United States)

    Groth, Randall E.

    2017-01-01

    Statistical knowledge for teaching is not precisely equivalent to statistics subject matter knowledge. Teachers must know how to make statistics understandable to others as well as understand the subject matter themselves. This dual demand on teachers calls for the development of viable teacher education models. This paper offers one such model,…

  9. Design and experimental tests of a novel neutron spin analyzer for wide angle spin echo spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Fouquet, Peter; Farago, Bela; Andersen, Ken H.; Bentley, Phillip M.; Pastrello, Gilles; Sutton, Iain; Thaveron, Eric; Thomas, Frederic [Institut Laue-Langevin, BP 156, F-38042 Grenoble Cedex 9 (France); Moskvin, Evgeny [Helmholtzzentrum Berlin, Glienicker Strasse 100, D-14109 Berlin (Germany); Pappas, Catherine [Helmholtzzentrum Berlin, Glienicker Strasse 100, D-14109 Berlin (Germany); Faculty of Applied Sciences, Delft University of Technology, Mekelweg 15, 2629 JB Delft (Netherlands)

    2009-09-15

    This paper describes the design and experimental tests of a novel neutron spin analyzer optimized for wide-angle spin echo spectrometers. The new design is based on nonremanent magnetic supermirrors, which are magnetized by vertical magnetic fields created by NdFeB high-field permanent magnets. The solution presented here gives stable performance at moderate cost, in contrast to designs invoking remanent supermirrors. In the experimental part of this paper we demonstrate that the new design performs well in terms of polarization and transmission, and that high-quality neutron spin echo spectra can be measured.

  10. Experimental determination of new statistical correlations for the calculation of the heat transfer coefficient by convection for flat plates, cylinders and tube banks

    Directory of Open Access Journals (Sweden)

    Ismael Fernando Meza Castro

    2017-07-01

    Introduction: This project carried out experimental research involving the design, assembly, and commissioning of a convection heat transfer test bench. Objective: To determine new statistical correlations that allow the convective heat transfer coefficients for air to be known with greater accuracy in applications with different heating geometry configurations. Methodology: Three geometric configurations (flat plate, cylinder, and tube bank) were studied in terms of their physical properties through the Reynolds and Prandtl numbers, using an Arduino®-based data transmission interface that measured the air temperature along the duct in real time, relating the heat transferred from the heating element to the fluid, with the mathematical modeling performed in specialized statistical software. The study covered the three geometries mentioned, one power level per heating element, and two air velocities, with 10 repetitions each. Results: Three mathematical correlations were obtained with regression coefficients greater than 0.972, one for each heating element, with prediction errors in the convective heat transfer coefficients of 7.50% for the flat plate, 2.85% for the cylinder, and 1.57% for the tube bank. Conclusions: It was observed that for geometries made up of several individual elements, a much more accurate statistical fit was obtained for predicting the behavior of the convective heat transfer coefficients, since each unit reaches stability in its surface temperature profile more quickly, giving the geometry as a whole a more precise measurement of the parameters that govern the heat transfer, as in the case of the tube bank geometry.
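Correlations of this kind are commonly regressed in the assumed form Nu = C·Re^m·Pr^(1/3), which becomes linear in log space. The sketch below shows that fit with numpy; the Reynolds and Nusselt numbers are synthetic stand-ins, not the paper's bench data or its final correlations.

```python
# A sketch (numpy only, synthetic data) of fitting an assumed correlation
# Nu = C * Re^m * Pr^(1/3) by log-linear least squares.
import numpy as np

Re = np.array([4000.0, 8000, 15000, 30000, 60000])
Pr = np.full_like(Re, 0.71)                     # air, roughly constant
Nu = np.array([18.0, 30.5, 48.0, 80.0, 133.0])  # hypothetical measurements

# log Nu - (1/3) log Pr = log C + m log Re
ylog = np.log(Nu) - np.log(Pr) / 3.0
A = np.column_stack([np.ones_like(Re), np.log(Re)])
(logC, m), *_ = np.linalg.lstsq(A, ylog, rcond=None)
print(f"Nu = {np.exp(logC):.3f} * Re^{m:.3f} * Pr^(1/3)")
```

The heat transfer coefficient then follows from h = Nu·k/L for the characteristic length L and fluid conductivity k of each geometry.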

  11. Remote maintenance design for Fusion Experimental Reactor (FER)

    International Nuclear Information System (INIS)

    Tachikawa, K.; Iida, H.; Nishio, S.; Tone, T.; Aota, T.; Iwamoto, T.; Niikura, S.; Nishizawa, H.

    1984-01-01

    Design of the Fusion Experimental Reactor, FER, has been conducted by the Japan Atomic Energy Research Institute (JAERI) since 1981. From the viewpoint of remote maintenance, two typical reactor types can be distinguished among the four design concepts of FER. In the type 1 FER, the torus module consists of the shield structure and blanket, with the connective joints between torus modules provided at the outer region of the reactor. In the type 2 FER, the shield structure is joined with the vacuum cryostat and only the blanket module is allowed to move, but the connections between torus modules are located in the inner region of the reactor. Comparing type 1 with type 2, this paper describes the remote maintenance of FER, including reactor configurations, work procedures, remote systems and equipment, the repair facility, and future R and D problems. Reviewing design studies and investigations of existing robotics technologies, R and D for FER remote maintenance technology should be performed under a reasonable long-term program. The main items of remote technology requiring urgent attention are a multi-purpose manipulator system with dexterous performance, a tele-viewing system which reduces operator fatigue, and remote tests of commercially available components.

  12. Issues and recent advances in optimal experimental design for site investigation (Invited)

    Science.gov (United States)

    Nowak, W.

    2013-12-01

    This presentation provides an overview of issues and recent advances in model-based experimental design for site exploration. The issues and advances addressed are (1) how to provide an adequate envelope for prior uncertainty, (2) how to define the information needs in a task-oriented manner, (3) how to measure the expected impact of a data set that is not yet available but only planned to be collected, and (4) how best to perform the optimization of the data collection plan. Among other shortcomings of the state of the art, it is identified that there is a lack of demonstrator studies in which exploration schemes based on expert judgment are compared to exploration schemes obtained by optimal experimental design. Such studies will be necessary to address the often-voiced concern that experimental design is an academic exercise with little improvement potential over the well-trained gut feeling of field experts. When addressing this concern, a specific focus has to be given to uncertainty in model structure, parameterizations and parameter values, and to the related surprises that data often bring about in field studies, but never in synthetic-data-based studies. The background of this concern is that, initially, conceptual uncertainty may be so large that surprises are the rule rather than the exception. In such situations, field experts have a large body of experience in handling the surprises, and expert judgment may be good enough compared to meticulous optimization based on a model that is about to be falsified by the incoming data. In order to meet surprises and adapt to them accordingly, there needs to be a sufficient representation of conceptual uncertainty within the models used. Also, it is useless to optimize an entire design under this initial range of uncertainty. Thus, the goal setting of the optimization should include the objective of reducing conceptual uncertainty. A possible way out is to upgrade experimental design theory towards real-time interaction…

  13. Empirical evidence of bias in the design of experimental stroke studies - A metaepidemiologic approach

    NARCIS (Netherlands)

    Crossley, Nicolas A.; Sena, Emily; Goehler, Jos; Horn, Jannekke; van der Worp, Bart; Bath, Philip M. W.; Macleod, Malcolm; Dirnagl, Ulrich

    2008-01-01

    Background and Purpose - At least part of the failure in the transition from experimental to clinical studies in stroke has been attributed to the imprecision introduced by problems in the design of experimental stroke studies. Using a metaepidemiologic approach, we addressed the effect of

  14. Optimal experimental design in an epidermal growth factor receptor signalling and down-regulation model.

    Science.gov (United States)

    Casey, F P; Baird, D; Feng, Q; Gutenkunst, R N; Waterfall, J J; Myers, C R; Brown, K S; Cerione, R A; Sethna, J P

    2007-05-01

    We apply the methods of optimal experimental design to a differential equation model for epidermal growth factor receptor signalling, trafficking and down-regulation. The model incorporates the role of a recently discovered protein complex made up of the E3 ubiquitin ligase, Cbl, the guanine exchange factor (GEF), Cool-1 (β-Pix) and the Rho family G protein Cdc42. The complex has been suggested to be important in disrupting receptor down-regulation. We demonstrate that the model interactions can accurately reproduce the experimental observations, that they can be used to make predictions with accompanying uncertainties, and that we can apply ideas of optimal experimental design to suggest new experiments that reduce the uncertainty on unmeasurable components of the system.

  15. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  16. Experimental system design for the integration of trapped-ion and superconducting qubit systems

    Science.gov (United States)

    De Motte, D.; Grounds, A. R.; Rehák, M.; Rodriguez Blanco, A.; Lekitsch, B.; Giri, G. S.; Neilinger, P.; Oelsner, G.; Il'ichev, E.; Grajcar, M.; Hensinger, W. K.

    2016-12-01

    We present a design for the experimental integration of ion trapping and superconducting qubit systems as a step towards the realization of a quantum hybrid system. The scheme addresses two key difficulties in realizing such a system: a combined microfabricated ion trap and superconducting qubit architecture, and the experimental infrastructure needed to facilitate both technologies. Building upon work by Kielpinski et al. (Phys Rev Lett 108(13):130504, 2012. doi: 10.1103/PhysRevLett.108.130504), we describe the design, simulation and fabrication process for a microfabricated ion trap capable of coupling an ion to a superconducting microwave LC circuit with a coupling strength in the tens of kHz. We also describe existing difficulties in combining the experimental infrastructure of an ion-trapping set-up with a dilution refrigerator housing superconducting qubits, and present solutions that can be immediately implemented using current technology.

  17. Cooperative Experimental System Development - cooperative techniques beyound initial design and analysis

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1995-01-01

    This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification-oriented methods usually assume that system design can be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis/design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work-situations and use-scenarios; transforming results from early cooperative analysis…

  18. Combined application of mixture experimental design and artificial neural networks in the solid dispersion development.

    Science.gov (United States)

    Medarević, Djordje P; Kleinebudde, Peter; Djuriš, Jelena; Djurić, Zorica; Ibrić, Svetlana

    2016-01-01

    This study demonstrates for the first time the combined application of mixture experimental design and artificial neural networks (ANNs) in the development of solid dispersions (SDs). Ternary carbamazepine-Soluplus®-poloxamer 188 SDs were prepared by the solvent casting method to improve the carbamazepine dissolution rate. The influence of the composition of the prepared SDs on the carbamazepine dissolution rate was evaluated using a D-optimal mixture experimental design and multilayer perceptron ANNs. Physicochemical characterization proved the presence of the most stable carbamazepine polymorph III within the SD matrix. Ternary carbamazepine-Soluplus®-poloxamer 188 SDs significantly improved the carbamazepine dissolution rate compared to the pure drug. Models developed by ANNs and mixture experimental design described well the relationship between the proportions of SD components and the percentage of carbamazepine released after 10 (Q10) and 20 (Q20) min, wherein the ANN model exhibited better predictability on the test data set. The proportions of carbamazepine and poloxamer 188 exhibited the highest influence on the carbamazepine release rate. The highest carbamazepine release rate was observed for SDs with the lowest proportions of carbamazepine and the highest proportions of poloxamer 188. ANNs and mixture experimental design can be used as powerful data modeling tools in the systematic development of SDs. Taking into account the advantages and disadvantages of both techniques, their combined application should be encouraged.
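A hedged sketch of the ANN side of such a workflow is shown below: a small scikit-learn multilayer perceptron mapping the three mixture proportions to the two release responses. The training data are random placeholders, not the study's dissolution measurements, and the network size is arbitrary.

```python
# A sketch (scikit-learn assumed) of an MLP mapping mixture proportions
# (drug, Soluplus, poloxamer 188) to release responses (Q10, Q20).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
X = rng.dirichlet([1.0, 1.0, 1.0], size=60)  # proportions summing to 1

# Placeholder responses: release falls with drug load, rises with poloxamer
q10 = 20 + 50 * X[:, 2] - 30 * X[:, 0] + rng.normal(0, 2, 60)
q20 = q10 + 15 + rng.normal(0, 2, 60)
Y = np.column_stack([q10, q20])

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000, random_state=0)
model.fit(X, Y)
print(model.predict([[0.2, 0.3, 0.5]]))  # low drug load, high poloxamer
```

The mixture-design model plays the complementary role of an interpretable polynomial over the same simplex, which is why the two approaches are worth comparing on a held-out test set.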

  19. Contribution of experimental fluid mechanics to the design of vertical slot fish passes

    Directory of Open Access Journals (Sweden)

    Wang R.W.

    2010-02-01

    This paper presents the main results of an experimental study of the mean and turbulent characteristics of flow in a scale model of a vertical slot fish pass with varying width and slope (from 5% to 15%). Experimental hydraulic modelling was combined with the study of fish behaviour in the model. The discharge coefficient, which significantly affects the design of such facilities, varied from 0.67 to 0.89 and was strongly influenced by the slope. Two distinct flow patterns were observed, depending on the slope and the fish pass width. The point of transition between the two states was determined. Low velocity areas are likely resting zones for fish and particular attention was paid to evaluating these areas. Slope was found to affect both the volume of the low velocity zone and the value of turbulent kinetic energy in these areas. The statistical characteristics of turbulent kinetic energy in the pools were linked primarily to the maximum velocity in the jet. An analysis of the behaviour of juvenile brown trout (Salmo trutta) in the scale model clearly showed that the fish avoided the areas of high velocities in the jet, except at the slot itself where they took advantage of the jet’s non-stationary character. Low-velocity areas were not frequented uniformly by fish, which stayed most frequently in the zone located just downstream from the slot and behind the small side baffle. It is suggested that future studies might investigate lower pool-length to slot-width ratios, which might make it possible to increase the slope significantly, and should also examine ways of improving hydraulic conditions for fish by carefully distributing obstacles in pools.

  20. The Effects of Reaction Variables on Solution Polymerization of Vinyl Acetate and Molecular Weight of Poly(vinyl alcohol Using Taguchi Experimental Design

    Directory of Open Access Journals (Sweden)

    M.H. Navarchian

    2009-12-01

    Poly(vinyl acetate) is synthesized via solution polymerization, and then converted to poly(vinyl alcohol) by alkaline alcoholysis. The aim of this work was to investigate statistically the influence of the reaction variables in vinyl acetate polymerization on the conversion of the monomer to polymer, the degree of branching at the acetyl group in poly(vinyl acetate), and the molecular weight of poly(vinyl alcohol), using the Taguchi experimental design approach. The reaction variables were polymerization time, molar ratio of initiator to monomer, and volume ratio of monomer to solvent. The statistical analysis of variance of the results revealed that all factors significantly influenced the conversion and degree of branching. The volume ratio of monomer to solvent is the only factor affecting the molecular weight of poly(vinyl alcohol), and it has the greatest influence on all responses. By increasing this ratio, the conversion, the degree of branching at the acetyl group in poly(vinyl acetate), and the molecular weight of poly(vinyl alcohol) were all increased.
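For illustration, the sketch below builds the standard L9 orthogonal array, assigns its first three columns to the three reaction variables, and computes the main-effect means that a Taguchi analysis starts from; only numpy is assumed, and the conversion values are fabricated placeholders, not the study's measurements.

```python
# A sketch (numpy only) of a Taguchi-style main-effect analysis on the
# standard L9 orthogonal array; response values are fake placeholders.
import numpy as np

L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
               [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
               [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])
factors = L9[:, :3]  # time, initiator/monomer ratio, monomer/solvent ratio
y = np.array([52.0, 58, 66, 55, 63, 70, 57, 65, 74])  # % conversion (fake)

for j, name in enumerate(["time", "I/M ratio", "M/S ratio"]):
    means = [y[factors[:, j] == level].mean() for level in (1, 2, 3)]
    print(name, "level means:", np.round(means, 1))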

  1. New synthetic thrombin inhibitors: molecular design and experimental verification.

    Directory of Open Access Journals (Sweden)

    Elena I Sinauridze

    Full Text Available BACKGROUND: The development of new anticoagulants is an important goal for the improvement of thromboses treatments. OBJECTIVES: The design, synthesis and experimental testing of new safe and effective small molecule direct thrombin inhibitors for intravenous administration. METHODS: Computer-aided molecular design of new thrombin inhibitors was performed using our original docking program SOL, which is based on the genetic algorithm of global energy minimization in the framework of a Merck Molecular Force Field. This program takes into account the effects of solvent. The designed molecules with the best scoring functions (calculated binding energies were synthesized and their thrombin inhibitory activity evaluated experimentally in vitro using a chromogenic substrate in a buffer system and using a thrombin generation test in isolated plasma and in vivo using the newly developed model of hemodilution-induced hypercoagulation in rats. The acute toxicities of the most promising new thrombin inhibitors were evaluated in mice, and their stabilities in aqueous solutions were measured. RESULTS: New compounds that are both effective direct thrombin inhibitors (the best K(I was 1111.1 mg/kg. A plasma-substituting solution supplemented with one of the new inhibitors prevented hypercoagulation in the rat model of hemodilution-induced hypercoagulation. Activities of the best new inhibitors in physiological saline (1 µM solutions were stable after sterilization by autoclaving, and the inhibitors remained stable at long-term storage over more than 1.5 years at room temperature and at 4°C. CONCLUSIONS: The high efficacy, stability and low acute toxicity reveal that the inhibitors that were developed may be promising for potential medical applications.

  2. Design and experimental study of a solar system for heating water ...

    African Journals Online (AJOL)

    M. Ghodbane, B. Boumeddane, N. Said

    2016-09-01

    This work presents a design and an experimental study of a linear Fresnel reflector solar concentrator with a trapezoidal cavity… the use of this concentrator in solar fields allocated to domestic and industrial water heaters.

  3. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-08

    In this study, we optimize the experimental setup computationally by optimal experimental design (OED) in a Bayesian framework. We approximate the posterior probability density functions (pdf) using truncated Gaussian distributions in order to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method; more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate, and the covariance is chosen as the negative inverse of the Hessian of the misfit function at the MAP estimate. The model-related entities are obtained from a polynomial surrogate. The optimality, quantified by the information gain measures, can be estimated efficiently by a rejection sampling algorithm against the underlying Gaussian probability distribution, rather than against the true posterior. This approach offers a significant error reduction when the magnitudes of the invariants of the posterior covariance are comparable to the size of the bounded domain of the prior. We demonstrate the accuracy and superior computational efficiency of our method for shock-tube experiments aiming to measure the model parameters of a key reaction which is part of the complex kinetic network describing hydrocarbon oxidation. In the experiments, the initial temperature and fuel concentration are optimized with respect to the expected information gain in the estimation of the parameters of the target reaction rate. We show that the expected information gain surface can change its shape dramatically according to the level of noise introduced into the synthetic data. The information that can be extracted from the data saturates as a logarithmic function of the number of experiments, and few experiments are needed when they are conducted at the optimal experimental design conditions.

  4. Design and implementation of a modular program system for the carrying-through of statistical analyses

    International Nuclear Information System (INIS)

    Beck, W.

    1984-01-01

    The complexity of computer programs for the solution of scientific and technical problems gives rise to a number of questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of the output data to the input data, and the substitution of complex models by simpler ones which provide equivalent results in certain ranges. Those questions have a general practical relevance, and principled answers may be found by statistical methods based on the Monte Carlo method. In this report the statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR takes into account users with different levels of knowledge of data processing and statistics, a variety of statistical methods and generating and evaluating procedures, the processing of large data sets in complex structures, coupling to other components of RSYST and to programs outside RSYST, and easy modification and extension of the system. Four examples are given which demonstrate the application of STAR. (orig.) [de
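The Monte Carlo core of such a system can be sketched in a few lines: sample the uncertain inputs, push them through the model, and summarize the output uncertainty and input sensitivities. The toy model and input distributions below are arbitrary placeholders, not STAR's actual interfaces; only numpy is assumed.

```python
# A sketch (numpy only) of Monte Carlo uncertainty propagation and a
# crude sensitivity ranking, in the spirit of a STAR-like analysis.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
x1 = rng.normal(1.0, 0.10, n)    # uncertain input 1
x2 = rng.uniform(0.5, 1.5, n)    # uncertain input 2
x3 = rng.normal(2.0, 0.50, n)    # uncertain input 3

f = lambda a, b, c: a * np.sqrt(b) + 0.1 * c ** 2  # stand-in for the code
y = f(x1, x2, x3)

print(f"output mean {y.mean():.3f}, std {y.std():.3f}")
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    r = np.corrcoef(x, y)[0, 1]  # linear sensitivity measure
    print(f"corr({name}, output) = {r:+.2f}")
```

Rank or partial correlation coefficients would be the usual refinement when the input-output relationship is monotonic but nonlinear.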

  5. Experimental and mathematical modeling methods for the investigation of toxicological interactions

    International Nuclear Information System (INIS)

    El-Masri, Hisham A.

    2007-01-01

    While procedures have been developed and used for many years to assess risk and determine acceptable exposure levels for individual chemicals, most cases of environmental contamination can result in concurrent or sequential exposure to more than one chemical. Toxicological predictions for such combinations must be based on an understanding of the mechanisms of action and interaction of the components of the mixtures. Statistical and experimental methods test for the existence of toxicological interactions in a mixture. However, these methods are limited to the experimental data ranges for which they were derived, in addition to limitations caused by response differences between experimental animals and humans. Empirical methods such as isobolograms, the median-effect principle and response surface methodology (RSM) are based on statistical experimental design and regression of data. For that reason, the predicted response surfaces can be used for extrapolation across dose regions where interaction mechanisms are not anticipated to change. In general, using these methods for predictions can be problematic without including biologically based mechanistic descriptions that can account for dose and species differences. Mechanistically based models, such as physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) models, include explicit descriptions of interaction mechanisms which are related to target tissue levels. These models include dose-dependent mechanistic hypotheses of toxicological interactions which can be tested by model-directed experimental design and used to identify dose regions where interactions are not significant.
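One way RSM flags an interaction is through the cross-term of a quadratic surface fitted to mixture data: a coefficient on d1·d2 that differs from zero indicates a departure from dose additivity. The sketch below assumes the Python statsmodels package, and the dose grid and response values are fabricated for illustration.

```python
# A sketch (statsmodels assumed, fabricated data) of a quadratic response
# surface whose d1*d2 cross-term flags a toxicological interaction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
d1, d2 = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
d1, d2 = d1.ravel(), d2.ravel()
resp = 10 + 8 * d1 + 5 * d2 + 6 * d1 * d2 + rng.normal(0, 0.5, d1.size)

X = sm.add_constant(np.column_stack([d1, d2, d1 * d2, d1 ** 2, d2 ** 2]))
fit = sm.OLS(resp, X).fit()
print("interaction coefficient:", fit.params[3].round(2))
print("interaction p-value:", fit.pvalues[3])  # small -> non-additive mixture
```

As the abstract cautions, a significant cross-term within the tested dose grid says nothing about interactions outside that grid; that is where mechanistic PBPK/PD descriptions take over.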

  6. A survey of Wien bridge-based chaotic oscillators: Design and experimental issues

    Energy Technology Data Exchange (ETDEWEB)

    Kilic, Recai [Erciyes University, Department of Electrical and Electronic Engineering, 38039 Kayseri (Turkey)], E-mail: kilic@erciyes.edu.tr; Yildirim, Fatma [Erciyes University, Civil Aviation School, 38039 Kayseri (Turkey)

    2008-12-15

    This paper presents a comparative study of the design and implementation of Wien-type chaotic oscillators. By making a collection of almost all Wien bridge-based chaotic circuits, we have investigated these oscillators in terms of chaotic dynamics, circuit structures, active building blocks, nonlinear element structures and operating frequency, using PSpice simulations and laboratory experiments. In addition to this comparative investigation, we present two basic experimental contributions to the referenced implementations. The first of our experimental contributions is the experimental implementation of a CFOA-based Chua's circuit modified for very high chaotic oscillations; the aim of the second is to implement experimentally a Wien-type high-frequency chaos generator, which has a diode-inductor composite, in inductorless form by using a CFOA-based synthetic inductor.

  7. Conceptual design study of fusion experimental reactor (FY86 FER)

    International Nuclear Information System (INIS)

    Kobayashi, Takeshi; Yamada, Masao; Mizoguchi, Tadanori

    1987-09-01

    This report describes the results of the reactor configuration/structure design for the Fusion Experimental Reactor (FER) performed in FY 1986. The design was intended to meet the physical and engineering mission of the next-step device decided by the subcommittee on the next-step device of the nuclear fusion council. Because the recommendation of the subcommittee was basically the same as the previous year's design philosophy, the objectives of the FY 1986 design study were to advance and optimize the previous design concept. Six candidate reactor configurations, corresponding to options C ∼ D presented by the subcommittee, were extensively examined. Consequently, the ACS reactor (Advanced Option-C with Single Null Divertor) was selected as the reference configuration from the viewpoints of technical risk and cost performance. Regarding the reactor structure, the following items were investigated intensively: minimization of the reactor size, protection of the first wall against plasma disruption, simplification of the shield structure, and a reactor configuration which enables optimum arrangement of the poloidal field coils. (author)

  8. In silico modelling of directed evolution: Implications for experimental design and stepwise evolution

    OpenAIRE

    Wedge , David C.; Rowe , William; Kell , Douglas B.; Knowles , Joshua

    2009-01-01


  9. Design study of toroidal magnets for tokamak experimental power reactors

    International Nuclear Information System (INIS)

    Stekly, Z.J.J.; Lucas, E.J.

    1976-12-01

    This report contains the results of a six-month study of superconducting toroidal field coils for a Tokamak Experimental Power Reactor to be built in the late 1980s. The designs are for 8 T and 12 T maximum magnetic field at the superconducting winding. At each field level two main concepts were generated: one in which each of the 16 coils comprising the system has an individual vacuum vessel, and one in which all the coils are contained in a single vacuum vessel. The coils have a D shape, with openings of 11.25 m x 7.5 m for the 8 T coils and 10.2 m x 6.8 m for the 12 T coils. All the designs use a rectangular cabled conductor made from copper-stabilized niobium-titanium composite, which operates at 4.2 K for the 8 T design and at 2.5 K for the 12 T design. Manufacturing procedures, processes and schedule estimates are also discussed.

  10. Sparse linear models: Variational approximate inference and Bayesian experimental design

    International Nuclear Information System (INIS)

    Seeger, Matthias W

    2009-01-01

    A wide range of problems such as signal reconstruction, denoising, source separation, feature selection, and graphical model search are addressed today by posterior maximization for linear models with sparsity-favouring prior distributions. The Bayesian posterior contains useful information far beyond its mode, which can be used to drive methods for sampling optimization (active learning), feature relevance ranking, or hyperparameter estimation, if only this representation of uncertainty can be approximated in a tractable manner. In this paper, we review recent results for variational sparse inference and show that they share underlying computational primitives. We discuss how sampling optimization can be implemented as sequential Bayesian experimental design. While there has been tremendous recent activity to develop sparse estimation, little attention has been given to sparse approximate inference. In this paper, we argue that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms. Moreover, since some variational inference methods have recently been given strong convex optimization characterizations, theoretical analysis may become possible, promising new insights into nonlinear experimental design.
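
    The following is a minimal, hypothetical sketch of sequential Bayesian experimental design for a linear-Gaussian model, in the spirit of the record above; it uses a plain Gaussian prior rather than the sparsity-favouring priors discussed there, and all dimensions, candidates and data are invented.

```python
# Sequential Bayesian experimental design for y = x.w + noise: at each
# step, pick the candidate measurement with the largest information gain
# about w, then update the Gaussian posterior in closed form.
import numpy as np

rng = np.random.default_rng(0)
d, sigma2 = 5, 0.1
w_true = rng.normal(size=d)            # hidden weights we want to learn
Sigma = np.eye(d)                      # prior covariance of w
mu = np.zeros(d)                       # prior mean
candidates = rng.normal(size=(200, d)) # pool of possible measurements

for step in range(10):
    # Information gain of measuring along x: 0.5*log(1 + x^T Sigma x / sigma2)
    quad = np.einsum('ij,jk,ik->i', candidates, Sigma, candidates)
    x = candidates[np.argmax(0.5 * np.log1p(quad / sigma2))]
    y = x @ w_true + rng.normal(scale=np.sqrt(sigma2))   # run the "experiment"
    # Rank-one Gaussian posterior update
    Sx = Sigma @ x
    denom = sigma2 + x @ Sx
    mu = mu + Sx * (y - x @ mu) / denom
    Sigma = Sigma - np.outer(Sx, Sx) / denom
    print(f"step {step}: ||mu - w_true|| = {np.linalg.norm(mu - w_true):.3f}")
```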

  11. Sparse linear models: Variational approximate inference and Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, Matthias W [Saarland University and Max Planck Institute for Informatics, Campus E1.4, 66123 Saarbruecken (Germany)

    2009-12-01

    A wide range of problems such as signal reconstruction, denoising, source separation, feature selection, and graphical model search are addressed today by posterior maximization for linear models with sparsity-favouring prior distributions. The Bayesian posterior contains useful information far beyond its mode, which can be used to drive methods for sampling optimization (active learning), feature relevance ranking, or hyperparameter estimation, if only this representation of uncertainty can be approximated in a tractable manner. In this paper, we review recent results for variational sparse inference and show that they share underlying computational primitives. We discuss how sampling optimization can be implemented as sequential Bayesian experimental design. While there has been tremendous recent activity to develop sparse estimation, little attention has been given to sparse approximate inference. In this paper, we argue that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms. Moreover, since some variational inference methods have recently been given strong convex optimization characterizations, theoretical analysis may become possible, promising new insights into nonlinear experimental design.

  12. Delineamento experimental e tamanho de amostra para alface cultivada em hidroponia Experimental design and sample size for hydroponic lettuce crop

    Directory of Open Access Journals (Sweden)

    Valéria Schimitz Marodim

    2000-10-01

    This study was carried out to establish the experimental design and sample size for a hydroponic lettuce (Lactuca sativa) crop under the nutrient film technique (NFT). The experiment was conducted in the Soilless Crops/Hydroponics Laboratory of the Plant Science Department of the Federal University of Santa Maria and was based on plant weight data. Under hydroponic conditions on fibre-cement benches with six ducts, the most suitable experimental design for lettuce is randomised blocks when the experimental unit is a strip transverse to the bench ducts, and completely randomised when the whole bench is the experimental unit. For the plant weight variable, the sample size should be 40 plants for a half-width of the confidence interval, expressed as a percentage of the mean (d), equal to 5%, and 7 plants for d equal to 20%.

  13. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: we design data models, develop infrastructures for data sharing, and build tools for data analysis. Statistical datasets curated by National [...]

  14. Microrandomized trials: An experimental design for developing just-in-time adaptive interventions.

    Science.gov (United States)

    Klasnja, Predrag; Hekler, Eric B; Shiffman, Saul; Boruvka, Audrey; Almirall, Daniel; Tewari, Ambuj; Murphy, Susan A

    2015-12-01

    This article presents an experimental design, the microrandomized trial, developed to support optimization of just-in-time adaptive interventions (JITAIs). JITAIs are mHealth technologies that aim to deliver the right intervention components at the right times and locations to optimally support individuals' health behaviors. Microrandomized trials offer a way to optimize such interventions by enabling modeling of causal effects and time-varying effect moderation for individual intervention components within a JITAI. The article describes the microrandomized trial design, enumerates research questions that this experimental design can help answer, and provides an overview of the data analyses that can be used to assess the causal effects of studied intervention components and investigate time-varying moderation of those effects. Microrandomized trials enable causal modeling of proximal effects of the randomized intervention components and assessment of time-varying moderation of those effects. Microrandomized trials can help researchers understand whether their interventions are having intended effects, when and for whom they are effective, and what factors moderate the interventions' effects, enabling creation of more effective JITAIs. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
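
    The sketch below illustrates, with entirely simulated data, the core of a microrandomized trial: each participant is re-randomized at every decision point, and the proximal effect of the intervention component is estimated from the randomized contrasts. Real MRT analyses use weighted and centered regression models; the inverse-probability-weighted difference here is a simplification, and all names and numbers are invented.

```python
# Minimal micro-randomized trial sketch: per-decision-point randomization
# and a simple unbiased estimate of the proximal treatment effect.
import numpy as np

rng = np.random.default_rng(42)
n_people, n_points, p_treat = 50, 100, 0.5
A = rng.binomial(1, p_treat, size=(n_people, n_points))   # treatment indicator
# Hypothetical proximal outcome: a small positive effect that decays
# over the course of the study (habituation).
effect = 0.5 * np.exp(-np.arange(n_points) / 60.0)
Y = 1.0 + A * effect + rng.normal(0, 1, size=(n_people, n_points))

# Inverse-probability-weighted contrast; unbiased because the
# randomization probability p_treat is known by design.
est = (Y * A / p_treat - Y * (1 - A) / (1 - p_treat)).mean(axis=0)
print("estimated proximal effect, first 5 decision points:", est[:5].round(2))
```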

  15. Assessment of wood liquefaction in acidified ethylene glycol using experimental design methodology

    Energy Technology Data Exchange (ETDEWEB)

    Rezzoug, S.A. [Universite de La Rochelle, Lab. de Maitrise des Technologies Agro-Industrielles, La Rochelle, 17 (France); Capart, R. [Universite de Technologie de Compiegne, Dept. de Genie Chimique, Compiegne, 60 (France)

    2003-03-01

    The liquefaction of milled wood (Pinus pinaster) was carried out in ethylene glycol acidified with small quantities of H{sub 2}SO{sub 4} as catalyst. The purpose of this paper is to evaluate the influence on the liquefaction yield of three operating variables: the maximum temperature (150-280 deg C), the reaction time at maximum temperature (20-60 min) and the amount of added H{sub 2}SO{sub 4} (0-1.5% on dry wood). The individual effects, as well as the interactions between operating variables, are investigated using an experimental design methodology. From a Pareto chart, it appears that the most significant effects are clearly those of the maximum temperature and of the interaction between acidity and temperature. These effects can be verified graphically through response surfaces and contour line plots. From a regression analysis, the conversion rate of wood into liquid is expressed simply as a function of the operating variables by a polynomial containing quadratic terms. While a statistical model seems particularly appropriate for a complex, multi-component material such as wood, a kinetic model is nevertheless proposed for the liquefaction of micro-crystalline cellulose. This model accounts for the formation of a carbonaceous solid residue from the liquid product. Such an unwanted phenomenon obviously results in a lower yield of liquid product. (Author)
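
    A hedged sketch of the response-surface step described above: a quadratic polynomial links liquefaction yield to temperature, time and acid content. The data are fabricated and only the modelling pattern reflects the abstract.

```python
# Fit a full quadratic response surface for yield as a function of
# temperature T, time t and acid content a. All "measurements" are invented.
import numpy as np

rng = np.random.default_rng(3)
n = 30
T = rng.uniform(150, 280, n)      # max temperature, deg C
t = rng.uniform(20, 60, n)        # time at max temperature, min
a = rng.uniform(0, 1.5, n)        # H2SO4, % on dry wood
yield_ = (20 + 0.2 * T + 0.3 * t + 25 * a - 8 * a**2 + 0.05 * a * T
          + rng.normal(0, 2, n))  # invented "measured" yields

def design_matrix(T, t, a):
    # intercept, linear, two-factor interaction and square terms
    return np.column_stack([np.ones_like(T), T, t, a,
                            T * t, T * a, t * a,
                            T**2, t**2, a**2])

beta, *_ = np.linalg.lstsq(design_matrix(T, t, a), yield_, rcond=None)
names = ["1", "T", "t", "a", "T*t", "T*a", "t*a", "T^2", "t^2", "a^2"]
for name, b in zip(names, beta):
    print(f"{name:>4}: {b:+.4f}")
```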

  16. Indium recovery from acidic aqueous solutions by solvent extraction with D2EHPA: a statistical approach to the experimental design

    Directory of Open Access Journals (Sweden)

    Fortes M.C.B.

    2003-01-01

    This experimental work presents the optimization results for obtaining a solution of high indium concentration with minimum iron contamination by solvent extraction with D2EHPA solubilized in isoparaffin and exxsol. The variables studied in the extraction step were D2EHPA concentration, acidity of the aqueous phase and time of contact between phases. Different hydrochloric and sulfuric acid concentrations were studied for the stripping step. The optimum experimental conditions resulted in a solution with 99% indium extraction and less than 4% iron. The construction of a McCabe-Thiele diagram indicated two theoretical countercurrent stages for indium extraction and at least six stages for indium stripping. Finally, the influence of associated metals found in typical sulfate leach liquors from zinc plants was studied. Under the experimental conditions for maximum indium extraction, 96% indium extraction was obtained, iron extraction was about 4%, and no Ga, Cu or Zn was co-extracted.

  17. Design Methodology and Experimental Verification of Serpentine/Folded Waveguide TWTs

    Science.gov (United States)

    2016-03-17

    Index terms: folded waveguide (FW), oscillation, serpentine, stopband, traveling-wave tube (TWT), vacuum electronics. [Snippet from I. Introduction:] Development of high-power broadband vacuum electron devices (VEDs) beyond Ka-band using conventional coupled-cavity and helix traveling-wave tube (TWT) RF circuit fabrication techniques is [...]. [Snippet from III. G-band circuit design and experimental validation:] The primary motivation for the G-band amplifier was to develop a high-power broadband [...]

  18. Statistics is not enough: revisiting Ronald A. Fisher's critique (1936) of Mendel's experimental results (1866).

    Science.gov (United States)

    Pilpel, Avital

    2007-09-01

    This paper is concerned with the role of rational belief change theory in the philosophical understanding of experimental error. Today, philosophers seek insight about error in the investigation of specific experiments, rather than in general theories. Nevertheless, rational belief change theory adds to our understanding of just such cases, R. A. Fisher's criticism of Mendel's experiments being a case in point. After an historical introduction, the main part of this paper investigates Fisher's paper from the point of view of rational belief change theory: which changes of belief about Mendel's experiments Fisher goes through, and with what justification. This leads to surprising insights about what Fisher got right and wrong and, more generally, about the limits of statistical methods in detecting error.

  19. SOCR: Statistics Online Computational Resource

    OpenAIRE

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been made to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis [...]

  20. West Valley high-level nuclear waste glass development: a statistically designed mixture study

    Energy Technology Data Exchange (ETDEWEB)

    Chick, L.A.; Bowen, W.M.; Lokken, R.O.; Wald, J.W.; Bunnell, L.R.; Strachan, D.M.

    1984-10-01

    The first full-scale conversion of high-level commercial nuclear wastes to glass in the United States will be conducted at West Valley, New York, by West Valley Nuclear Services Company, Inc. (WVNS), for the US Department of Energy. Pacific Northwest Laboratory (PNL) is supporting WVNS in the design of the glass-making process and the chemical formulation of the glass. This report describes the statistically designed study performed by PNL to develop the glass composition recommended for use at West Valley. The recommended glass contains 28 wt% waste, as limited by process requirements. The waste loading and the silica content (45 wt%) are similar to those in previously developed waste glasses; however, the new formulation contains more calcium and less boron. A series of tests verified that the increased calcium results in improved chemical durability and does not adversely affect the other modeled properties. The optimization study assessed the effects of seven oxide components on glass properties. Over 100 melts combining the seven components into a wide variety of statistically chosen compositions were tested. Viscosity, electrical conductivity, thermal expansion, crystallinity, and chemical durability were measured and empirically modeled as a function of the glass composition. The mathematical models were then used to predict the optimum formulation. This glass was tested and adjusted to arrive at the final composition recommended for use at West Valley. 56 references, 49 figures, 18 tables.
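
    As a generic illustration of the statistically designed mixture approach (not the actual West Valley data or models), the sketch below fits a Scheffé quadratic mixture model for a property as a function of component fractions that sum to one; the compositions, property values and coefficients are all invented.

```python
# Scheffe quadratic mixture model: property = sum(b_i x_i) + sum(b_ij x_i x_j),
# fitted without an intercept because the fractions x_i sum to one.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n_comp, n_runs = 4, 25
comps = rng.dirichlet(np.ones(n_comp), size=n_runs)   # fractions sum to 1
prop = (comps @ np.array([2.0, 5.0, 3.0, 8.0])        # invented "property"
        + 4.0 * comps[:, 0] * comps[:, 1] + rng.normal(0, 0.1, n_runs))

pairs = list(combinations(range(n_comp), 2))
X = np.column_stack([comps] + [comps[:, i] * comps[:, j] for i, j in pairs])
beta, *_ = np.linalg.lstsq(X, prop, rcond=None)

# Predict the property for a candidate formulation.
cand = np.array([0.28, 0.45, 0.17, 0.10])
x_cand = np.concatenate([cand, [cand[i] * cand[j] for i, j in pairs]])
print("predicted property for candidate composition:", float(x_cand @ beta))
```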

  1. A new experimental design method to optimize formulations focusing on a lubricant for hydrophilic matrix tablets.

    Science.gov (United States)

    Choi, Du Hyung; Shin, Sangmun; Khoa Viet Truong, Nguyen; Jeong, Seong Hoon

    2012-09-01

    A robust experimental design method was developed, combining well-established response surface methodology and time series modeling, to facilitate the formulation development process for magnesium stearate incorporated into hydrophilic matrix tablets. Two directional analyses and a time-oriented model were utilized to optimize the experimental responses. Evaluations of tablet gelation and drug release were conducted with two factors, x₁ and x₂: a formulation factor (the amount of magnesium stearate) and a processing factor (mixing time). Moreover, different batch sizes (100 and 500 tablet batches) were also evaluated to investigate the effect of batch size. The selected input control factors were arranged in a mixture simplex lattice design with 13 experimental runs. The obtained optimal settings of magnesium stearate for gelation were 0.46 g with 2.76 min mixing time for a 100 tablet batch and 1.54 g with 6.51 min for a 500 tablet batch. The optimal settings for drug release were 0.33 g with 7.99 min for a 100 tablet batch and 1.54 g with 6.51 min for a 500 tablet batch. The exact ratio and mixing time of magnesium stearate could thus be set according to the desired hydrophilic matrix tablet properties. The newly designed experimental method provided very useful information for characterizing significant factors and hence for obtaining optimum formulations in a systematic and reliable way.

  2. Test for the statistical significance of differences between ROC curves

    International Nuclear Information System (INIS)

    Metz, C.E.; Kronman, H.B.

    1979-01-01

    A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. The performance of the statistic for data arising from a limited number of experimental trials was therefore evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions.
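
    A compact reconstruction of the test statistic described above, under the stated binormal assumption: given maximum likelihood estimates of the parameters (a, b) of two independent ROC curves and their covariance matrices, a two-degree-of-freedom chi-square statistic is formed from the parameter differences. The numerical values below are made up for illustration.

```python
# Two-df chi-square test for a difference between two binormal ROC curves:
# chi2 = delta^T (Cov1 + Cov2)^{-1} delta, with delta = (a1-a2, b1-b2).
import numpy as np
from scipy import stats

# Maximum-likelihood estimates for two independent ROC curves (invented)
theta1 = np.array([1.40, 0.90])      # (a, b) for curve 1
theta2 = np.array([1.10, 0.85])      # (a, b) for curve 2
cov1 = np.array([[0.020, 0.004], [0.004, 0.010]])
cov2 = np.array([[0.025, 0.005], [0.005, 0.012]])

delta = theta1 - theta2
chi2 = delta @ np.linalg.inv(cov1 + cov2) @ delta
p = stats.chi2.sf(chi2, df=2)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```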

  3. Lidar measurements of plume statistics

    DEFF Research Database (Denmark)

    Ejsing Jørgensen, Hans; Mikkelsen, T.

    1993-01-01

    [...] of measured crosswind concentration profiles, the following statistics were obtained: 1) mean profile, 2) root mean square profile, 3) fluctuation intensities, and 4) intermittency factors. Furthermore, some experimentally determined probability density functions (pdf's) of the fluctuations are presented. All the measured statistics are referred to a fixed and a 'moving' frame of reference, the latter being defined as a frame of reference from which the (low frequency) plume meander is removed. Finally, the measured statistics are compared with statistics on concentration fluctuations obtained with a simple puff [...]

  4. New synthetic thrombin inhibitors: molecular design and experimental verification.

    Science.gov (United States)

    Sinauridze, Elena I; Romanov, Alexey N; Gribkova, Irina V; Kondakova, Olga A; Surov, Stepan S; Gorbatenko, Aleksander S; Butylin, Andrey A; Monakov, Mikhail Yu; Bogolyubov, Alexey A; Kuznetsov, Yuryi V; Sulimov, Vladimir B; Ataullakhanov, Fazoyl I

    2011-01-01

    The development of new anticoagulants is an important goal for the improvement of thrombosis treatment. The aim of this work was the design, synthesis and experimental testing of new safe and effective small-molecule direct thrombin inhibitors for intravenous administration. Computer-aided molecular design of new thrombin inhibitors was performed using our original docking program SOL, which is based on a genetic algorithm of global energy minimization in the framework of a Merck Molecular Force Field and takes into account the effects of solvent. The designed molecules with the best scoring functions (calculated binding energies) were synthesized and their thrombin-inhibitory activity evaluated experimentally in vitro, using a chromogenic substrate in a buffer system and a thrombin generation test in isolated plasma, and in vivo, using a newly developed model of hemodilution-induced hypercoagulation in rats. The acute toxicities of the most promising new thrombin inhibitors were evaluated in mice, and their stabilities in aqueous solutions were measured. New compounds were discovered that are effective direct thrombin inhibitors (the best K(I) was 50 nM, with activity in the thrombin generation assay at approximately 100 nM). These compounds contain one of the following new residues as the basic fragment: isothiuronium, 4-aminopyridinium, or 2-aminothiazolinium. LD(50) values for the best new inhibitors ranged from 166.7 to >1111.1 mg/kg. A plasma-substituting solution supplemented with one of the new inhibitors prevented hypercoagulation in the rat model of hemodilution-induced hypercoagulation. Activities of the best new inhibitors in physiological saline (1 µM solutions) were stable after sterilization by autoclaving, and the inhibitors remained stable in long-term storage over more than 1.5 years at room temperature and at 4°C. The high efficacy, stability and low acute toxicity suggest that the developed inhibitors may be promising for medical applications.

  5. Experimental fusion power reactor conceptual design study. Final report. Volume III

    International Nuclear Information System (INIS)

    Baker, C.C.

    1976-12-01

    This document is the final report which describes the work carried out by General Atomic Company for the Electric Power Research Institute on a conceptual design study of a fusion experimental power reactor (EPR) and an overall EPR facility. The primary objective of the two-year program was to develop a conceptual design of an EPR that operates at ignition and produces continuous net power. A conceptual design was developed for a Doublet configuration based on indications that a noncircular tokamak offers the best potential of achieving a sufficiently high effective fuel containment to provide a viable reactor concept at reasonable cost. Other objectives included the development of a planning cost estimate and schedule for the plant and the identification of critical R and D programs required to support the physics development and engineering and construction of the EPR. This volume contains the following appendices: (1) tradeoff code analysis, (2) residual mode transport, (3) blanket/first wall design evaluations, (4) shielding design evaluation, (5) toroidal coil design evaluation, (6) E-coil design evaluation, (7) F-coil design evaluation, (8) plasma recycle system design evaluation, (9) primary coolant purification design evaluation, (10) power supply system design evaluation, (11) number of coolant loops, (12) power conversion system design evaluation, and (13) maintenance methods evaluation

  6. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and inter-institutional exchange among independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, combining C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response thresholds and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set and identify variables demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and of the software tool for analysis of large data sets.
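
    The sketch below illustrates the kind of screening step described in the abstract: patients are split at a candidate dose threshold and several tests (Fisher exact, Welch t, Kolmogorov-Smirnov) are combined to flag dose-response candidates. The data, the threshold and the decision rule are invented for illustration.

```python
# Screen a candidate dose metric for dose-response using a battery of
# statistical tests. All patient data here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 120
dose = rng.uniform(0, 60, n)                 # hypothetical dose metric, Gy
# Hypothetical toxicity: probability rises with dose (logistic model)
tox = rng.random(n) < 1 / (1 + np.exp(-(dose - 35) / 5))

threshold = 35.0
hi = dose >= threshold
table = [[np.sum(tox & hi), np.sum(~tox & hi)],
         [np.sum(tox & ~hi), np.sum(~tox & ~hi)]]

_, p_fisher = stats.fisher_exact(table)                               # counts
_, p_welch = stats.ttest_ind(dose[tox], dose[~tox], equal_var=False)  # means
_, p_ks = stats.ks_2samp(dose[tox], dose[~tox])                       # shapes
flagged = max(p_fisher, p_welch, p_ks) < 0.05   # conservative: all must agree
print(p_fisher, p_welch, p_ks, "dose-response candidate:", flagged)
```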

  7. Structural and compositional features of high-rise buildings: experimental design in Yekaterinburg

    Science.gov (United States)

    Yankovskaya, Yulia; Lobanov, Yuriy; Temnov, Vladimir

    2018-03-01

    The study looks at the specifics of high-rise development in Yekaterinburg. High-rise buildings are considered in the context of their historical development, structural features, and compositional and imaginative design techniques. The experience of Yekaterinburg architects in experimental design is reviewed and analyzed, and the main issues and prospects of high-rise development within Yekaterinburg's urban structure are studied. The most interesting and significant conceptual approaches to the structural and compositional arrangement of high-rise buildings are discussed.

  8. Optimization of instrumental neutron activation analysis method by means of 2k experimental design technique aiming the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials, with the aim of validating the analytical methods for future accreditation with the National Institute of Metrology, Quality and Technology (INMETRO). The 2{sup k} experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample-to-detector distance. The multi-element standard concentration (comparator standard), the sample mass and the irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods to be adopted in the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
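
    As an illustration of the 2{sup k} analysis mentioned above, the following sketch computes main and interaction effects for a hypothetical 2³ full factorial in decay time, counting time and sample-detector distance; the response values are fabricated and do not come from the study.

```python
# Classical 2^3 full factorial effect estimation: each effect equals the
# contrast (sum of coded level times response) divided by n/2.
import numpy as np
from itertools import product

# Coded levels (-1/+1) for factors A (decay time), B (counting time),
# C (sample-detector distance), in standard order.
runs = np.array(list(product([-1, 1], repeat=3)))
y = np.array([98.2, 99.1, 97.8, 99.5, 98.0, 99.3, 97.9, 99.8])  # invented

labels = ["A", "B", "C", "AB", "AC", "BC", "ABC"]
cols = [runs[:, 0], runs[:, 1], runs[:, 2],
        runs[:, 0] * runs[:, 1], runs[:, 0] * runs[:, 2],
        runs[:, 1] * runs[:, 2], runs[:, 0] * runs[:, 1] * runs[:, 2]]
for name, c in zip(labels, cols):
    effect = (c * y).sum() / (len(y) / 2)   # mean at +1 minus mean at -1
    print(f"effect {name:>3}: {effect:+.3f}")
```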

  9. The participation of the Experimental Design Factory of the Uranium Industry of Czechoslovakia in the design of a tunneling machine with disk bits

    Energy Technology Data Exchange (ETDEWEB)

    Kastner, P

    1983-01-01

    A tunneling machine, two prototypes of which were designed and built jointly on the basis of scientific and technical cooperation between the Experimental Design Factory of the Uranium Industry of Czechoslovakia and the VEB-Schachtbau enterprise (East Germany), is described. The experimental design operations were conducted under the methodological leadership of the Mine Construction in the Uranium Industry (Czechoslovakia) enterprise. The Experimental Design Factory developed the general design system for the machine and its individual subassemblies. The detailed technical documentation for the machine units was developed by both enterprises. Each enterprise made two sets of specific units and spare parts. The prototypes were assembled in both countries with the technical assistance of the producer enterprise of the appropriate subassembly. Industrial tests were conducted by each enterprise independently, with technical assistance and delivery of spare parts on the part of the producer enterprise. A machine designated VM 24-27 was used to drive more than 2,300 meters of water supply tunnel in East Germany in 1982, and a machine designated RS 24-27 (29) was used in Prague in the same year to drive approximately 1,400 meters of cable collectors. The machine is designed for driving round mine drifts with a diameter of 2.4 to 2.7 (2.9) meters to the full cross section in stable rock. Its overall length is 32.5 meters, and its total weight is 85 tons. The shift productivity was 9.55 meters. Since 1979, the Mine Construction in the Uranium Industry and the Experimental Design Factory of the Uranium Industry enterprises of Czechoslovakia have supplied disk bits for the TVM Demag tunnel drilling machines (West Germany) and for the RS 24-27 and HG 210 Wirth (West Germany) cross-cut drills.

  10. Robustness testing, using experimental design, of a flow-through dissolution method for a product where the actives have markedly differing solubility properties.

    Science.gov (United States)

    Bloomfield, M S; Butler, W C

    2000-09-25

    The use of experimental design for the robustness testing of a flow-through dissolution method (Ph Eur/USP Apparatus 4) for atovaquone, one of the drug substances in a dual-active anti-malarial tablet formulation, Malarone tablets, is described. This procedure was developed to overcome the suppression of the atovaquone solubility, caused by the presence of the co-drug proguanil hydrochloride and potential imprecision due to the poor solubility of the coating material in the basic dissolution media employed. For this testing a quarter fractional two-level factorial design was applied, assessing six factors in sixteen experiments, with a further six centre points to assess natural experimental variation. Results demonstrate that the method is robust to small changes in all the main factors evaluated at sample times of 30 min or greater. At 15 min, variations in the concentration of sodium hydroxide in the dissolution media, peristaltic pump speed and flow rate were assessed as statistically significant. This observation is a result of the initial steepness of the dissolution release curve and hence these factors are now controlled routinely in the method. Release of this poorly soluble drug is limited at the 45 min time point (Q=75%) according to pharmacopoeial guidelines. The approach may be applied for other dissolution procedures.

  11. Digital learning material for experimental design and model building in molecular biology

    NARCIS (Netherlands)

    Aegerter-Wilmsen, T.

    2005-01-01

    Designing experimental approaches is a major cognitive skill in molecular biology research, and building models, including quantitative ones, is a cognitive skill which is rapidly gaining importance. Since molecular biology education at university level is aimed at educating future researchers, we

  12. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  13. The Statistical Analysis of Relation between Compressive and Tensile/Flexural Strength of High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Kępniak M.

    2016-12-01

    This paper addresses the tensile and flexural strength of HPC (high performance concrete). The aim of the paper is to analyse the efficiency of the models proposed in different codes. In particular, three design procedures, from ACI 318 [1], Eurocode 2 [2] and the Model Code 2010 [3], are considered. The associations between the design tensile strength of concrete obtained from these three codes and compressive strength are compared with experimental results for tensile strength and flexural strength using statistical tools. Experimental tensile strengths were obtained from the splitting test. Based on this comparison, conclusions are drawn on the fit between the design methods and the test data. The comparison shows that the tensile strength and flexural strength of HPC depend on more factors than compressive strength alone.

  14. Preparation of methacrylic acid-modified rice husk improved by an experimental design and application for paraquat adsorption

    International Nuclear Information System (INIS)

    Hsu, Shih-Tong; Chen, Lung-Chuan; Lee, Cheng-Chieh; Pan, Ting-Chung; You, Bing-Xuan; Yan, Qi-Feng

    2009-01-01

    Methacrylic acid (MAA) grafted rice husk was synthesized using graft copolymerization with Fenton's reagent as the redox initiator and applied to the adsorption of paraquat. The highest grafting percentage of 44.3% was obtained using the traditional kinetic method. However, a maximum grafting percentage of 65.3% was calculated using the central composite design. Experimental results based on the recipes predicted from the statistical analysis are consistent with theoretical calculations. A representative polymethacrylic acid-g-rice husk (PMAA-g-rice husk) copolymer was hydrolyzed to a salt type and applied to the adsorption of paraquat. The adsorption equilibrium data correlate more closely with the Langmuir isotherm than with the Freundlich equation. The maximum adsorption capacity of modified rice husk is 292.5 mg/g-adsorbent. This value exceeds those for Fuller's earth and activated carbon, which are the most common binding agents used for paraquat. The samples at various stages were characterized by solid-state 13C NMR spectroscopy.
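
    The following sketch shows how the Langmuir and Freundlich isotherms reported above are typically fitted and compared; the equilibrium data points below are invented for illustration and are not those of the study.

```python
# Fit Langmuir and Freundlich isotherms to equilibrium adsorption data
# and compare goodness of fit via R^2.
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([5.0, 10, 25, 50, 100, 200, 400])      # mg/L at equilibrium
qe = np.array([45.0, 80, 150, 205, 250, 275, 288])   # mg/g adsorbed (invented)

def langmuir(C, qmax, K):
    return qmax * K * C / (1 + K * C)

def freundlich(C, Kf, n):
    return Kf * C ** (1 / n)

for name, f, p0 in [("Langmuir", langmuir, (300, 0.01)),
                    ("Freundlich", freundlich, (10, 2))]:
    popt, _ = curve_fit(f, Ce, qe, p0=p0)
    ss_res = np.sum((qe - f(Ce, *popt)) ** 2)
    ss_tot = np.sum((qe - qe.mean()) ** 2)
    print(f"{name}: params = {popt.round(4)}, R^2 = {1 - ss_res / ss_tot:.4f}")
```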

  15. Preparation of methacrylic acid-modified rice husk improved by an experimental design and application for paraquat adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, Shih-Tong, E-mail: shihtong@mail.ksu.edu.tw [Department of Polymer Materials, Kun Shan University, No. 949 Da-Wan Rd., Yung-Kang City, Tainan Hsien, Taiwan (China); Chen, Lung-Chuan, E-mail: lcchen@mail.ksu.edu.tw [Department of Polymer Materials, Kun Shan University, No. 949 Da-Wan Rd., Yung-Kang City, Tainan Hsien, Taiwan (China); Lee, Cheng-Chieh, E-mail: etmediagoing@yahoo.com.tw [Department of Environmental Engineering, Kun Shan University, No. 949 Da-Wan Rd., Yung-Kang City 710, Tainan Hsien, Taiwan (China); Pan, Ting-Chung, E-mail: tcpan@mail.ksu.edu.tw [Department of Environmental Engineering, Kun Shan University, No. 949 Da-Wan Rd., Yung-Kang City 710, Tainan Hsien, Taiwan (China); You, Bing-Xuan, E-mail: kp2681@yahoo.com.tw [Department of Polymer Materials, Kun Shan University, No. 949 Da-Wan Rd., Yung-Kang City, Tainan Hsien, Taiwan (China); Yan, Qi-Feng, E-mail: rsrs0938@yahoo.com.tw [Department of Polymer Materials, Kun Shan University, No. 949 Da-Wan Rd., Yung-Kang City, Tainan Hsien, Taiwan (China)

    2009-11-15

    Methacrylic acid (MAA) grafted rice husk was synthesized using graft copolymerization with Fenton's reagent as the redox initiator and applied to the adsorption of paraquat. The highest grafting percentage of 44.3% was obtained using the traditional kinetic method. However, a maximum grafting percentage of 65.3% was calculated using the central composite design. Experimental results based on the recipes predicted from the statistical analysis are consistent with theoretical calculations. A representative polymethacrylic acid-g-rice husk (PMAA-g-rice husk) copolymer was hydrolyzed to a salt type and applied to the adsorption of paraquat. The adsorption equilibrium data correlate more closely with the Langmuir isotherm than with the Freundlich equation. The maximum adsorption capacity of modified rice husk is 292.5 mg/g-adsorbent. This value exceeds those for Fuller's earth and activated carbon, which are the most common binding agents used for paraquat. The samples at various stages were characterized by solid-state {sup 13}C NMR spectroscopy.

  16. Preparation of methacrylic acid-modified rice husk improved by an experimental design and application for paraquat adsorption.

    Science.gov (United States)

    Hsu, Shih-Tong; Chen, Lung-Chuan; Lee, Cheng-Chieh; Pan, Ting-Chung; You, Bing-Xuan; Yan, Qi-Feng

    2009-11-15

    Methacrylic acid (MAA) grafted rice husk was synthesized using graft copolymerization with Fenton's reagent as the redox initiator and applied to the adsorption of paraquat. The highest grafting percentage of 44.3% was obtained using the traditional kinetic method. However, a maximum grafting percentage of 65.3% was calculated using the central composite design. Experimental results based on the recipes predicted from the statistical analysis are consistent with theoretical calculations. A representative polymethacrylic acid-g-rice husk (PMAA-g-rice husk) copolymer was hydrolyzed to a salt type and applied to the adsorption of paraquat. The adsorption equilibrium data correlate more closely with the Langmuir isotherm than with the Freundlich equation. The maximum adsorption capacity of modified rice husk is 292.5 mg/g-adsorbent. This value exceeds those for Fuller's earth and activated carbon, which are the most common binding agents used for paraquat. The samples at various stages were characterized by solid-state (13)C NMR spectroscopy.

  17. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy: the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research [...]

  18. Design and experimental investigation of a Multi-segment plate concentrated photovoltaic solar energy system

    International Nuclear Information System (INIS)

    Wang, Gang; Chen, Zeshao; Hu, Peng

    2017-01-01

    Highlights: • A multi-segment plate concentrated photovoltaic solar energy system was proposed. • A prototype of this new concentrator was developed for experimental investigation. • Experimental investigation results showed good concentrating uniformity. - Abstract: Solar energy is one of the most promising renewable energies and matters for the sustainable development of energy supply. A multi-segment plate concentrated photovoltaic (CPV) solar power system is proposed in this paper, and the design principle of its multi-segment plate concentrator, which can provide a uniform solar radiation flux density distribution on the solar cells, is given. A prototype of this multi-segment plate CPV solar power system was developed for an experimental study aimed at investigating the solar radiation flux density distribution and the PV performance under this concentrator design. The experimental results showed that the solar radiation flux density distribution provided by the multi-segment plate concentrator had good uniformity, and that the number and temperature of the solar cells both influence the photoelectric conversion efficiency of the CPV solar power system.

  19. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, the paired t test and the Chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of the calculated and tabulated values of t and χ2 showed the results to be in favour of the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.
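
    A small sketch of the hypothesis-testing workflow described above, with invented measurements: a paired t test comparing removal efficiencies at two process pH values, and a chi-square goodness-of-fit check of observed against predicted removals.

```python
# Hypothesis tests on adsorption data: paired t test and chi-square
# goodness of fit. All measurements below are fabricated.
import numpy as np
from scipy import stats

# Replicate removal efficiencies (%) measured at two candidate pH values
removal_pH5 = np.array([78.2, 80.1, 79.5, 77.9, 81.0])
removal_pH6 = np.array([85.4, 86.0, 84.7, 85.9, 86.3])

t_stat, p_paired = stats.ttest_rel(removal_pH5, removal_pH6)
print(f"paired t: t = {t_stat:.2f}, p = {p_paired:.4f}")

# Chi-square: do observed removals at several doses match predictions?
observed = np.array([52.0, 63.0, 71.0, 76.0])
expected = np.array([50.0, 62.0, 73.0, 77.0])
chi2 = np.sum((observed - expected) ** 2 / expected)
p_chi2 = stats.chi2.sf(chi2, df=len(observed) - 1)
print(f"chi-square = {chi2:.3f}, p = {p_chi2:.3f}")
```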

  20. An experimental method for designing the municipal solid waste biodrying

    International Nuclear Information System (INIS)

    Rada, E.C.; Politecnico Univ., Bucarest; Franzinelli, A.; Taiss, M.; Ragazzi, M.; Panaitescu, V.; Apostol, T.

    2005-01-01

    In the management of Municipal Solid Waste (MSW), in agreement with the new European directives concerning the valorization of materials and energy recovery, a recent approach based on a one-stream Biological Mechanical Treatment (BMT) is spreading as an alternative to the traditional two-stream approach. The bio-mechanical treatment of MSW is an increasingly common option, either as a pre-treatment before landfilling or as a pre-treatment before combustion. In the present paper an experimental method for designing the Municipal Solid Waste bio-drying process is proposed; that is, the paper deals with the energy recovery option. The aim is to provide design criteria for bio-drying plants independent of the patents available in the sector.

  1. Optimal statistical damage detection and classification in an experimental wind turbine blade using minimum instrumentation

    Science.gov (United States)

    Hoell, Simon; Omenzetter, Piotr

    2017-04-01

    The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples representing the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses, and for principal component analysis scores derived from PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring in WTBs.
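
    The sketch below illustrates the feature-extraction step described above: partial autocorrelation coefficients (PACCs) of a simulated vibration response serve as damage-sensitive features, and a simple diagonal-Gaussian distance compares a test state to a healthy baseline. The signal models and the indicator are invented simplifications of the paper's method.

```python
# PACC-based damage-sensitive features for vibration responses, compared
# against a healthy baseline with a diagonal Mahalanobis-like distance.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(8)

def response(ar):
    # AR(2) process standing in for a measured vibration response
    x = np.zeros(2000)
    for t in range(2, len(x)):
        x[t] = ar[0] * x[t - 1] + ar[1] * x[t - 2] + rng.normal()
    return x

def features(x):
    return pacf(x, nlags=6)[1:]      # drop the trivial lag-0 coefficient

healthy = np.array([features(response([0.6, -0.3])) for _ in range(30)])
test = features(response([0.5, -0.3]))   # slightly altered dynamics ("damage")

mu, sd = healthy.mean(axis=0), healthy.std(axis=0, ddof=1)
d2 = np.sum(((test - mu) / sd) ** 2)     # squared standardized distance
print(f"damage indicator = {d2:.1f} (large values suggest a state change)")
```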

  2. Experimental system design of liquid lithium-lead alloy bubbler for DFLL-TBM

    International Nuclear Information System (INIS)

    Xie Bo; Li Junge; Xu Shaomei; Weng Kuiping

    2011-01-01

    The liquid lithium-lead alloy bubbler is a very important component of the tritium unit of the Chinese Dual-Functional Lithium Lead Test Blanket Module (DFLL-TBM). In order to support the construction and operation of the bubbler experimental system, the overall design of the system, the main circuit design and the auxiliary system design have been proposed on the basis of theoretical calculations of the interaction of hydrogen isotopes with lithium-lead alloy and of experiments on hydrogen extraction from the liquid alloy by bubbling with a rotational jet nozzle. The key element of the design is the gas-liquid exchange packed column, which achieves the measurement and extraction of hydrogen isotopes from the liquid lithium-lead alloy. (authors)

  3. Design of experiments for test of fuel element reliability

    International Nuclear Information System (INIS)

    Boehmert, J.; Juettner, C.; Linek, J.

    1989-01-01

    Changes of fuel element design and modifications of the operational conditions have to be tested in experiments and pilot projects for nuclear safety. Experimental design is a useful statistical method for minimizing the costs and risks of this procedure. The main problem of our work was to investigate the connection between the failure rate of fuel elements, sample size, confidence interval, and error probability. Using the statistical model of the binomial distribution, appropriate relations were derived and discussed. A stepwise procedure based on a modified sequential analysis according to Wald was developed as an introduction strategy for modifications of the fuel element design and of the operational conditions. (author)
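
    Two small computations in the spirit of the abstract (not the authors' actual derivations): the binomial sample size needed to demonstrate a failure rate at a given confidence level, and a Wald-type sequential probability ratio test for a fuel-element failure probability. All rates and risk levels are illustrative.

```python
# Binomial reliability demonstration and Wald's sequential probability
# ratio test (SPRT). Numbers are illustrative, not from the report.
import math

def zero_failure_sample_size(p_max, confidence):
    # Smallest n such that P(0 failures | p = p_max) <= 1 - confidence
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_max))

print(zero_failure_sample_size(0.01, 0.95))   # -> 299 elements, no failures

def sprt_decision(failures, n, p0=0.005, p1=0.02, alpha=0.05, beta=0.10):
    # Wald SPRT: accept H0 (rate p0), reject in favour of p1, or keep testing
    llr = (failures * math.log(p1 / p0)
           + (n - failures) * math.log((1 - p1) / (1 - p0)))
    if llr >= math.log((1 - beta) / alpha):
        return "reject design (failure rate too high)"
    if llr <= math.log(beta / (1 - alpha)):
        return "accept design"
    return "continue testing"

print(sprt_decision(failures=1, n=400))
```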

  4. Effect of non-normality on test statistics for one-way independent groups designs.

    Science.gov (United States)

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
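
    A hedged sketch of a parametric bootstrap test of equal means under heterogeneous variances, simpler than the Krishnamoorthy-Lu-Mathew statistic discussed above: a variance-weighted between-group sum of squares is the test statistic, and its null distribution is simulated from normal models with the estimated group variances. The data are invented.

```python
# Parametric bootstrap test for equal group means with unequal variances.
import numpy as np

rng = np.random.default_rng(5)
groups = [rng.normal(0.0, 1.0, 15),
          rng.normal(0.0, 3.0, 12),
          rng.normal(1.0, 0.5, 20)]     # third group mean actually differs

def weighted_between_ss(samples):
    # Between-group sum of squares weighted by n_i / s_i^2
    w = np.array([len(g) / g.var(ddof=1) for g in samples])
    m = np.array([g.mean() for g in samples])
    grand = (w * m).sum() / w.sum()
    return (w * (m - grand) ** 2).sum()

t_obs = weighted_between_ss(groups)
ns = [len(g) for g in groups]
sds = [g.std(ddof=1) for g in groups]

B = 5000
t_null = np.empty(B)
for b in range(B):
    # Simulate under H0 (equal means) with the estimated group variances
    boot = [rng.normal(0.0, sd, n) for sd, n in zip(sds, ns)]
    t_null[b] = weighted_between_ss(boot)
p = (t_null >= t_obs).mean()
print(f"observed statistic = {t_obs:.2f}, bootstrap p = {p:.4f}")
```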

  5. Lifetime statistics of quantum chaos studied by a multiscale analysis

    KAUST Repository

    Di Falco, A.

    2012-04-30

    In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.

  6. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  7. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  8. Design and implementation of new design of numerical experiments for non linear models

    International Nuclear Information System (INIS)

    Gazut, St.

    2007-03-01

    This thesis addresses the problem of the construction of surrogate models in numerical simulation. Whenever numerical experiments are costly and the simulation model is complex and difficult to use, it is important to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation, where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and re-sampling techniques. The surrogate models are constructed using neural networks, and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control over-fitting and extend the concept of leverage to surrogate models that are nonlinear in their parameters. The thesis describes an iterative method called LDR, for Learner Disagreement from experiment Re-sampling, based on active learning using several surrogate models constructed on bootstrap samples. The method consists of adding new experiments where the predictors constructed from bootstrap samples disagree most. The LDR method is compared with other methods of experimental design such as D-optimal selection. (author)
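
    A minimal sketch of the LDR idea as described in the abstract: a committee of surrogate models is fitted to bootstrap resamples of the experiments run so far, and the next experiment is placed where the committee disagrees most. Cubic polynomials stand in for the neural networks of the thesis, and the "simulator" is an invented toy function.

```python
# LDR-style active learning: bootstrap a committee of surrogates and add
# the next experiment where their predictions disagree most.
import numpy as np

rng = np.random.default_rng(2)

def simulator(x):                       # stands in for a costly simulation
    return np.sin(3 * x) + 0.1 * x**2

X = rng.uniform(-2, 2, 8)               # experiments run so far
y = simulator(X)
candidates = np.linspace(-2, 2, 200)

for it in range(5):
    preds = []
    for _ in range(20):                 # committee of bootstrap surrogates
        idx = rng.integers(0, len(X), len(X))
        coeffs = np.polyfit(X[idx], y[idx], deg=3)
        preds.append(np.polyval(coeffs, candidates))
    disagreement = np.std(preds, axis=0)
    x_new = candidates[np.argmax(disagreement)]   # most contested input
    X = np.append(X, x_new)
    y = np.append(y, simulator(x_new))
    print(f"iteration {it}: new experiment at x = {x_new:+.3f}")
```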

  9. Sensitivity analysis by experimental design and metamodelling : case study on simulation in national animal disease control

    NARCIS (Netherlands)

    Vonk Noordegraaf, A.; Nielen, M.; Kleijnen, J.P.C.

    2003-01-01

    Simulation is a frequently applied tool in the discipline of animal health economics. Application of sensitivity analysis, however, is often limited to changing only one factor at a time (OAT designs). In this study, the statistical techniques of Design of Experiments (DOE) and regression

  10. Increased performance in a bottom-up designed robot by experimentally guided redesign

    DEFF Research Database (Denmark)

    Larsen, Jørgen Christian

    2013-01-01

    Purpose – Using a bottom-up, model-free approach when building robots is often seen as less scientific than a top-down, model-based approach, because the results are not easily generalizable to other systems. The authors, however, hypothesize that this problem may be addressed by using solid experimental methods. The purpose of this paper is to show how well-known experimental methods from bio-mechanics can be used to measure and locate weaknesses in a bottom-up, model-free implementation of a quadruped walker and to come up with a better solution. Design/methodology/approach – To study the bottom-up, model-free approach, the authors used the robotic construction kit LocoKit. This construction kit allows researchers to construct legged robots without having a mathematical model beforehand. The authors used no specific mathematical model to design the robot, but instead used intuition.

  11. Modeling of asphalt-rubber rotational viscosity by statistical analysis and neural networks

    Directory of Open Access Journals (Sweden)

    Luciano Pivoto Specht

    2007-03-01

    It is of great importance to know a binder's viscosity in order to carry out handling, mixing and application processes and the compaction of asphalt mixes in highway surfacing. This paper presents the results of viscosity measurements on asphalt-rubber binders prepared in the laboratory. The binders were prepared varying the rubber content, rubber particle size, and duration and temperature of mixing, all following a statistical design plan. Statistical analysis and artificial neural networks were used to create mathematical models for predicting the binder viscosity. A comparison between the experimental data and the results simulated with the generated models showed better performance for the neural network models than for the statistical models. The results indicated that the rubber content and the duration of mixing have the greatest influence on the observed viscosity over the considered range of parameter variation.

  12. Signal processing and statistical analysis of spaced-based measurements

    International Nuclear Information System (INIS)

    Iranpour, K.

    1996-05-01

    The report deals with data obtained by the ROSE rocket project. This project was designed to investigate the low-altitude auroral instabilities in the electrojet region. The spectral and statistical analyses indicate the existence of unstable waves in the ionized gas in the region. An experimentally obtained dispersion relation for these waves was established. It was demonstrated that the characteristic phase velocities are much lower than expected from standard theoretical results. The analysis of the ROSE data indicates a cascading of energy from lower to higher frequencies. 44 refs., 54 figs

  13. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base the safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a second-order response surface equation with a second-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis, thereby including transient interactions and trip uncertainties in the MDNBR probability density.
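
    As an illustration of the idea (not the actual LOFT model), the sketch below propagates input uncertainties through a hypothetical second-order response surface for MDNBR and estimates a DNB probability. Monte Carlo sampling stands in for the analytic second-order error propagation used in the report, and all coefficients and limits are invented.

```python
# Propagate input uncertainties through a second-order response surface
# to obtain an MDNBR probability density and a DNB probability estimate.
import numpy as np

rng = np.random.default_rng(13)
n = 200_000
# Standardized uncertain inputs (e.g. power, flow, trip setpoint errors)
x1, x2, x3 = rng.normal(0, 1, (3, n))

# Hypothetical second-order response surface for MDNBR
mdnbr = (1.45 - 0.08 * x1 - 0.05 * x2 + 0.03 * x3
         + 0.01 * x1**2 + 0.02 * x1 * x2)

limit = 1.30                      # illustrative DNBR correlation limit
print(f"mean MDNBR = {mdnbr.mean():.3f}")
print(f"P(MDNBR < {limit}) = {(mdnbr < limit).mean():.4f}")
```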

  14. Design and application on experimental platform for high-speed bearing with grease lubrication

    Directory of Open Access Journals (Sweden)

    He Qiang

    2015-12-01

    Full Text Available An experimental platform for high-speed grease-lubricated bearings is an important tool for the research and development of high-speed motorized spindles with grease lubrication. In this article, such an experimental platform is designed and manufactured; it consists of the drive system, the test section, the loading system, the lubrication system, the control system, and so on. High-speed angular-contact ceramic ball bearings (B7005C/HQ1P4) are taken as the research object and tested under both grease lubrication and oil-mist lubrication. The performance of the experimental platform is validated by these contrast experiments, and the performance of the high-speed lubricated bearings is also studied, in particular the relationship among rotating speed, load, and temperature rise. The results show that the experimental platform works stably, accurately, and reliably in experimental testing, and that the grease-lubricated ceramic ball bearings B7005C/HQ1P4 can be used in high-speed motorized spindles under circulating-water cooling conditions when the rotating speed is lower than 40,000 r/min or the DN value (the bearing diameter times the rotating speed) is lower than 1.44 × 10⁶ mm·r/min. Using grease lubrication instead of oil-mist lubrication at high rotating speeds simplifies the structural design of the high-speed motorized spindle and reduces pollution of the environment.
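
    For orientation, the DN criterion quoted above is a simple product, sketched below. The bearing dimensions are assumed from the standard 7005 series (25 mm bore, 47 mm outside diameter); the quoted limit of 1.44 × 10⁶ mm·r/min at 40,000 r/min is consistent with DN computed on the mean bearing diameter, which is the interpretation adopted here.

```python
# Hedged sketch of the DN check quoted above. Dimensions for a 7005-series
# bearing are assumed; DN is computed on the mean diameter, which is what
# makes the abstract's 40,000 r/min and 1.44e6 limit agree.
BORE_MM, OD_MM = 25.0, 47.0
MEAN_DIAMETER_MM = (BORE_MM + OD_MM) / 2  # 36 mm
DN_LIMIT = 1.44e6  # mm*r/min, from the abstract

def dn_value(diameter_mm: float, speed_rpm: float) -> float:
    """DN = bearing diameter (mm) times rotating speed (r/min)."""
    return diameter_mm * speed_rpm

for rpm in (20_000, 40_000, 60_000):
    dn = dn_value(MEAN_DIAMETER_MM, rpm)
    verdict = "within" if dn <= DN_LIMIT else "beyond"
    print(f"{rpm:>6} r/min -> DN = {dn:.2e} mm*r/min ({verdict} the limit)")
```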

  15. Conceptual design of superconducting magnet systems for the Argonne Tokamak Experimental Power Reactor

    International Nuclear Information System (INIS)

    Wang, S.T.; Turner, L.R.; Mills, F.E.; DeMichele, D.W.; Smelser, P.; Kim, S.H.

    1976-01-01

    As an integral effort in the Argonne Tokamak Experimental Power Reactor Conceptual Design, the conceptual design of a 10-tesla, pure-tension superconducting toroidal-field (TF) coil system has been developed in sufficient detail to define a realistic TF coil system that could be built, based upon the current state of technology, with minimum technological extrapolation. A conceptual design study of the superconducting ohmic-heating (OH) coils and the superconducting equilibrium-field (EF) coils was also completed. These conceptual designs are developed in sufficient detail, with clear information on high-current ac conductor design, cooling, venting provisions, coil structural support, and zero-loss poloidal-coil cryostat design. Also investigated is the EF penetration into the blanket and shield.

  16. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    Science.gov (United States)

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval: 10–26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30–49%) of studies a different analysis should have been undertaken and in 17% (10–26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
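
    For readers unfamiliar with the intervals quoted above, the sketch below reproduces the "17% (95% CI 10–26%)" style of result for 17 events out of 100 papers. An exact (Clopper-Pearson) interval matches the reported bounds; that this is the method the authors used is an assumption.

```python
# Hedged sketch: exact binomial confidence interval for 17/100 papers.
# Clopper-Pearson ("beta") reproduces the bounds quoted in the abstract.
from statsmodels.stats.proportion import proportion_confint

count, nobs = 17, 100
low, high = proportion_confint(count, nobs, alpha=0.05, method="beta")
print(f"{count / nobs:.0%} (95% CI {low:.0%}-{high:.0%})")  # ~17% (10%-26%)
```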

  17. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals

    Directory of Open Access Journals (Sweden)

    Parsons Nick R

    2012-04-01

    Full Text Available Background: The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. Methods: A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. Results: The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval: 10–26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30–49%) of studies a different analysis should have been undertaken and in 17% (10–26%) a different analysis could have made a difference to the overall conclusions. Conclusion: It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.

  18. QuantifyMe: An Open-Source Automated Single-Case Experimental Design Platform

    Directory of Open Access Journals (Sweden)

    Sara Taylor

    2018-04-01

    Full Text Available Smartphones and wearable sensors have enabled unprecedented data collection, with many products now providing feedback to users about recommended step counts or sleep durations. However, these recommendations do not provide personalized insights that have been shown to be best suited for a specific individual. A scientific way to find individualized recommendations and causal links is to conduct experiments using single-case experimental design; however, properly designed single-case experiments are not easy to conduct on oneself. We designed, developed, and evaluated a novel platform, QuantifyMe, for novice self-experimenters to conduct proper-methodology single-case self-experiments in an automated and scientific manner using their smartphones. We provide software for the platform that we used (available for free on GitHub), which provides the methodological elements to run many kinds of customized studies. In this work, we evaluate its use with four different kinds of personalized investigations, examining how variables such as sleep duration and regularity, activity, and leisure time affect personal happiness, stress, productivity, and sleep efficiency. We conducted a six-week pilot study (N = 13) to evaluate QuantifyMe. We describe the lessons learned developing the platform and recommendations for its improvement, as well as its potential for enabling personalized insights to be scientifically evaluated in many individuals, reducing the high administrative cost for advancing human health and wellbeing.
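
    A minimal sketch of the statistical core of a single-case (N-of-1) self-experiment of the kind QuantifyMe automates: outcomes collected under two conditions are compared with a randomization (permutation) test. The data are invented, and the platform's actual analysis may differ.

```python
# Hedged sketch: randomization test for a single-case experiment comparing
# a daily outcome (e.g., self-reported stress, lower is better) across two
# conditions (A: baseline sleep, B: extended sleep). Data are invented.
import numpy as np

rng = np.random.default_rng(3)
a = np.array([6.1, 5.8, 6.4, 6.0, 5.9, 6.3, 6.2])  # condition A days
b = np.array([5.2, 5.6, 5.1, 5.4, 5.7, 5.0, 5.3])  # condition B days

observed = a.mean() - b.mean()
pooled = np.concatenate([a, b])
n_a = len(a)

# Permute condition labels to build the null distribution of the difference.
perm = np.empty(10_000)
for i in range(perm.size):
    s = rng.permutation(pooled)
    perm[i] = s[:n_a].mean() - s[n_a:].mean()

p_value = np.mean(np.abs(perm) >= abs(observed))
print(f"mean difference = {observed:.2f}, permutation p = {p_value:.4f}")
```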

  19. QuantifyMe: An Open-Source Automated Single-Case Experimental Design Platform.

    Science.gov (United States)

    Taylor, Sara; Sano, Akane; Ferguson, Craig; Mohan, Akshay; Picard, Rosalind W

    2018-04-05

    Smartphones and wearable sensors have enabled unprecedented data collection, with many products now providing feedback to users about recommended step counts or sleep durations. However, these recommendations do not provide personalized insights that have been shown to be best suited for a specific individual. A scientific way to find individualized recommendations and causal links is to conduct experiments using single-case experimental design; however, properly designed single-case experiments are not easy to conduct on oneself. We designed, developed, and evaluated a novel platform, QuantifyMe, for novice self-experimenters to conduct proper-methodology single-case self-experiments in an automated and scientific manner using their smartphones. We provide software for the platform that we used (available for free on GitHub), which provides the methodological elements to run many kinds of customized studies. In this work, we evaluate its use with four different kinds of personalized investigations, examining how variables such as sleep duration and regularity, activity, and leisure time affect personal happiness, stress, productivity, and sleep efficiency. We conducted a six-week pilot study (N = 13) to evaluate QuantifyMe. We describe the lessons learned developing the platform and recommendations for its improvement, as well as its potential for enabling personalized insights to be scientifically evaluated in many individuals, reducing the high administrative cost for advancing human health and wellbeing.

  20. EXPERIMENTAL RESEARCH REGARDING LEATHER APPLICATIONS IN PRODUCT DESIGN

    Directory of Open Access Journals (Sweden)

    PRALEA Jeni

    2015-05-01

    Full Text Available This paper presents the role and importance of experimental research in design activity. The designer, as a researcher and project manager, aims to establish a relationship between functional, aesthetic, constructive, technological, and economic aspects, based on the aesthetic possibilities of the materials used for the experiments. With the aim of identifying areas of application for the leather waste that results from the production process, the paper presents experiments conducted with this material in combination with wood, using different techniques that lead to different aesthetic effects. Identifying the areas of use and creating products from leather and/or wood waste is based on the properties of these materials. Leather, the subject of these experiments, has the advantage that it can be used on both sides. The tactile difference between the two sides of this material has both aesthetic and functional advantages, which makes it suitable for application in products that meet the requirements of "design for all". With differentiated tactile characteristics, and in combination with other materials (here, wood), products that can easily be "read by touch" can be generated to help people with certain disabilities. Thus, the experiments presented in this paper allow the establishment of aesthetic schemes applicable to products that are friendly both to the environment (being based on the reuse of wood and leather waste) and to users (they can be used in applications, accessories, and product concepts for people with certain disabilities). The designer's choices and decisions can be based on the results of this experiment. The experiment enables the designer to develop creative, innovative, and environmentally friendly products.