WorldWideScience

Sample records for experimental design statistical

  1. Fundamentals of statistical experimental design and analysis

    CERN Document Server

    Easterling, Robert G

    2015-01-01

    Professionals in all areas - business; government; the physical, life, and social sciences; engineering; medicine, etc. - benefit from using statistical experimental design to better understand their worlds and then use that understanding to improve the products, processes, and programs they are responsible for. This book aims to provide the practitioners of tomorrow with a memorable, easy-to-read, engaging guide to statistics and experimental design. This book uses examples, drawn from a variety of established texts, and embeds them in a business or scientific context, seasoned with a dash of humor, to emphasize the issues and ideas that led to the experiment and the what-do-we-do-next? steps after the experiment. Graphical data displays are emphasized as means of discovery and communication, and formulas are minimized, with a focus on interpreting the results that software produces. The role of subject-matter knowledge, and passion, is also illustrated. The examples do not require specialized knowledge, and t...

  2. Experimental toxicology: Issues of statistics, experimental design, and replication.

    Science.gov (United States)

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
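
    As an illustration of two of these recommendations (randomization and blind data collection), here is a minimal Python sketch; it is not from the paper, and the group names and sample size are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)                   # fixed seed: reproducible allocation
    subjects = [f"animal_{i:02d}" for i in range(20)]
    rng.shuffle(subjects)                             # randomize before assignment

    groups = {"control": subjects[:10], "treated": subjects[10:]}

    # Blinding: hand the experimenter coded IDs only; keep the code->group
    # key sealed until the data have been collected.
    blind_codes = {s: f"sample_{k:02d}" for k, s in enumerate(rng.permutation(subjects))}
    ```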

  3. Experimental design matters for statistical analysis

    DEFF Research Database (Denmark)

    Jensen, Signe Marie; Schaarschmidt, Frank; Onofri, Andrea

    2018-01-01

    BACKGROUND: Nowadays, the evaluation of effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analyzing data. Two data examples were analyzed using different modelling strategies: Firstly, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Secondly, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. RESULTS: It was shown that results from sub...
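
    One modelling strategy of the kind compared in this paper is a mixed model that encodes the block structure explicitly. A hedged Python sketch (column names and values are assumed, not the paper's data):

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy randomized-complete-block data: one herbicide treatment vs control in 4 blocks.
    df = pd.DataFrame({
        "height":    [112.0, 98.5, 120.3, 101.2, 115.8, 96.7, 118.9, 99.4],
        "treatment": ["herbicide", "control"] * 4,
        "block":     ["b1", "b1", "b2", "b2", "b3", "b3", "b4", "b4"],
    })

    # A random intercept per block respects the design; dropping `groups=`
    # would be the design-ignoring analysis the paper warns against.
    result = smf.mixedlm("height ~ treatment", data=df, groups=df["block"]).fit()
    print(result.summary())
    ```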

  4. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), yield usually high-dimensional data sets of expression values for hundreds or thousands of proteins which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to receive tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels as well as methods for data preprocessing are covered.

  5. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  6. Statistical evaluation of SAGE libraries: consequences for experimental design

    NARCIS (Netherlands)

    Ruijter, Jan M.; van Kampen, Antoine H. C.; Baas, Frank

    2002-01-01

    Since the introduction of serial analysis of gene expression (SAGE) as a method to quantitatively analyze the differential expression of genes, several statistical tests have been published for the pairwise comparison of SAGE libraries. Testing the difference between the number of specific tags

  7. Statistical Approaches in Analysis of Variance: from Random Arrangements to Latin Square Experimental Design

    OpenAIRE

    Radu E. SESTRAŞ; Lorentz JÄNTSCHI; Sorana D. BOLBOACĂ

    2009-01-01

    Background: The choice of experimental design as well as of statistical analysis is of huge importance in field experiments. These need to be chosen correctly in order to obtain the best possible precision of the results. The random arrangements, randomized blocks and Latin square designs were reviewed and analyzed from the statistical perspective of error analysis. Material and Method: Random arrangements, randomized block and Latin square experimental designs were used as field experiments. ...
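
    For illustration, a minimal Python sketch of the Latin square layout reviewed here (standard cyclic construction with randomized rows and columns; not code from the paper):

    ```python
    import numpy as np

    def latin_square(treatments, seed=0):
        """Each treatment appears exactly once in every row and column."""
        n = len(treatments)
        base = np.array([[(i + j) % n for j in range(n)] for i in range(n)])
        rng = np.random.default_rng(seed)
        base = base[rng.permutation(n), :][:, rng.permutation(n)]  # randomize layout
        return np.array(treatments)[base]

    print(latin_square(["A", "B", "C", "D"]))
    ```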

  8. Optimization of the synthesis of 1-allyloxy-2-hydroxy-propyl-starch through statistical experimental design

    NARCIS (Netherlands)

    Huijbrechts, A.M.L.; Vermonden, T.; Bogaert, P.; Franssen, M.C.R.; Visser, G.M.; Boeriu, C.G.; Sudhölter, E.J.R.

    2009-01-01

    The synthesis of 1-allyloxy-2-hydroxy-propyl starches was studied using a statistical experimental design approach. The etherification of two different granular maize starches with allyl glycidyl ether (AGE) in a heterogeneous alkaline suspension was investigated. The optimal reaction conditions

  9. Optimization of polyvinylidene fluoride (PVDF) membrane fabrication for protein binding using statistical experimental design.

    Science.gov (United States)

    Ahmad, A L; Ideris, N; Ooi, B S; Low, S C; Ismail, A

    2016-01-01

    Statistical experimental design was employed to optimize the preparation conditions of polyvinylidene fluoride (PVDF) membranes. Three variables considered were polymer concentration, dissolving temperature, and casting thickness, whereby the response variable was membrane-protein binding. The optimum preparation for the PVDF membrane was a polymer concentration of 16.55 wt%, a dissolving temperature of 27.5°C, and a casting thickness of 450 µm. The statistical model exhibits a deviation between the predicted and actual responses of less than 5%. Further characterization of the formed PVDF membrane showed that the morphology of the membrane was in line with the membrane-protein binding performance.

  10. Didanosine extended-release matrix tablets: optimization of formulation variables using statistical experimental design.

    Science.gov (United States)

    Sánchez-Lafuente, Carla; Furlanetto, Sandra; Fernández-Arévalo, Mercedes; Alvarez-Fuentes, Josefa; Rabasco, Antonio M; Faucci, M Teresa; Pinzauti, Sergio; Mura, Paola

    2002-04-26

    Statistical experimental design was applied to evaluate the influence of some process and formulation variables and possible interactions among such variables, on didanosine release from directly-compressed matrix tablets based on blends of two insoluble polymers, Eudragit RS-PM and Ethocel 100, with the final goal of drug release behavior optimization. The considered responses were the percent of drug released at three determined times, the dissolution efficiency at 6 h and the time to dissolve 10% of drug. Four independent variables were considered: tablet compression force, ratio between the polymers and their particle size, and drug content. The preliminary screening step, carried out by means of a 12-run asymmetric screening matrix according to a D-optimal design strategy, allowed evaluation of the effects of different levels of each variable. The drug content and the polymers ratio had the most important effect on drug release, which, moreover, was favored by greater polymer particle size; on the contrary, the compression force did not have a significant effect. The Doehlert design was then applied for a response-surface study, in order to study in depth the effects of the most important variables. The desirability function was used to simultaneously optimize the five considered responses, each having a different target. This procedure allowed selection, in the studied experimental domain, of the best formulation conditions to optimize drug release rate. The experimental values obtained from the optimized formulation highly agreed with the predicted values. The results demonstrated the reliability of the model in the preparation of extended-release matrix tablets with predictable drug release profiles.
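
    The desirability-function step described above can be sketched as follows; this is a hedged illustration in Python that reduces the paper's five responses to two, with invented targets and response values:

    ```python
    import numpy as np

    def d_maximize(y, low, high):
        """Desirability of a response to be maximized: 0 below `low`, 1 above `high`."""
        return float(np.clip((y - low) / (high - low), 0.0, 1.0))

    def d_minimize(y, low, high):
        """Desirability of a response to be minimized: 1 below `low`, 0 above `high`."""
        return float(np.clip((high - y) / (high - low), 0.0, 1.0))

    def overall_desirability(ds):
        ds = np.asarray(ds)
        return ds.prod() ** (1.0 / ds.size)   # geometric mean: 0 if any response fails

    # e.g. percent released at 6 h should be high, time to 10% dissolved should be low
    D = overall_desirability([d_maximize(78.0, 60.0, 90.0),
                              d_minimize(12.0, 10.0, 30.0)])
    print(f"overall desirability: {D:.3f}")
    ```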

  11. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    Science.gov (United States)

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  12. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    Science.gov (United States)

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  13. Tribological behaviour and statistical experimental design of sintered iron-copper based composites

    Science.gov (United States)

    Popescu, Ileana Nicoleta; Ghiţă, Constantin; Bratu, Vasile; Palacios Navarro, Guillermo

    2013-11-01

    Sintered iron-copper based composites for automotive brake pads have a complex composition and should have good physical, mechanical and tribological characteristics. In this paper, we obtained frictional composites by the Powder Metallurgy (P/M) technique and characterized them from a microstructural and tribological point of view. The morphology of the raw powders was determined by SEM, and the surfaces of the obtained sintered friction materials were analyzed by ESEM and EDS elemental and compo-image analyses. One lot of samples was tested on a "pin-on-disc" type wear machine under dry sliding conditions, at applied loads between 3.5 and 11.5 × 10⁻¹ MPa and relative speeds in the braking point between 12.5 and 16.9 m/s, at constant temperature. The other lot of samples was tested on an inertial test stand according to a methodology simulating the real conditions of dry friction, at a contact pressure of 2.5-3 MPa, at 300-1200 rpm. The most important characteristics required for sintered friction materials are a high and stable friction coefficient during braking; for high durability in service, they must also have low wear, high corrosion resistance, high thermal conductivity, mechanical resistance and thermal stability at elevated temperature. Because of the importance of the tribological characteristics (wear rate and friction coefficient) of sintered iron-copper based composites, we predicted the tribological behaviour through statistical analysis. For the first lot of samples, the response variables Yi (represented by the wear rate and friction coefficient) were correlated with x1 and x2 (the coded values of applied load and relative speed in the braking point, respectively) using a linear factorial design approach. We obtained brake friction materials with improved wear resistance characteristics and high and stable friction coefficients. It has been shown, through experimental data and the obtained linear regression equations, that the sintered composites wear rate increases
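
    A minimal sketch of the linear factorial approach described above, with coded load (x1) and speed (x2) regressed on wear rate; the response values here are invented, not the paper's measurements:

    ```python
    import numpy as np

    x1 = np.array([-1, -1,  1,  1])          # coded applied load
    x2 = np.array([-1,  1, -1,  1])          # coded relative speed
    y  = np.array([2.1, 2.9, 3.4, 4.6])      # wear rate (illustrative values)

    # Design matrix: intercept, main effects, and the two-factor interaction.
    X = np.column_stack([np.ones(4), x1, x2, x1 * x2])
    b0, b1, b2, b12 = np.linalg.lstsq(X, y, rcond=None)[0]
    print(f"wear_rate = {b0:.2f} + {b1:.2f}*x1 + {b2:.2f}*x2 + {b12:.2f}*x1*x2")
    ```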

  14. Experimental design and statistical rigor in phylogenomics of horizontal and endosymbiotic gene transfer

    Directory of Open Access Journals (Sweden)

    Stiller John W

    2011-09-01

    A growing number of phylogenomic investigations from diverse eukaryotes are examining conflicts among gene trees as evidence of horizontal gene transfer. If multiple foreign genes from the same eukaryotic lineage are found in a given genome, it is increasingly interpreted as concerted gene transfers during a cryptic endosymbiosis in the organism's evolutionary past, also known as "endosymbiotic gene transfer" or EGT. A number of provocative hypotheses of lost or serially replaced endosymbionts have been advanced; to date, however, these inferences largely have been post-hoc interpretations of genome-wide conflicts among gene trees. With data sets as large and complex as eukaryotic genome sequences, it is critical to examine alternative explanations for intra-genome phylogenetic conflicts, particularly how much conflicting signal is expected from directional biases and statistical noise. The availability of genome-level data both permits and necessitates phylogenomics that test explicit, a priori predictions of horizontal gene transfer, using rigorous statistical methods and clearly defined experimental controls.

  15. Organic biowastes blend selection for composting industrial eggshell by-product: experimental and statistical mixture design.

    Science.gov (United States)

    Soares, Micaela A R; Andrade, Sandra R; Martins, Rui C; Quina, Margarida J; Quinta-Ferreira, Rosa M

    2012-01-01

    Composting is one of the technologies recommended for pre-treating industrial eggshells (ES) before their application in soils, for calcium recycling. However, due to the high inorganic content of ES, a mixture of biodegradable materials is required to assure a successful procedure. In this study, an adequate organic blend composition containing potato peel (PP), grass clippings (GC) and wheat straw (WS) was determined by applying the simplex-centroid mixture design method to achieve a desired moisture content, carbon:nitrogen ratio and free air space for effective composting of ES. A blend of 56% PP, 37% GC and 7% WS was selected and tested in a self-heating reactor, where 10% (w/w) of ES was incorporated. After 29 days of reactor operation, a dry matter reduction of 46% was achieved and thermophilic temperatures were maintained for 15 days, indicating that the blend selected by the statistical approach was adequate for composting of ES.
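
    For illustration, the simplex-centroid design named above can be enumerated directly: one run per non-empty subset of components, with equal proportions over the subset. A minimal Python sketch (component labels follow the abstract; everything else is generic):

    ```python
    from itertools import combinations

    def simplex_centroid(components):
        """All non-empty subsets, each mixed in equal proportions (sums to 1)."""
        runs = []
        q = len(components)
        for size in range(1, q + 1):
            for subset in combinations(components, size):
                runs.append({c: (1.0 / size if c in subset else 0.0) for c in components})
        return runs

    for run in simplex_centroid(["PP", "GC", "WS"]):   # 3 vertices + 3 edges + centroid
        print(run)
    ```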

  16. Experimental design and statistical analysis for three-drug combination studies.

    Science.gov (United States)

    Fang, Hong-Bin; Chen, Xuerong; Pei, Xin-Yan; Grant, Steven; Tan, Ming

    2017-06-01

    Drug combination is a critically important therapeutic approach for complex diseases such as cancer and HIV due to its potential for efficacy at lower, less toxic doses and the need to move new therapies rapidly into clinical trials. One of the key issues is to identify which combinations are additive, synergistic, or antagonistic. While the value of multidrug combinations has been well recognized in the cancer research community, to our best knowledge, all existing experimental studies rely on fixing the dose of one drug to reduce the dimensionality, e.g. looking at pairwise two-drug combinations, a suboptimal design. Hence, there is an urgent need to develop experimental design and analysis methods for studying multidrug combinations directly. Because the complexity of the problem increases exponentially with the number of constituent drugs, there has been little progress in the development of methods for the design and analysis of high-dimensional drug combinations. In fact, contrary to common mathematical reasoning, the case of three-drug combinations is fundamentally more difficult than two-drug combinations. Apparently, finding doses of the combination, the number of combinations, and the replicates needed to detect departures from additivity depends on the dose-response shapes of the individual constituent drugs. Thus, different classes of drugs with different dose-response shapes need to be treated as separate cases. Our application and case studies develop dose-finding and sample size methods for detecting departures from additivity with several common (linear and log-linear) classes of single dose-response curves. Furthermore, utilizing the geometric features of the interaction index, we propose a nonparametric model to estimate the interaction index surface by B-spline approximation and derive its asymptotic properties. Utilizing the method, we designed and analyzed a combination study of three anticancer drugs, PD184, HA14-1, and CEP3891 inhibiting myeloma H929 cell line

  17. A statistical approach to the experimental design of the sulfuric acid leaching of gold-copper ore

    Directory of Open Access Journals (Sweden)

    Mendes F.D.

    2003-01-01

    The high grade of copper in the Igarapé Bahia (Brazil) gold-copper ore prevents the direct application of the classic cyanidation process. Copper oxides and sulfides react with cyanides in solution, causing a high consumption of leach reagent and thereby raising processing costs and decreasing recovery of gold. Studies have shown that a feasible route for this ore would be a pretreatment for copper minerals removal prior to the cyanidation stage. The goal of this experimental work was to study the experimental conditions required for copper removal from Igarapé Bahia gold-copper ore by sulfuric acid leaching by applying a statistical approach to the experimental design. By using the Plackett-Burman method, it was possible to select the variables that had the largest influence on the percentage of copper extracted at the sulfuric acid leaching stage. These were temperature of leach solution, stirring speed, concentration of sulfuric acid in the leach solution and particle size of the ore. The influence of the individual effects of these variables and their interactions on the experimental response was analyzed by applying the replicated full factorial design method. Finally, the selected variables were optimized by the ascending path statistical method, which determined the best experimental conditions for leaching to achieve the highest percentage of copper extracted. Using the optimized conditions, the best leaching results showed a copper extraction of 75.5%.
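
    A minimal sketch of the 12-run Plackett-Burman screening matrix used in studies like this one (standard cyclic construction from the published generator row; this is my illustration, not the paper's code):

    ```python
    import numpy as np

    # Classic first row of the N=12 Plackett-Burman design (11 factors).
    generator = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
    rows = [np.roll(generator, shift) for shift in range(11)]   # 11 cyclic shifts
    design = np.vstack(rows + [-np.ones(11, dtype=int)])        # plus an all-minus run

    print(design.shape)   # (12, 11): 12 runs screen up to 11 two-level factors
    print(design)
    ```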

  18. Vitamin B12 production from crude glycerol by Propionibacterium freudenreichii ssp. shermanii: optimization of medium composition through statistical experimental designs.

    Science.gov (United States)

    Kośmider, Alicja; Białas, Wojciech; Kubiak, Piotr; Drożdżyńska, Agnieszka; Czaczyk, Katarzyna

    2012-02-01

    A two-step statistical experimental design was employed to optimize the medium for vitamin B12 production from crude glycerol by Propionibacterium freudenreichii ssp. shermanii. In the first step, using Plackett-Burman design, five of 13 tested medium components (calcium pantothenate, NaH2PO4·2H2O, casein hydrolysate, glycerol and FeSO4·7H2O) were identified as factors having significant influence on vitamin production. In the second step, a central composite design was used to optimize levels of medium components selected in the first step. Valid statistical models describing the influence of significant factors on vitamin B12 production were established for each optimization phase. The optimized medium provided a 93% increase in final vitamin concentration compared to the original medium. Copyright © 2011 Elsevier Ltd. All rights reserved.
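
    The central composite design used in the second step can be built from factorial corners, axial points and center replicates. A hedged Python sketch (the study optimized five components; k = 3 here is purely illustrative):

    ```python
    from itertools import product
    import numpy as np

    def central_composite(k=3, alpha=None, n_center=4):
        alpha = alpha if alpha is not None else (2 ** k) ** 0.25   # rotatable choice
        corners = np.array(list(product([-1.0, 1.0], repeat=k)))   # 2^k factorial runs
        axial = np.vstack([sign * alpha * np.eye(k)[i]             # 2k star points
                           for i in range(k) for sign in (-1.0, 1.0)])
        center = np.zeros((n_center, k))                           # replicated centers
        return np.vstack([corners, axial, center])

    print(central_composite(k=3).shape)   # (8 + 6 + 4, 3) coded runs
    ```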

  19. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.

  20. A statistical experimental design to remove sulfate by crystallization in a fluidized-bed reactor

    Directory of Open Access Journals (Sweden)

    Mark Daniel G. de Luna

    2017-05-01

    This study used crystallization in a fluidized-bed reactor as an alternative technology to the conventional chemical precipitation to remove sulfate. The Box-Behnken Design was used to study the effects and interactions of seed dosage of synthetic gypsum, initial sulfate concentration and molar ratio of calcium to sulfate on conversion and removal of sulfate. The optimum conditions of conversion and removal of sulfate were determined and used to treat the simulated acid mine drainage (AMD) wastewater. The effect of inorganic ions CO32−, NH4+ and Al3+ on sulfate conversion was also investigated. Experimental results indicated that seed dosage, initial sulfate concentration and molar ratio of calcium to sulfate are all significant parameters in the sulfate removal by fluidized-bed crystallization. The optimal conditions of 4 g seed L−1, 119.7 mM of initial sulfate concentration and [Ca2+]/[SO42−] molar ratio of 1.48 resulted in sulfate conversion of 82% and sulfate removal of 67%. Conversion and removal of sulfate in the simulated AMD wastewater were 79 and 63%, respectively. When ammonium or aluminum was added to the synthetic sulfate wastewater, significant conversion of sulfate was achieved.
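
    The Box-Behnken layout named above places each pair of factors at their +/-1 corners while the remaining factor sits at its midpoint. A minimal sketch (the three factors match the abstract; the coding and number of center runs are my assumptions):

    ```python
    from itertools import combinations, product
    import numpy as np

    def box_behnken(k=3, n_center=3):
        runs = []
        for i, j in combinations(range(k), 2):          # every pair of factors
            for a, b in product([-1.0, 1.0], repeat=2):
                row = np.zeros(k)                       # others held at midpoint
                row[i], row[j] = a, b
                runs.append(row)
        runs.extend(np.zeros(k) for _ in range(n_center))
        return np.vstack(runs)

    # Columns: seed dosage, initial sulfate concentration, Ca:SO4 molar ratio (coded).
    print(box_behnken().shape)   # (15, 3): 12 edge-midpoint runs + 3 center runs
    ```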

  1. Application of statistical experimental design for optimization of silver nanoparticles biosynthesis by a nanofactory Streptomyces viridochromogenes.

    Science.gov (United States)

    El-Naggar, Noura El-Ahmady; Abdelwahed, Nayera A M

    2014-01-01

    Central composite design was chosen to determine the combined effects of four process variables (AgNO3 concentration, incubation period, pH level and inoculum size) on the extracellular biosynthesis of silver nanoparticles (AgNPs) by Streptomyces viridochromogenes. Statistical analysis of the results showed that incubation period, initial pH level and inoculum size had significant effects (P < 0.05) on the biosynthesis of silver nanoparticles at their individual level. The maximum biosynthesis of silver nanoparticles was achieved at a concentration of 0.5% (v/v) of 1 mM AgNO3, incubation period of 96 h, initial pH of 9 and inoculum size of 2% (v/v). After optimization, the biosynthesis of silver nanoparticles was improved by approximately 5-fold as compared to that of the unoptimized conditions. The synthetic process of silver nanoparticle generation using the reduction of aqueous Ag+ ion by the culture supernatants of S. viridochromogenes was quite fast, and silver nanoparticles were formed immediately by the addition of AgNO3 solution (1 mM) to the cell-free supernatant. Initial characterization of silver nanoparticles was performed by visual observation of color change from yellow to intense brown color. UV-visible spectrophotometry for measuring surface plasmon resonance showed a single absorption peak at 400 nm, which confirmed the presence of silver nanoparticles. Fourier Transform Infrared Spectroscopy analysis provided evidence for proteins as possible reducing and capping agents for stabilizing the nanoparticles. Transmission Electron Microscopy revealed the extracellular formation of spherical silver nanoparticles in the size range of 2.15-7.27 nm. Compared to the cell-free supernatant, the biosynthesized AgNPs revealed superior antimicrobial activity against Gram-negative, Gram-positive bacterial strains and Candida albicans.

  2. The experimental design of postmortem studies: the effect size and statistical power.

    Science.gov (United States)

    Meurs, Joris

    2016-09-01

    The aim of this study was to show the poor statistical power of postmortem studies. Further, this study aimed to find an estimate of the effect size for postmortem studies in order to show the importance of this parameter. This can be an aid in performing power analysis to determine a minimal sample size. GPower was used to perform calculations on sample size, effect size, and statistical power. The minimal significance (α) and statistical power (1 - β) were set at 0.05 and 0.80, respectively. Calculations were performed for two groups (Student's t-distribution) and multiple groups (one-way ANOVA; F-distribution). In this study, an average effect size of 0.46 was found (n = 22; SD = 0.30). Using this value to calculate the statistical power of another group of postmortem studies (n = 5) revealed that the average statistical power of these studies was poor (1 - β < 0.80), so the risk of missing true effects in postmortem studies is considerable. In order to enhance the statistical power of postmortem studies, power analysis should be performed, in which the effect size found in this study can be used as a guideline.
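
    The kind of calculation the paper describes can be reproduced in Python with statsmodels instead of the GPower tool it used; a hedged sketch using the paper's alpha, power and mean effect size:

    ```python
    from statsmodels.stats.power import TTestIndPower

    # Per-group n for a two-sided, two-sample t-test at d = 0.46, alpha = 0.05, power = 0.80.
    n_per_group = TTestIndPower().solve_power(effect_size=0.46, alpha=0.05,
                                              power=0.80, alternative="two-sided")
    print(f"required n per group: {n_per_group:.1f}")   # roughly 75 per group
    ```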

  3. Statistical guidance for experimental design and data analysis of mutation detection in rare monogenic mendelian diseases by exome sequencing.

    Directory of Open Access Journals (Sweden)

    Degui Zhi

    Recently, whole-genome sequencing, especially exome sequencing, has successfully led to the identification of causal mutations for rare monogenic Mendelian diseases. However, it is unclear whether this approach can be generalized and effectively applied to other Mendelian diseases with high locus heterogeneity. Moreover, the current exome sequencing approach has limitations such as false positive and false negative rates of mutation detection due to sequencing errors and other artifacts, but the impact of these limitations on experimental design has not been systematically analyzed. To address these questions, we present a statistical modeling framework to calculate the power, the probability of identifying truly disease-causing genes, under various inheritance models and experimental conditions, providing guidance for both proper experimental design and data analysis. Based on our model, we found that the exome sequencing approach is well-powered for mutation detection in recessive, but not dominant, Mendelian diseases with high locus heterogeneity. A disease gene responsible for as low as 5% of the disease population can be readily identified by sequencing just 200 unrelated patients. Based on these results, for identifying rare Mendelian disease genes, we propose that a viable approach is to combine, sequence, and analyze patients with the same disease together, leveraging the statistical framework presented in this work.
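
    As a back-of-envelope illustration of the sample-size claim (my own sketch, not the paper's full statistical framework): with 200 patients and a gene explaining 5% of cases, the expected number of carriers is 10, and the chance of seeing several carriers is high:

    ```python
    from scipy.stats import binom

    n, p = 200, 0.05                  # patients sequenced; locus contribution to disease
    for m in (3, 5, 10):
        print(f"P(at least {m} carriers) = {binom.sf(m - 1, n, p):.3f}")
    ```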

  4. Optimization of the Hewlett-Packard particle-beam liquid chromatography-mass spectrometry interface by statistical experimental design.

    Science.gov (United States)

    Huang, S K; Garza, N R

    1995-06-01

    Optimization of both sensitivity and ionization softness for the Hewlett-Packard particle-beam liquid chromatography-mass spectrometry interface has been achieved by using a statistical experimental design with response surface modeling. Conditions for both optimized sensitivity and ionization softness were found to occur at 55 lb/in.² nebulizer flow, 35°C desolvation chamber temperature with approximately 45% organic modifier in the presence of 0.02-F ammonium acetate and a liquid chromatography flow rate of 0.2 mL/min.

  5. Application of statistical experimental designs for the optimization of medium constituents for the production of citric acid from pineapple waste.

    Science.gov (United States)

    Imandi, Sarat Babu; Bandaru, Veera Venkata Ratnam; Somalanka, Subba Rao; Bandaru, Sita Ramalakshmi; Garapati, Hanumantha Rao

    2008-07-01

    Statistical experimental designs were applied for the optimization of medium constituents for citric acid production by Yarrowia lipolytica NCIM 3589 in solid state fermentation (SSF) using pineapple waste as the sole substrate. Using Plackett-Burman design, yeast extract, moisture content of the substrate, KH2PO4 and Na2HPO4 were identified as significant variables which highly influenced citric acid production and these variables were subsequently optimized using a central composite design (CCD). The optimum conditions were found to be yeast extract 0.34 (%w/w), moisture content of the substrate 70.71 (%), KH2PO4 0.64 (%w/w) and Na2HPO4 0.69 (%w/w). Citric acid production at these optimum conditions was 202.35 g/kg ds (g citric acid produced/kg of dried pineapple waste as substrate).

  6. Design of bespoke lightweight cement mortars containing waste expanded polystyrene by experimental statistical methods

    OpenAIRE

    Ferrandiz, V; Sarabia, LA; Ortiz, MC; Cheeseman, CR; Garcia-Alcocel, E

    2015-01-01

    This work assesses the reuse of waste expanded polystyrene (EPS) to obtain lightweight cement mortars. The factors and interactions which affect the properties of these mortars were studied by ad-hoc designs based on the d-optimal criterion. This method allows multiple factors to be modified simultaneously, which reduces the number of experiments compared with classical design. Four factors were studied at several levels: EPS type (two levels), EPS content (two levels), admixtures mix (three ...

  7. A statistical experimental design to remove sulfate by crystallization in a fluidized-bed reactor

    OpenAIRE

    Mark Daniel G. de Luna; Rance, Diana Pearl M.; Luzvisminda M. Bellotindos; Lu, Ming-Chun

    2016-01-01

    This study used crystallization in a fluidized-bed reactor as an alternative technology to the conventional chemical precipitation to remove sulfate. The Box-Behnken Design was used to study the effects and interactions of seed dosage of synthetic gypsum, initial sulfate concentration and molar ratio of calcium to sulfate on conversion and removal of sulfate. The optimum conditions of conversion and removal of sulfate were determined and used to treat the simulated acid mine drainage (AMD) wa...

  8. Optimal experimental design to estimate statistically significant periods of oscillations in time course data.

    Directory of Open Access Journals (Sweden)

    Márcio Mourão

    We investigated commonly used methods (Autocorrelation, Enright, and Discrete Fourier Transform) to estimate the periodicity of oscillatory data and determine which method most accurately estimated periods while being least vulnerable to the presence of noise. Both simulated and experimental data were used in the analysis performed. We determined the significance of calculated periods by applying these methods to several random permutations of the data and then calculating the probability of obtaining the period's peak in the corresponding periodograms. Our analysis suggests that the Enright method is the most accurate for estimating the period of oscillatory data. We further show that to accurately estimate the period of oscillatory data, it is necessary that at least five cycles of data are sampled, using at least four data points per cycle. These results suggest that the Enright method should be more widely applied in order to improve the analysis of oscillatory data.
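
    The permutation test described above can be sketched in a few lines of Python (synthetic data; "peak periodogram power" stands in for whichever period statistic is used):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(0, 50, 0.5)                      # 5 cycles of period 10, 20 points/cycle
    y = np.sin(2 * np.pi * t / 10.0) + rng.normal(0, 0.5, t.size)

    def peak_power(x):
        power = np.abs(np.fft.rfft(x - x.mean())) ** 2
        return power[1:].max()                     # ignore the zero-frequency term

    observed = peak_power(y)
    null = np.array([peak_power(rng.permutation(y)) for _ in range(1000)])
    p_value = (null >= observed).mean()            # shuffling destroys true periodicity
    print(f"permutation p-value for the dominant period: {p_value:.3f}")
    ```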

  9. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    Science.gov (United States)

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    This report describes the experimental details and interprets results from a study conducted by the U.S. Geological Survey (USGS) in 1992 to assess the effect of different sample-processing treatments on the stability of eight nutrient species in samples of surface-, ground-, and municipal-supply water during storage at 4 degrees Celsius for about 30 days. Over a 7-week period, splits of filtered- and whole-water samples from 15 stations in the continental United States were preserved at collection sites with sulfuric acid (U.S. Environmental Protection Agency protocol), mercury (II) chloride (former U.S. Geological Survey protocol), and ASTM (American Society for Testing and Materials) Type I deionized water (control) and then shipped by overnight express to the USGS National Water Quality Laboratory (NWQL). At the NWQL, the eight nutrient species were determined in splits from each of the 15 stations, typically, within 24 hours of collection and at intervals of 3, 7, 14, 22, and 35 days thereafter. Ammonium, nitrate plus nitrite, nitrite, and orthophosphate were determined only in filtered-water splits. Kjeldahl nitrogen and phosphorus were determined in both filtered-water and whole-water splits.

  10. Optimization of Process Parameters by Statistical Experimental Designs for the Production of Naringinase Enzyme by Marine Fungi

    Directory of Open Access Journals (Sweden)

    Abeer Nasr Shehata

    2014-01-01

    Naringinase has attracted a great deal of attention in recent years due to its hydrolytic activities, which include the production of rhamnose and prunin and the debittering of citrus fruit juices. Screening of fifteen marine-derived fungi, locally isolated from Ismailia, Egypt, for naringinase enzyme production indicated that Aspergillus niger was the most promising. In solid state fermentation (SSF) of the agroindustrial waste, orange rind was used as a substrate containing naringin. A sequential optimization strategy, based on statistical experimental designs, was employed to enhance the production of the debittering naringinase enzyme. Effects of 19 variables were examined for their significance on naringinase production using a Plackett-Burman factorial design. Significant parameters were further investigated using Taguchi's L16 (4⁵) orthogonal array design. Based on statistical analysis (ANOVA), the optimal combinations of the major constituents of media for maximal naringinase production were evaluated as follows: 15 g orange rind waste, 30 mL moisture content, 1% grapefruit, 1% NaNO3, 0.5% KH2PO4, 5 mM MgSO4, 5 mM FeSO4, and an initial pH of 7.5. The activity obtained was more than 3.14-fold that of the basal production medium.

  11. Experimental design and statistical analysis in Rotating Disc Contactor (RDC) column

    Science.gov (United States)

    Ismail, Wan Nurul Aiffah; Zakaria, Siti Aisyah; Noor, Nor Fashihah Mohd; Ariffin, Wan Nor Munirah

    2015-12-01

    The purpose of this paper is to examine the performance of liquid-liquid extraction in the Rotating Disc Contactor (RDC) column used in industry. In this study, the performance of a small-diameter RDC column using the chemical system cumene/isobutyric acid/water is analyzed by design of experiments (DOE) and by Multiple Linear Regression (MLR). The DOE method is used to estimate the effects of four independent variables, while MLR is used to quantify the relationship between the input and output variables and to determine which variables most influence each output. The input variables for both methods include rotor speed (Nr); ratio of flow (Fd); concentration of continuous inlet (Ccin); concentration of dispersed inlet (Cdin); the interaction between Nr and Fd; the interaction between Nr and Ccin; and the interaction between Nr and Cdin. The output variables are the concentration of continuous outlet (Ccout) and the concentration of dispersed outlet (Cdout) of the RDC column. Using MLR, two linear models were obtained, one for each of the outputs Ccout and Cdout. Based on the results, rotor speed (Nr) has the greatest influence on the dependent variable Ccout, and concentration of continuous inlet (Ccin) has the greatest influence on the dependent variable Cdout, according to both methods used.
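
    A hedged sketch of the MLR model with the interaction terms listed above (variable names follow the abstract; the data are synthetic, since the study's runs are not reproduced here):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 40
    df = pd.DataFrame({c: rng.uniform(0, 1, n) for c in ["Nr", "Fd", "Ccin", "Cdin"]})
    df["Ccout"] = 0.8 * df["Nr"] + 0.3 * df["Ccin"] + rng.normal(0, 0.05, n)  # toy response

    # Main effects plus the three Nr interactions named in the abstract.
    model = smf.ols("Ccout ~ Nr + Fd + Ccin + Cdin + Nr:Fd + Nr:Ccin + Nr:Cdin",
                    data=df).fit()
    print(model.summary())    # coefficient magnitudes indicate which input dominates
    ```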

  12. Citric acid production by a novel Aspergillus niger isolate: II. Optimization of process parameters through statistical experimental designs.

    Science.gov (United States)

    Lotfy, Walid A; Ghanem, Khaled M; El-Helow, Ehab R

    2007-12-01

    In this work, a sequential optimization strategy, based on statistical designs, was employed to enhance the production of citric acid in submerged culture. For screening of fermentation medium components significantly influencing citric acid production, the two-level Plackett-Burman design was used. Under our experimental conditions, beet molasses and corn steep liquor were found to be the major factors of the acid production. A near-optimum medium formulation was obtained using this method, with a five-fold increase in citric acid yield. Response surface methodology (RSM) was adopted to acquire the best process conditions. In this respect, the three-level Box-Behnken design was applied. A polynomial model was created to correlate the relationship between the three variables (beet molasses, corn steep liquor and inoculum concentration) and citric acid yield. The estimated optimum composition for the production of citric acid is as follows: pretreated beet molasses, 240.1 g/l; corn steep liquor, 10.5 g/l; and spore concentration, 10⁸ spores/ml. The optimum citric acid yield was 87.81%, which is 14 times that of the basal medium. The five-level central composite design was used for outlining the optimum values of the fermentation factors initial pH, aeration rate and temperature on citric acid production. Estimated optimum values for the production of citric acid are as follows: initial pH, 4.0; aeration rate, 6500 ml/min; and fermentation temperature, 31.5 degrees C.

  13. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: An SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  14. Introduction to Statistically Designed Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced and finally a case study will be presented to demonstrate this methodology.
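
    As a small illustration of one concept mentioned here, a 2^(3-1) fractional factorial halves the runs of a full 2^3 design by aliasing factor C with the AB interaction (defining relation I = ABC); this sketch is generic, not from the presentation:

    ```python
    from itertools import product

    runs = [(a, b, a * b) for a, b in product([-1, 1], repeat=2)]  # C = A*B

    for run in runs:                 # 4 runs instead of the full 8
        print(dict(zip("ABC", run)))
    ```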

  15. Introductory statistics for engineering experimentation

    CERN Document Server

    Nelson, Peter R; Coffin, Marie

    2003-01-01

    The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often results in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. In developing a one-semester course whose purpose was to introduce engineering/scientific students to the most useful statistical methods, this book was created to satisfy those needs. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...

  16. Application of the statistical experimental design to optimize mine-impacted water (MIW) remediation using shrimp-shell.

    Science.gov (United States)

    Núñez-Gómez, Dámaris; Alves, Alcione Aparecida de Almeida; Lapolli, Flavio Rubens; Lobo-Recio, María A

    2017-01-01

    Mine-impacted water (MIW) is one of the most serious mining problems and has a high negative impact on water resources and aquatic life. The main characteristics of MIW are a low pH (between 2 and 4) and high concentrations of SO42- and metal ions (Cd, Cu, Ni, Pb, Zn, Fe, Al, Cr, Mn, Mg, etc.), many of which are toxic to ecosystems and human life. Shrimp shell was selected as a MIW treatment agent because it is a low-cost metal-sorbent biopolymer with a high chitin content and contains calcium carbonate, an acid-neutralizing agent. To determine the best metal-removal conditions, a statistical study using statistical planning was carried out. Thus, the objective of this work was to identify the degree of influence and dependence of the shrimp-shell content for the removal of Fe, Al, Mn, Co, and Ni from MIW. In this study, a central composite rotational experimental design (CCRD) with a quadruplicate at the midpoint (2²) was used to evaluate the joint influence of two formulation variables: agitation and the shrimp-shell content. The statistical results showed the significant influence (p < 0.05) of the agitation variable for Fe and Ni removal (linear and quadratic form, respectively) and of the shrimp-shell content variable for Mn (linear form), Al and Co (linear and quadratic form) removal. Analysis of variance (ANOVA) for Al, Co, and Ni removal showed that the model is valid at the 95% confidence interval and that no adjustment is needed within the evaluated ranges of agitation (0-251.5 rpm) and shrimp-shell content (1.2-12.8 g L-1). The model required adjustments to the 90% and 75% confidence interval for Fe and Mn removal, respectively. In terms of efficiency in removing pollutants, it was possible to determine the best experimental values of the variables considered as 188 rpm and 9.36 g L-1 of shrimp shells. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Statistical experimental design guided optimization of a one-pot biphasic multienzyme total synthesis of amorpha-4,11-diene.

    Directory of Open Access Journals (Sweden)

    Xixian Chen

    In vitro synthesis of chemicals and pharmaceuticals using enzymes is of considerable interest as these biocatalysts facilitate a wide variety of reactions under mild conditions with excellent regio-, chemo- and stereoselectivities. A significant challenge in a multi-enzymatic reaction is the need to optimize the various steps involved simultaneously so as to obtain high yield of a product. In this study, statistical experimental design was used to guide the optimization of a total synthesis of amorpha-4,11-diene (AD) using multienzymes in the mevalonate pathway. A combinatorial approach guided by Taguchi orthogonal array design identified the local optimum enzymatic activity ratio for Erg12:Erg8:Erg19:Idi:IspA to be 100∶100∶1∶25∶5, with a constant concentration of amorpha-4,11-diene synthase (Ads, 100 mg/L). The model also identified an unexpected inhibitory effect of farnesyl pyrophosphate synthase (IspA), where the activity was negatively correlated with AD yield. This was due to the precipitation of farnesyl pyrophosphate (FPP), the product of IspA. Response surface methodology was then used to optimize IspA and Ads activities simultaneously so as to minimize the accumulation of FPP, and the results showed Ads to be a critical factor. By increasing the concentration of Ads, a complete conversion (∼100%) of mevalonic acid (MVA) to AD was achieved. Monovalent ions and pH were effective means of enhancing the specific Ads activity and specific AD yield significantly. The results from this study represent the first in vitro reconstitution of the mevalonate pathway for the production of an isoprenoid, and the approaches developed herein may be used to produce other isopentenyl pyrophosphate (IPP)/dimethylallyl pyrophosphate (DMAPP)-based products.

  18. Formulation development of ambroxol hydrochloride soft gel with application of statistical experimental design and response surface methodology.

    Science.gov (United States)

    Dabhi, Mahesh; Gohel, Mukesh; Parikh, Rajesh; Sheth, Navin; Nagori, Stavan

    2011-01-01

    The purpose of the present work was to develop an ambroxol hydrochloride soft gel formulation with the application of statistical experimental design and response surface methodology (RSM). A two-factor, three-level (3²) full factorial design of experiment with RSM was run to evaluate the main and interaction effects of two independent formulation variables that included the amount of low-acetylated gellan gum and sodium citrate. The dependent variables included viscosity (Y(1)), amount of drug release at 10 min (Y(2)) and 30 min (Y(3)), and gelation time (Y(4)). In order to obtain a formulation having the maximum amount of drug release at 10 min and minimum gelation time, RSM optimization was used. The prepared formulations were evaluated for pH, viscosity, rheological properties, gelation time, drug content, in vitro drug release, appearance, and taste. All the formulations showed a gelation time in the range of 6 to 48 min. The drug content in all the formulations was within limit (99.6 ± 1.56%). The viscosity of all the formulations was found in the range of 1872-12,182 cP. Dissolution studies of the formulations showed drug release in the range of 40.56-72.46% within 10 min and 80.2-100.5% within 30 min. Human evaluation tests revealed that all the gels possessed acceptable characteristics. This study showed that the soft gel formulation GA5, containing 0.3% of gellan gum and 0.4% of sodium citrate, has potential use as an immediate release soft gel for oral drug delivery. The objective of this investigation was to develop a new, immediate-release, soft gel dosage form for ambroxol hydrochloride, an oral expectorant and mucolytic agent. This novel soft gel dosage form needs to be suitable for pediatric and geriatric patients as well as patients with dysphagia. A statistical technique was used for optimization of the gel formulation. The methodology, called a design of experiment with response surface methodology, evaluated several independent formulation variables

  19. Statistical experimental methods for optimizing the cultivating ...

    African Journals Online (AJOL)

    Central composite experimental design and response surface analysis were adopted to derive a statistical model for optimizing the culture conditions. From the obtained results, it can be concluded that the optimum parameters were: temperature, 15.3°C; pH, 5.56; inoculum size, 4%; liquid volume, 70 ml in 250 ml flask; ...

  20. Improvement of production of citric acid from oil palm empty fruit bunches: optimization of media by statistical experimental designs.

    Science.gov (United States)

    Bari, Md Niamul; Alam, Md Zahangir; Muyibi, Suleyman A; Jamal, Parveen; Abdullah-Al-Mamun

    2009-06-01

    A sequential optimization based on statistical design and the one-factor-at-a-time (OFAT) method was employed to optimize the media constituents for the improvement of citric acid production from oil palm empty fruit bunches (EFB) through solid state bioconversion using Aspergillus niger IBO-103MNB. The results obtained from the Plackett-Burman design indicated that the co-substrate (sucrose), stimulator (methanol) and minerals (Zn, Cu, Mn and Mg) were found to be the major factors for further optimization. Based on the OFAT method, the selected medium constituents and inoculum concentration were optimized by the central composite design (CCD) under the response surface methodology (RSM). The statistical analysis showed that the optimum media containing 6.4% (w/w) of sucrose, 9% (v/w) of minerals and 15.5% (v/w) of inoculum gave the maximum production of citric acid (337.94 g/kg of dry EFB). The analysis showed that sucrose (p < 0.05) significantly influenced citric acid production.

  1. A statistical experimental design approach to evaluate the influence of various penetration enhancers on transdermal drug delivery of buprenorphine.

    Science.gov (United States)

    Taghizadeh, S Mojtaba; Moghimi-Ardakani, Ali; Mohamadnia, Fatemeh

    2015-03-01

    A series of drug-in-adhesive transdermal drug delivery systems (patch) with different chemical penetration enhancers were designed to deliver drug through the skin as a site of application. The objective of our effort was to study the influence of various chemical penetration enhancers on skin permeation rate and adhesion properties of a transdermal drug delivery system using Box-Behnken experimental design. The response surface methodology based on a three-level, three-variable Box-Behnken design was used to evaluate the interactive effects on dependent variables including the rate of skin permeation and adhesion properties, namely peel strength and tack value. Levulinic acid, lauryl alcohol, and Tween 80 were used as penetration enhancers (patch formulations containing 0-8% of each chemical penetration enhancer). Buprenorphine was used as a model penetrant drug. The results showed that incorporation of 20% chemical penetration enhancer into the mixture led to maximum skin permeation flux of buprenorphine from abdominal rat skin while the adhesion properties decreased. Also, skin flux in the presence of levulinic acid (1.594 μg/cm² h) was higher than with Tween 80 (1.473 μg/cm² h) and lauryl alcohol (0.843 μg/cm² h), and on mixing these enhancers together, an additive effect was observed. Moreover, it was found that each enhancer increased the tack value, while levulinic acid and lauryl alcohol improved the peel strength but Tween 80 reduced it. These findings indicated that the best chemical skin penetration enhancer for the buprenorphine patch was levulinic acid. Among the designed formulations, the one which contained 12% (wt/wt) enhancers exhibited the highest efficiency.

  2. A statistical experimental design approach to evaluate the influence of various penetration enhancers on transdermal drug delivery of buprenorphine

    Directory of Open Access Journals (Sweden)

    S.Mojtaba Taghizadeh

    2015-03-01

    A series of drug-in-adhesive transdermal drug delivery systems (patch) with different chemical penetration enhancers were designed to deliver drug through the skin as a site of application. The objective of our effort was to study the influence of various chemical penetration enhancers on skin permeation rate and adhesion properties of a transdermal drug delivery system using Box–Behnken experimental design. The response surface methodology based on a three-level, three-variable Box–Behnken design was used to evaluate the interactive effects on dependent variables including the rate of skin permeation and adhesion properties, namely peel strength and tack value. Levulinic acid, lauryl alcohol, and Tween 80 were used as penetration enhancers (patch formulations containing 0–8% of each chemical penetration enhancer). Buprenorphine was used as a model penetrant drug. The results showed that incorporation of 20% chemical penetration enhancer into the mixture led to maximum skin permeation flux of buprenorphine from abdominal rat skin while the adhesion properties decreased. Also, skin flux in the presence of levulinic acid (1.594 μg/cm² h) was higher than with Tween 80 (1.473 μg/cm² h) and lauryl alcohol (0.843 μg/cm² h), and on mixing these enhancers together, an additive effect was observed. Moreover, it was found that each enhancer increased the tack value, while levulinic acid and lauryl alcohol improved the peel strength but Tween 80 reduced it. These findings indicated that the best chemical skin penetration enhancer for the buprenorphine patch was levulinic acid. Among the designed formulations, the one which contained 12% (wt/wt) enhancers exhibited the highest efficiency.

  3. Advantages of neurofuzzy logic against conventional experimental design and statistical analysis in studying and developing direct compression formulations.

    Science.gov (United States)

    Landín, Mariana; Rowe, R C; York, P

    2009-11-05

    This study has investigated the utility and potential advantages of an artificial intelligence technology - neurofuzzy logic - as a modeling tool to study direct compression formulations. The modeling performance was compared with traditional statistical analysis. From the results it can be stated that the normalized error obtained from neurofuzzy logic was lower. Compared to multiple regression analysis, neurofuzzy logic showed higher accuracy in prediction for the five outputs studied. Rule sets generated by neurofuzzy logic are completely in agreement with the findings based on statistical analysis and, advantageously, generate understandable and reusable knowledge. Neurofuzzy logic is easy and rapid to apply, and its outcomes provided knowledge not revealed via statistical analysis.

  4. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  5. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2016-08-31

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale *sequential* data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  6. Application of statistical experimental design for optimisation of bioinsecticides production by sporeless Bacillus thuringiensis strain on cheap medium.

    Science.gov (United States)

    Ben Khedher, Saoussen; Jaoua, Samir; Zouari, Nabil

    2013-01-01

    In order to overproduce bioinsecticides with a sporeless Bacillus thuringiensis strain, an optimal composition of a cheap medium was defined using response surface methodology. In a first step, a Plackett-Burman design was used to evaluate the effects of eight medium components on delta-endotoxin production; it showed that starch, soya bean and sodium chloride exhibited significant effects on bioinsecticide production. In a second step, these parameters were selected for further optimisation by central composite design. The obtained results revealed that the optimum culture medium for delta-endotoxin production consists of 30 g L(-1) starch, 30 g L(-1) soya bean and 9 g L(-1) sodium chloride. When compared to the basal production medium, an improvement in delta-endotoxin production of up to 50% was noted. Moreover, the relative toxin yield of sporeless Bacillus thuringiensis S22 was improved markedly by using the optimised cheap medium (148.5 mg delta-endotoxins per g starch) when compared to the yield obtained in the basal medium (94.46 mg delta-endotoxins per g starch). Therefore, the use of the optimised cheap culture medium appears to be a good alternative for low-cost production of sporeless Bacillus thuringiensis bioinsecticides at industrial scale, which is of great importance from a practical point of view.
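
    For readers unfamiliar with the screening step mentioned here, the sketch below constructs the standard 12-run Plackett-Burman design, which accommodates up to 11 two-level factors (the record's eight medium components would occupy eight of its columns). This is a textbook construction, not the authors' protocol.

```python
import numpy as np

def plackett_burman_12():
    """12-run Plackett-Burman design: cyclic shifts of the standard
    generating row, plus a closing row of -1s."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(gen, k) for k in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

design = plackett_burman_12()[:, :8]   # columns for 8 medium components
# Main effect of component j from run responses y:
# effect_j = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
print(design.shape)                    # (12, 8)
```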

  7. Application of statistical experimental design for optimisation of bioinsecticides production by sporeless Bacillus thuringiensis strain on cheap medium

    Directory of Open Access Journals (Sweden)

    Saoussen Ben Khedher

    2013-09-01

    Full Text Available In order to overproduce bioinsecticides with a sporeless Bacillus thuringiensis strain, an optimal composition of a cheap medium was defined using response surface methodology. In a first step, a Plackett-Burman design was used to evaluate the effects of eight medium components on delta-endotoxin production; it showed that starch, soya bean and sodium chloride exhibited significant effects on bioinsecticide production. In a second step, these parameters were selected for further optimisation by central composite design. The obtained results revealed that the optimum culture medium for delta-endotoxin production consists of 30 g L-1 starch, 30 g L-1 soya bean and 9 g L-1 sodium chloride. When compared to the basal production medium, an improvement in delta-endotoxin production of up to 50% was noted. Moreover, the relative toxin yield of sporeless Bacillus thuringiensis S22 was improved markedly by using the optimised cheap medium (148.5 mg delta-endotoxins per g starch) when compared to the yield obtained in the basal medium (94.46 mg delta-endotoxins per g starch). Therefore, the use of the optimised cheap culture medium appears to be a good alternative for low-cost production of sporeless Bacillus thuringiensis bioinsecticides at industrial scale, which is of great importance from a practical point of view.

  8. Optimization of process variables for the biosynthesis of silver nanoparticles by Aspergillus wentii using statistical experimental design

    Science.gov (United States)

    Biswas, Supratim; Mulaba-Bafubiandi, Antoine F.

    2016-12-01

    The present scientific endeavour focuses on the optimization of process parameters using central composite design towards the development of an efficient technique for the biosynthesis of silver nanoparticles. The combined effects of three process variables (days of fermentation, duration of incubation, concentration of AgNO3) upon extracellular biological synthesis of silver nanoparticles (AgNPs) by Aspergillus wentii NCIM 667 were studied. A single absorption peak at 455 nm confirming the presence of silver nanoparticles was observed in the UV-visible spectrophotometric graph. Fourier transform infrared spectroscopic analysis recorded the presence of proteins as viable reducing agents for the formation of AgNPs. High resolution transmission electron microscopy showed spherically shaped AgNPs of size 15-40 nm. The biologically formed AgNPs revealed higher antimicrobial activity against gram-negative than gram-positive bacterial strains. The biosynthesized nanoparticles also exhibit photocatalytic activity, degrading almost all (88%) of the organic dye methyl orange within 5 h of exposure to sunlight.
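
    The central composite design named in this record can be written down directly; the sketch below is a generic rotatable CCD for three factors in coded units (the number of centre points is an arbitrary choice, not taken from the paper).

```python
import itertools
import numpy as np

def central_composite(n_factors=3, alpha=None, n_center=6):
    """Rotatable CCD in coded units: a 2^k factorial cube, 2k axial
    (star) points at +/-alpha, and replicated centre points."""
    if alpha is None:
        alpha = (2 ** n_factors) ** 0.25   # rotatability: alpha = (2^k)^(1/4)
    cube = np.array(list(itertools.product((-1.0, 1.0), repeat=n_factors)))
    star = np.zeros((2 * n_factors, n_factors))
    for i in range(n_factors):
        star[2 * i, i] = -alpha
        star[2 * i + 1, i] = alpha
    centre = np.zeros((n_center, n_factors))
    return np.vstack([cube, star, centre])

design = central_composite(3)   # 8 cube + 6 star + 6 centre = 20 runs
print(design.round(3))
```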

  9. Improvement of a two-stage fermentation process for docosahexaenoic acid production by Aurantiochytrium limacinum SR21 applying statistical experimental designs and data analysis.

    Science.gov (United States)

    Rosa, Silvina Mariana; Soria, Marcelo Abel; Vélez, Carlos Guillermo; Galvagno, Miguel Angel

    2010-04-01

    Statistical screening experimental designs were applied to identify the significant culture variables for biomass production of Aurantiochytrium limacinum SR21, and their optimal levels were found using a combination of artificial neural networks, genetic algorithms and graphical analysis. The biomass value obtained (40.3 g cell dry weight l(-1)) employing the selected culture conditions agreed with that predicted by the model. Subsequently, two significant culture conditions for docosahexaenoic acid (DHA) production were determined, finding that an inoculum of 10% (v/v), obtained from the previous (statistically optimized) stage, should be used in a DHA production medium having a molar C:N ratio of 55:1, to reach a production of 7.8 g DHA l(-1) d(-1). The production step was thereafter scaled up in a 3.5 l bioreactor, and a DHA productivity of 3.7 g l(-1) d(-1) was obtained. This two-stage strategy - statistically optimized inoculum production (first step) followed by a DHA production step - is presented for the first time to optimize a bioprocess for obtaining microbial DHA. Copyright 2009 Elsevier Ltd. All rights reserved.

  10. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    Science.gov (United States)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter are then identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
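
    Latin Hypercube Sampling, used above to pick the 50 boundary-condition sets, is easy to reproduce; here is a minimal numpy sketch (the sample size follows the record, while the factor count and unit ranges are placeholders).

```python
import numpy as np

def latin_hypercube(n_samples, n_factors, rng=None):
    """LHS on [0, 1]^d: one jittered point per stratum in each
    dimension, with the strata independently permuted per column."""
    rng = np.random.default_rng(rng)
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_factors))) / n_samples
    for j in range(n_factors):
        u[:, j] = rng.permutation(u[:, j])
    return u

lhs = latin_hypercube(50, 4, rng=42)   # e.g., 50 design/flow condition sets
# Rescale column j to a physical range [lo, hi]: lo + (hi - lo) * lhs[:, j]
print(lhs.shape)
```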

  11. Sudoku Squares as Experimental Designs

    Indian Academy of Sciences (India)

    Jyotirmoy Sarkar (Indiana University-Purdue University, Indianapolis, USA) and Bikas K Sinha (Indian Statistical Institute, Kolkata). General Article, Resonance – Journal of Science Education, Volume 20, Issue 9.

  12. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    Science.gov (United States)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty of applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
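
    The paper's SPSS procedure is not reproduced here, but the underlying idea of comparing a baseline phase against an intervention phase can be illustrated with a simple level-change regression; the sketch below uses hypothetical weekly symptom scores and ignores autocorrelation, which the published method addresses more carefully.

```python
import numpy as np
from scipy import stats

# Hypothetical symptom scores: 5 baseline sessions, then 7 under treatment
y = np.array([24, 26, 25, 27, 25, 22, 19, 17, 15, 14, 12, 11], dtype=float)
phase = np.array([0] * 5 + [1] * 7)

# Level-change model: y = b0 + b1 * phase + error
X = np.column_stack([np.ones_like(y), phase])
beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
n, p = X.shape
sigma2 = res[0] / (n - p)                       # residual variance
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t = beta[1] / se
pval = 2 * stats.t.sf(abs(t), df=n - p)
print(f"level change = {beta[1]:.2f}, t = {t:.2f}, p = {pval:.4f}")
```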

  13. Experimental Design Research

    DEFF Research Database (Denmark)

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations...... of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead to and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology......, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current...

  14. Statistical problems in design technique validation

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.S.

    1980-04-01

    This work is concerned with the statistical validation process for measuring the accuracy of design techniques for solar energy systems. This includes a discussion of the statistical variability inherent in the design and measurement processes and the way in which this variability can dictate the choice of experimental design, choice of data, accuracy of the results, and choice of questions that can be reliably answered in such a study. The approach here is primarily concerned with design procedure validation in the context of the realistic process of system design, where the discrepancy between measured and predicted results is due to limitations in the mathematical models employed by the procedures and the inaccuracies of input data. A set of guidelines for successful validation methodologies is discussed, and a simplified validation methodology for domestic hot water heaters is presented.

  15. Hydrogen production from biomass. Optimization of gasification by experimental statistical design; Produccion de hidrogeno a partir de biomasa. Optimizacion de la gasificacion por aplicacion del diseno estadistico de experimentos

    Energy Technology Data Exchange (ETDEWEB)

    Arteche Calvo, A.

    2008-07-01

    Biomass conversion into a gas with a high content of hydrogen is considered a future alternative for obtaining energy and chemical products from renewable sources. One of the current technologies for this purpose is gasification using steam as the gasification agent. The technical objective of this work is to study the gasification of biomass with steam and oxygen as a thermochemical route for transforming biomass, so as to obtain the maximum amount of hydrogen with the lowest tar content. Materials and Methods: an experimental statistical strategy with three variables and two levels of operation was planned to optimize the gasification process. The study was conducted without changing the type of biomass fed, the type of catalyst used or the quantity of bed inside the gasifier. Two mathematical models were obtained as results, correlating the experimental factors with the production of hydrogen and tars. The design of experiments methodology was applied to assess the influence of several experimental factors, such as the amount of steam introduced and the use of catalyst and oxygen, on both the production of hydrogen and the minimization of tar formation. This statistical technique enabled the modeling of the selected biomass gasification process with the minimum number of pilot plant tests, identifying possible improvements and optimizations in both the hydrogen yield and the generation of tars. (Author) 10 refs.
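
    The two-level, three-variable plan described here is a 2^3 factorial; the sketch below builds the design matrix and estimates main effects from run responses. The factor names and H2 yields are hypothetical placeholders, not the report's data.

```python
import itertools
import numpy as np

# 2^3 full factorial in coded units; assumed factors: steam, oxygen, catalyst
X = np.array(list(itertools.product((-1, 1), repeat=3)), dtype=float)

# Hypothetical H2 yields (vol %) for the 8 runs, in the order generated above
y = np.array([28.0, 31.5, 30.2, 35.1, 29.4, 33.0, 31.8, 38.6])

# Main effect of factor j = mean(y at +1) - mean(y at -1)
effects = {name: y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
           for j, name in enumerate(["steam", "oxygen", "catalyst"])}
print(effects)
```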

  16. Application of machine/statistical learning, artificial intelligence and statistical experimental design for the modeling and optimization of methylene blue and Cd(ii) removal from a binary aqueous solution by natural walnut carbon.

    Science.gov (United States)

    Mazaheri, H; Ghaedi, M; Ahmadi Azqhandi, M H; Asfaram, A

    2017-05-10

    Analytical chemists apply statistical methods for both the validation and prediction of proposed models. Methods are required that are adequate for finding the typical features of a dataset, such as nonlinearities and interactions. Boosted regression trees (BRTs), as an ensemble technique, are fundamentally different from conventional techniques, whose aim is to fit a single parsimonious model. In this work, BRT, artificial neural network (ANN) and response surface methodology (RSM) models have been used for the optimization and/or modeling of the stirring time (min), pH, adsorbent mass (mg) and concentrations of MB and Cd2+ ions (mg L-1) in order to develop respective predictive equations for simulation of the efficiency of MB and Cd2+ adsorption based on the experimental data set. Activated carbon, as an adsorbent, was synthesized from walnut wood waste, which is abundant, non-toxic, cheap and locally available. This adsorbent was characterized using different techniques such as FT-IR, BET, SEM, point of zero charge (pHpzc) and the determination of oxygen-containing functional groups. The influence of the various parameters (i.e. pH, stirring time, adsorbent mass and concentrations of MB and Cd2+ ions) on the percentage removal was assessed by investigation of the sensitivity function and variable importance rankings (BRT) and by analysis of variance (RSM). Furthermore, a central composite design (CCD) combined with a desirability function approach (DFA) as a global optimization technique was used for the simultaneous optimization of the effective parameters. The applicability of the BRT, ANN and RSM models for the description of the experimental data was examined using four statistical criteria (absolute average deviation (AAD), mean absolute error (MAE), root mean square error (RMSE) and coefficient of determination (R2)). All three models demonstrated good predictions in this study. The BRT model was more precise compared to the other models and this showed that BRT
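
    The four model-comparison criteria listed in this abstract are straightforward to compute; a minimal sketch follows, using the common definition of AAD as a mean absolute relative error in percent, which may differ from the authors' exact formula, and hypothetical removal values.

```python
import numpy as np

def regression_metrics(y_obs, y_pred):
    """AAD (%), MAE, RMSE and R2 for observed vs. predicted removals."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    err = y_pred - y_obs
    aad = 100.0 * np.mean(np.abs(err / y_obs))
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_obs - y_obs.mean()) ** 2)
    return {"AAD%": aad, "MAE": mae, "RMSE": rmse, "R2": r2}

# Hypothetical % removal values
print(regression_metrics([90, 85, 78, 95], [88, 86, 80, 93]))
```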

  17. iCFD: Interpreted Computational Fluid Dynamics – Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design – The secondary clarifier

    DEFF Research Database (Denmark)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat

    2015-01-01

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models – computationally light tools, used e.g., as sub-models in systems analysis. The objective is ...... range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events....

  18. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
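
    Response surface methodology, one of the methods surveyed above, typically reduces to fitting a second-order polynomial by least squares; the sketch below recovers known coefficients from synthetic two-factor data and is illustrative only.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Full second-order RSM model in two factors:
    1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (20, 2))   # coded factor settings
y = 5 + 2 * X[:, 0] - X[:, 1] - 1.5 * X[:, 0]**2 + rng.normal(0, 0.1, 20)

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(np.round(beta, 2))   # approx. [5, 2, -1, 0, -1.5, 0]
```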

  19. Experimental Engineering: Articulating and Valuing Design Experimentation

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Grönvall, Erik; Fritsch, Jonas

    2017-01-01

    In this paper we propose Experimental Engineering as a way to articulate open- ended technological experiments as a legitimate design research practice. Experimental Engineering introduces a move away from an outcome or result driven design process towards an interest in existing technologies...

  20. Intermediate/Advanced Research Design and Statistics

    Science.gov (United States)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs.

  1. Factors screening to statistical experimental design of racemic atenolol kinetic resolution via transesterification reaction in organic solvent using free Pseudomonas fluorescens lipase.

    Science.gov (United States)

    Agustian, Joni; Kamaruddin, Azlina Harun; Aboul-Enein, Hassan Y

    2017-07-01

    As the (R)-enantiomer of racemic atenolol has no β-blocking activity but is not free of side effects, switching from the racemate to the (S)-atenolol is more favorable. Work on the transesterification of racemic atenolol using free enzymes as a route to resolving the racemate is limited. Screenings of enzyme, medium, and acetyl donor were conducted first, selecting Pseudomonas fluorescens lipase, tetrahydrofuran, and vinyl acetate. A statistical design of the experiment was then developed using a central composite design on several operational factors, which resulted in conversions of 11.70-61.91% and substrate enantiomeric excess (ee) of 7.31-100%. The quadratic models are acceptable, with R2 of 95.13% (conversion) and 89.63% (ee). The predicted values match the observed values reasonably well. The temperature, agitation speed, and substrate molar ratio factors have small effects on conversion and ee, but enzyme loading affects the responses strongly. The interactions of temperature-agitation speed and temperature-substrate molar ratio show significant effects on conversion, while temperature-agitation speed, temperature-substrate molar ratio, and agitation speed-substrate molar ratio strongly affect ee. Optimum conditions for the use of Pseudomonas fluorescens lipase, tetrahydrofuran, and vinyl acetate were found at 45°C, 175 rpm, 2000 U, and a 1:3.6 substrate molar ratio. © 2017 Wiley Periodicals, Inc.
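
    The responses optimized in this record, conversion and enantiomeric excess, are linked through the standard kinetic-resolution relations of Chen et al.; the sketch below evaluates them for hypothetical ee values, not the paper's measurements.

```python
import math

def conversion(ee_s, ee_p):
    """Conversion c from substrate and product ee (fractions 0-1)."""
    return ee_s / (ee_s + ee_p)

def enantioselectivity(c, ee_s):
    """Enantiomeric ratio E from conversion and substrate ee."""
    return (math.log((1 - c) * (1 - ee_s))
            / math.log((1 - c) * (1 + ee_s)))

ee_s, ee_p = 0.95, 0.90      # hypothetical values
c = conversion(ee_s, ee_p)   # ~0.514
print(f"c = {c:.3f}, E = {enantioselectivity(c, ee_s):.1f}")
```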

  2. Statistical evaluation of the medium components for the production of high biomass, α-amylase and protease enzymes by Piriformospora indica using Plackett-Burman experimental design.

    Science.gov (United States)

    Swetha, S; Varma, Ajit; Padmavathi, T

    2014-08-01

    Piriformospora indica, a member of the Basidiomycota, is an axenically cultivable endophytic fungus which exerts plant growth-promoting effects on its host plant. P. indica is known to produce α-amylase and protease. Since the organism plays a beneficial role in plant growth promotion, achieving high biomass is essential. Hence, to enable commercial production, screening of the medium components is a necessary step. The present paper investigates the screening of medium components using a Plackett-Burman experimental design, wherein the parameters α-amylase, protease and biomass have been examined. The parameters α-amylase, protease and biomass were found to vary from 0.25 to 0.45 mg-1 ml-1 min-1, 0.1 to 0.15 mg-1 ml-1 h-1 and 0.8 to 22.6 g l-1, respectively, in 16 runs, which demonstrates the strong influence of the medium components.

  3. Statistical experimental methods for optimizing the cultivating ...

    African Journals Online (AJOL)

    Jane

    2011-08-08

    Plackett-Burman design, central composite design, response surface methodology. INTRODUCTION. Aflatoxins (AFs) are a family of toxic secondary metabolites produced by certain strains of the common molds Aspergillus flavus and ...

  4. Experimental design a chemometric approach

    CERN Document Server

    Deming, SN

    1987-01-01

    Now available in a paperback edition is a book which has been described as ``...an exceptionally lucid, easy-to-read presentation... would be an excellent addition to the collection of every analytical chemist. I recommend it with great enthusiasm.'' (Analytical Chemistry). Unlike most current textbooks, it approaches experimental design from the point of view of the experimenter, rather than that of the statistician. As the reviewer in `Analytical Chemistry' went on to say: ``Deming and Morgan should be given high praise for bringing the principles of experimental design to the level of the p

  5. Experimental design in chemistry: A tutorial.

    Science.gov (United States)

    Leardi, Riccardo

    2009-10-12

    In this tutorial the main concepts and applications of experimental design in chemistry will be explained. Unfortunately, nowadays experimental design is not as well known and applied as it should be, and many papers can be found in which the "optimization" of a procedure is performed one variable at a time. The goal of this paper is to show the real advantages in terms of reduced experimental effort and of increased quality of information that can be obtained if this approach is followed. To do that, three real examples will be shown. Rather than on the mathematical aspects, this paper will focus on the mental attitude required by experimental design. Readers interested in deepening their knowledge of the mathematical and algorithmic aspects can find very good books and tutorials in the references [G.E.P. Box, W.G. Hunter, J.S. Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, John Wiley & Sons, New York, 1978; R. Brereton, Chemometrics: Data Analysis for the Laboratory and Chemical Plant, John Wiley & Sons, New York, 1978; R. Carlson, J.E. Carlson, Design and Optimization in Organic Synthesis: Second Revised and Enlarged Edition, in: Data Handling in Science and Technology, vol. 24, Elsevier, Amsterdam, 2005; J.A. Cornell, Experiments with Mixtures: Designs, Models and the Analysis of Mixture Data, in: Series in Probability and Statistics, John Wiley & Sons, New York, 1991; R.E. Bruns, I.S. Scarminio, B. de Barros Neto, Statistical Design-Chemometrics, in: Data Handling in Science and Technology, vol. 25, Elsevier, Amsterdam, 2006; D.C. Montgomery, Design and Analysis of Experiments, 7th edition, John Wiley & Sons, Inc., 2009; T. Lundstedt, E. Seifert, L. Abramo, B. Thelin, A. Nyström, J. Pettersen, R. Bergman, Chemolab 42 (1998) 3; Y. Vander Heyden, LC-GC Europe 19 (9) (2006) 469].

  6. Statistical Manual. Methods of Making Experimental Inferences

    Science.gov (United States)

    1951-06-01

    quantities, and not to reach any final decision. Estimation, or the reporting of quantities in tabular or graphical form, is actually an incomplete... computations, simply because they have the appearance of lending precision to one's results. Quite often a tabular or graphical presentation will reveal... DATA. "Sensitivity data" is a term applied to that type of experimental data in which the act of taking a measurement on an item destroys the...

  7. Design approaches to experimental mediation.

    Science.gov (United States)

    Pirlott, Angela G; MacKinnon, David P

    2016-09-01

    Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., "measurement-of-mediation" designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable.
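
    For concreteness, the measurement-of-mediation estimate that the article critiques is the product of two regression coefficients; a minimal sketch with simulated data (the effect sizes are arbitrary) shows the computation, though, as the article stresses, the b path remains correlational.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.integers(0, 2, n).astype(float)       # randomized condition
m = 0.5 * x + rng.normal(0, 1, n)             # measured mediator
y = 0.4 * m + 0.1 * x + rng.normal(0, 1, n)   # outcome

def ols(X, y):
    """OLS coefficients for y ~ X (X already includes the intercept)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
a = ols(np.column_stack([ones, x]), m)[1]      # path a: X -> M
b = ols(np.column_stack([ones, x, m]), y)[2]   # path b: M -> Y given X
print(f"indirect effect a*b = {a * b:.3f}")
```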

  8. Scientific, statistical, practical, and regulatory considerations in design space development.

    Science.gov (United States)

    Debevec, Veronika; Srčič, Stanko; Horvat, Matej

    2018-03-01

    The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility through the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like the management of factors and critical quality attributes (CQAs) that will not be included in the experimental design, the evaluation of the risk of failure at design space edges, or modeling the scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.

  9. Experimental design in chromatography: a tutorial review.

    Science.gov (United States)

    Hibbert, D Brynn

    2012-12-01

    The ability of a chromatographic method to successfully separate, identify and quantitate species is determined by many factors, many of which are in the control of the experimenter. When attempting to discover the important factors and then optimise a response by tuning these factors, experimental design (design of experiments, DoE) gives a powerful suite of statistical methodology. Advantages include modelling by empirical functions, not requiring detailed knowledge of the underlying physico-chemical properties of the system, a defined number of experiments to be performed, and available software to accomplish the task. Two uses of DoE in chromatography are for showing lack of significant effects in robustness studies for method validation, and for identifying significant factors and then optimising a response with respect to them in method development. Plackett-Burman designs are widely used in validation studies, and fractional factorial designs and their extensions such as central composite designs are the most popular optimisers. Box-Behnken and Doehlert designs are becoming more used as efficient alternatives. If it is not possible to practically realise values of the factors required by experimental designs, or if there is a constraint on the total number of experiments that can be done, then D-optimal designs can be very powerful. Examples of the use of DoE in chromatography are reviewed. Recommendations are given on how to report DoE studies in the literature. Copyright © 2012 Elsevier B.V. All rights reserved.
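
    D-optimal designs, mentioned at the end of this abstract, are usually built by exchange algorithms; the following is a toy greedy exchange over a small candidate grid, a sketch with no claim to the refinements of production DoE software.

```python
import itertools
import numpy as np

def d_optimal(F, n_runs, rng=None):
    """Greedy Fedorov-style exchange: choose n_runs rows of the model
    matrix F maximizing det(X'X). No global-optimality guarantee."""
    rng = np.random.default_rng(rng)
    idx = list(rng.choice(len(F), n_runs, replace=False))
    best = np.linalg.det(F[idx].T @ F[idx])
    improved = True
    while improved:
        improved = False
        for pos in range(n_runs):
            for cand in range(len(F)):
                trial = idx.copy()
                trial[pos] = cand
                d = np.linalg.det(F[trial].T @ F[trial])
                if d > best:
                    idx, best, improved = trial, d, True
    return sorted(idx), best

# Candidates: 3-level grid in 2 factors; full quadratic model terms
grid = np.array(list(itertools.product((-1, 0, 1), repeat=2)), float)
F = np.column_stack([np.ones(len(grid)), grid[:, 0], grid[:, 1],
                     grid[:, 0] * grid[:, 1], grid[:, 0]**2, grid[:, 1]**2])
idx, det_xtx = d_optimal(F, n_runs=6, rng=0)
print(grid[idx], det_xtx)
```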

  10. Potential Pitfalls of Experimental Design

    OpenAIRE

    Phillip Watkins

    2017-01-01

    Good experimental design begins with the end in mind. An early conversation with a statistician will both increase the chances of an experimental study contributing to the literature and minimize the risks to participating human subjects.  Sir R.A. Fisher felt that “to consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination: he can perhaps say what the experiment died of.” To this end, some questions from a statistician are pres...

  11. ALGORITHM OF PRIMARY STATISTICAL ANALYSIS OF ARRAYS OF EXPERIMENTAL DATA

    Directory of Open Access Journals (Sweden)

    LAUKHIN D. V.

    2017-02-01

    Full Text Available Annotation. Purpose: construction of an algorithm for the preliminary (primary) estimation of arrays of experimental data, for further obtaining a mathematical model of the process under study. Methodology: use of the main regularities of the theory of processing arrays of experimental values in the initial analysis of the data. Originality: an algorithm for performing a primary statistical analysis of arrays of experimental data is given. Practical value: development of methods for revealing statistically unreliable values in arrays of experimental data for the purpose of their subsequent detailed analysis and the construction of a mathematical model of the studied processes.
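
    A primary screen for statistically unreliable values of the kind this algorithm targets can be as simple as a robust z-score rule; the sketch below flags points far from the median (the 3-sigma threshold is a common convention, not necessarily the authors' criterion).

```python
import numpy as np

def flag_unreliable(x, k=3.0):
    """Flag values more than k robust standard deviations from the
    median, with sigma estimated as 1.4826 * MAD."""
    x = np.asarray(x, float)
    med = np.median(x)
    sigma = 1.4826 * np.median(np.abs(x - med))
    return np.abs(x - med) > k * sigma

data = np.array([9.8, 10.1, 10.0, 9.9, 14.7, 10.2, 9.7])
print(data[flag_unreliable(data)])   # -> [14.7]
```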

  12. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.
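
    The inferential value of a design, the quantity this record weighs for different peak shapes, has a closed form in the linear-Gaussian case: the expected information gain depends only on the design matrix. A minimal sketch comparing clustered versus spread measurement locations follows; the numbers are illustrative, not from the paper.

```python
import numpy as np

def expected_info_gain(X, prior_cov, noise_var=1.0):
    """For y = X w + Gaussian noise with a Gaussian prior on w, the
    expected KL divergence from prior to posterior is
    0.5 * log(det(prior_cov) / det(posterior_cov))."""
    post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + X.T @ X / noise_var)
    return 0.5 * (np.linalg.slogdet(prior_cov)[1]
                  - np.linalg.slogdet(post_cov)[1])

prior = np.eye(2)   # prior on intercept and slope
t_clustered = np.array([0.4, 0.5, 0.6])
t_spread = np.array([0.1, 0.5, 0.9])
for t in (t_clustered, t_spread):
    X = np.column_stack([np.ones_like(t), t])
    print(t, expected_info_gain(X, prior))   # spread design wins
```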

  13. Potential Pitfalls of Experimental Design

    Directory of Open Access Journals (Sweden)

    Phillip Watkins

    2017-01-01

    Full Text Available Good experimental design begins with the end in mind. An early conversation with a statistician will both increase the chances of an experimental study contributing to the literature and minimize the risks to participating human subjects.  Sir R.A. Fisher felt that “to consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination: he can perhaps say what the experiment died of.” To this end, some questions from a statistician are presented along with the associated experimental study pitfalls to avoid during the study planning phase. Several concrete examples are provided to give some practical knowledge on how to improve an experimental study at the onset.  Hypothesis formulation, sample size determination, randomization, and double-blinding are all explained from the viewpoint of a statistician’s final analysis. Confounders, sampling, and missing data are also briefly covered through this hypothetical question and answer session.

  14. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    . This paper aims at a systematic approach for uncertainty quantification of the parameters of the modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures...... estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors introducing uncertainty to the measurement results. Different experimental techniques applied to the same test item or testing numerous nominally identical specimens yields different test results...

  15. A Statistical Approach to Optimizing Concrete Mixture Design

    Directory of Open Access Journals (Sweden)

    Shamsad Ahmad

    2014-01-01

    Full Text Available A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m3), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.

  16. A statistical approach to optimizing concrete mixture design.

    Science.gov (United States)

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3(3)). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m(3)), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.
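
    The 3^3 factorial layout reported in these two records is easy to enumerate, and the regression step reduces to least squares; the sketch below uses the published factor levels but synthetic strength values in place of the measured data.

```python
import itertools
import numpy as np

# 3^3 full factorial over the levels given in the abstract
wc = [0.38, 0.43, 0.48]       # water/cementitious materials ratio
cm = [350.0, 375.0, 400.0]    # cementitious materials content, kg/m3
fa = [0.35, 0.40, 0.45]       # fine/total aggregate ratio
runs = np.array(list(itertools.product(wc, cm, fa)))   # 27 mixtures

# Synthetic mean strengths (MPa) standing in for the measured data
rng = np.random.default_rng(3)
y = (120 - 150 * runs[:, 0] + 0.04 * runs[:, 1] + 5 * runs[:, 2]
     + rng.normal(0, 1.0, len(runs)))

# First-order polynomial model fitted by least squares
X = np.column_stack([np.ones(len(runs)), runs])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))   # intercept and the three factor coefficients
```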

  17. Involving students in experimental design: three approaches.

    Science.gov (United States)

    McNeal, A P; Silverthorn, D U; Stratton, D B

    1998-12-01

    Many faculty want to involve students more actively in laboratories and in experimental design. However, just "turning them loose in the lab" is time-consuming and can be frustrating for both students and faculty. We describe three different ways of providing structures for labs that require students to design their own experiments but guide the choices. One approach emphasizes invertebrate preparations and classic techniques that students can learn fairly easily. Students must read relevant primary literature and learn each technique in one week, and then design and carry out their own experiments in the next week. Another approach provides a "design framework" for the experiments so that all students are using the same technique and the same statistical comparisons, whereas their experimental questions differ widely. The third approach involves assigning the questions or problems but challenging students to design good protocols to answer these questions. In each case, there is a mixture of structure and freedom that works for the level of the students, the resources available, and our particular aims.

  18. The Poisson Distribution: An Experimental Approach to Teaching Statistics

    Science.gov (United States)

    Lafleur, Mimi S.; And Others

    1972-01-01

    Explains an experimental approach to teaching statistics to students who are essentially either non-science and non-mathematics majors or just beginning the study of science. With everyday examples, the article illustrates the method of teaching the Poisson distribution. (PS)
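
    The classroom experiment this article describes can be mimicked in a few lines: simulate counts, then set observed frequencies beside the theoretical probability mass function (the rate parameter below is arbitrary).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
lam = 3.0                             # assumed mean count per interval
counts = rng.poisson(lam, size=500)   # simulated class data

ks = np.arange(counts.max() + 1)
observed = np.bincount(counts, minlength=len(ks)) / len(counts)
expected = stats.poisson.pmf(ks, lam)
for k, o, e in zip(ks, observed, expected):
    print(f"k={k}: observed {o:.3f}  theoretical {e:.3f}")
```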

  19. Sequential experimental design based generalised ANOVA

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    2016-07-15

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution-adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.

  20. APPLICATION OF LABVIEW DURING THE PROCESSING OF EXPERIMENTAL DATA BY STATISTICAL METHODS

    Directory of Open Access Journals (Sweden)

    Halyna V. Lutsenko

    2013-05-01

    Full Text Available The peculiarities that appear in the training of physics and engineering students in the study of statistical methods and their practical use have been analyzed. The main types of problems that appear in the statistical processing of experimental data have been investigated. The technique of using LabVIEW to design a program module for estimating the statistical parameters of experimentally acquired data has been considered. The structure of the developed software and the technique of its development using LabVIEW have been described, together with the procedure of the software's creation and the set of main LabVIEW elements used.

  1. A Statistical Approach to Implosion Design on the OMEGA Laser

    Science.gov (United States)

    Gopalaswamy, V.; Betti, R.

    2017-10-01

    The 1-D campaign on OMEGA is backed by a novel approach aimed at producing an iterative and data-driven process to design optimized cryogenic implosions and improve the accuracy of 1-D physics models. The process does not preclude the possibility of significant systematic errors on OMEGA, nor does it assume that the hydrodynamic codes used in implosion design have all the necessary physical models. It only assumes that there exists some relationship between simulation and experimental results, and uses statistical methods to model this relationship. Comparisons of hydrodynamic simulations of less-accurate physical models with more-accurate ones indicate that as long as the equation of state is relatively well modeled, this assumption holds. By incorporating data from over 40 experiments on OMEGA, this approach has been used to design four targets with a two-shock pulse design for the 1-D campaign, and led to pre-shot predictions of yields within 5% and ion temperatures within 3% of the experimental values. One of these implosions has also produced the highest neutron yield (1.1 × 10^14) on an OMEGA cryogenic implosion, with an areal density of 105 mg/cm². The region of design space in which the predictive capability of this model is valid remains an open question. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  2. Experimental Design and Some Threats to Experimental Validity: A Primer

    Science.gov (United States)

    Skidmore, Susan

    2008-01-01

    Experimental designs are distinguished as the best method to respond to questions involving causality. The purpose of the present paper is to explicate the logic of experimental design and why it is so vital to questions that demand causal conclusions. In addition, types of internal and external validity threats are discussed. To emphasize the…

  3. The Use of an Experimental Design Approach to Investigate the ...

    African Journals Online (AJOL)

    NICO

    This study looked at using statistical design of experiments (DoE) principles to observe interactions between two graphite types and a nanocarbon ... Pb-acid battery, Pb-plate, graphite, expanders, design of experiment. 1. Introduction .... The CCA test was done by placing the cells in a freezer for 24 h at –18 ± 1 °C prior to ...

  4. Literature in Focus: Statistical Methods in Experimental Physics

    CERN Multimedia

    2007-01-01

    Frederick James was a high-energy physicist who became the CERN "expert" on statistics and is now well-known around the world, in part for this famous text. The first edition of Statistical Methods in Experimental Physics was originally co-written with four other authors and was published in 1971 by North Holland (now an imprint of Elsevier). It became such an important text that demand for it has continued for more than 30 years. Fred has updated it and it was released in a second edition by World Scientific in 2006. It is still a top seller and there is no exaggeration in calling it «the» reference on the subject. A full review of the title appeared in the October CERN Courier.Come and meet the author to hear more about how this book has flourished during its 35-year lifetime. Frederick James Statistical Methods in Experimental Physics Monday, 26th of November, 4 p.m. Council Chamber (Bldg. 503-1-001) The author will be introduced...

  5. [Review of research design and statistical methods in Chinese Journal of Cardiology].

    Science.gov (United States)

    Zhang, Li-jun; Yu, Jin-ming

    2009-07-01

    To evaluate the research design and the use of statistical methods in Chinese Journal of Cardiology, we reviewed the research design and statistical methods in all of the original papers published in the journal from December 2007 to November 2008. The most frequently used research designs are cross-sectional design (34%), prospective design (21%) and experimental design (25%). Of all the articles, 49 (25%) use wrong statistical methods, 29 (15%) lack some sort of statistical analysis, and 23 (12%) have inconsistencies in the description of methods. There are significant differences between different statistical methods (P ...) in Chinese Journal of Cardiology. Better research design and correct use of statistical methods are still needed. Stricter review by statisticians and epidemiologists is also required to improve the quality of the literature.

  6. Sudoku Squares as Experimental Designs

    Indian Academy of Sciences (India)

    IAS Admin

    Sudoku is a popular combinatorial puzzle. We give a brief overview of some mathematical features of a Sudoku square. Then we focus on interpreting Sudoku squares as experimental designs in order to meet a practical need. 1. Sudoku Squares and Sudoku Puzzles. The (standard) Sudoku square of order 9 entails a ...

  7. Experimental design and process optimization

    CERN Document Server

    Rodrigues, Maria Isabel; Dos Santos, Elian Luiz

    2014-01-01

    Initial Considerations; Topics of Elementary Statistics; Introductory Notions; General Ideas; Variables; Populations and Samples; Importance of the Form of the Population; First Ideas of Inference on a Normal Population; Parameters and Estimates; Notions on Testing Hypotheses; Inference of the Mean of a Normal Population; Inference of the Variance of a Normal Population; Inference of the Means of Two Normal Populations; Independent Samples; Paired Samples ...

  8. Statistical design and evaluation of biomarker studies.

    Science.gov (United States)

    Dobbin, Kevin K

    2014-01-01

    We review biostatistical aspects of biomarker studies, including design and analysis issues, covering the range of settings required for translational research, from early exploratory studies through clinical trials.

  9. An Introduction to Experimental Design Research

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2016-01-01

    Design research brings together influences from the whole gamut of social, psychological, and more technical sciences to create a tradition of empirical study stretching back over 50 years (Horvath 2004; Cross 2007). A growing part of this empirical tradition is experimental, which has gained...... design researcher. Thus, this book brings together leading researchers from across design research in order to provide the reader with a foundation in experimental design research; an appreciation of possible experimental perspectives; and insight into how experiments can be used to build robust...... and significant scientific knowledge. This chapter sets the stage for these discussions by introducing experimental design research, outlining the various types of experimental approach, and explaining the role of this book in the wider methodological context....

  10. Recommendations for statistical designs of in vivo mutagenicity tests with regard to subsequent statistical analysis.

    Science.gov (United States)

    Adler, I D; Bootman, J; Favor, J; Hook, G; Schriever-Schwemmer, G; Welzl, G; Whorton, E; Yoshimura, I; Hayashi, M

    1998-09-01

    A workshop was held on September 13 and 14, 1993, at the GSF, Neuherberg, Germany, to start a discussion of experimental design and statistical analysis issues for three in vivo mutagenicity test systems: the micronucleus test in mouse bone marrow/peripheral blood, the chromosomal aberration tests in mouse bone marrow/differentiating spermatogonia, and the mouse dominant lethal test. The discussion has now come to conclusions which we would like to make generally known. Rather than dwell upon specific statistical tests which could be used for data analysis, serious consideration was given to test design. However, the test design, its power of detecting a given increase of adverse effects and the test statistics are interrelated. Detailed analyses of historical negative control data led to important recommendations for each test system. Concerning the statistical sensitivity parameters, a type I error of 0.05 (one-tailed), a type II error of 0.20 and a dose-related increase of twice the background (negative control) frequencies were generally adopted. It was recommended that sufficient observations (cells, implants) be planned for each analysis unit (animal) so that at least one adverse outcome (micronucleus, aberrant cell, dead implant) would likely be observed. The treated animal was the smallest unit of analysis allowed. On the basis of these general considerations the sample size was determined for each of the three assays. A minimum of 2000 immature erythrocytes/animal should be scored for micronuclei from each of at least 4 animals in each comparison group in the micronucleus assays. A minimum of 200 cells should be scored for chromosomal aberrations from each of at least 5 animals in each comparison group in the aberration assays. In the dominant lethal test, a minimum of 400 implants (40-50 pregnant females) are required per dose group for each mating period. The analysis unit for the dominant lethal test would be the treated male unless the background
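
    The scoring recommendations above follow from a power calculation; a sketch of the usual two-proportion normal approximation with the workshop's sensitivity parameters (one-tailed alpha 0.05, power 0.80, a doubling of background) reproduces the right order of magnitude, e.g. roughly 9000 cells per group for a 0.2% background micronucleus frequency, though the workshop's exact derivation may differ.

```python
import numpy as np
from scipy import stats

def n_per_group(p0, alpha=0.05, power=0.80, ratio=2.0):
    """Observations per comparison group to detect a ratio-fold increase
    over background p0 (one-tailed two-proportion z-test)."""
    p1 = ratio * p0
    pbar = (p0 + p1) / 2
    za, zb = stats.norm.ppf(1 - alpha), stats.norm.ppf(power)
    num = (za * np.sqrt(2 * pbar * (1 - pbar))
           + zb * np.sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2
    return int(np.ceil(num / (p1 - p0) ** 2))

print(n_per_group(0.002))   # cells per group for a 0.2% background
```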

  11. Chemicals-Based Formulation Design: Virtual Experimentations

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul

    2011-01-01

    This paper presents a systematic procedure for virtual experimentations related to the design of liquid formulated products. All the experiments that need to be performed when designing a liquid formulated product (lotion), such as ingredient selection and testing, solubility tests and property measurements, can now be performed through the virtual Product-Process Design laboratory [[1], [2] and [3]...]. A case study on the design of an insect repellent lotion will show that the software is an essential instrument in decision making, and that it reduces time and resources since experimental efforts can be focused on one or a few product alternatives.

  12. Statistical Contributions to Bioinformatics: Design, Modeling, Structure Learning, and Integration.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran

    2017-01-01

    The advent of high-throughput multi-platform genomics technologies providing whole-genome molecular summaries of biological samples has revolutionized biomedical research. These technologies yield highly structured big data, whose analysis poses significant quantitative challenges. The field of bioinformatics has emerged to deal with these challenges, and is comprised of many quantitative and biological scientists working together to effectively process these data and extract the treasure trove of information they contain. Statisticians, with their deep understanding of variability and uncertainty quantification, play a key role in these efforts. In this article, we attempt to summarize some of the key contributions of statisticians to bioinformatics, focusing on four areas: (1) experimental design and reproducibility, (2) preprocessing and feature extraction, (3) unified modeling, and (4) structure learning and integration. In each of these areas, we highlight some key contributions and try to elucidate the key statistical principles underlying these methods and approaches. Our goals are to demonstrate major ways in which statisticians have contributed to bioinformatics, to encourage statisticians to get involved early in methods development as new technologies emerge, and to stimulate future methodological work based on the statistical principles elucidated in this article, utilizing all available information to uncover new biological insights.

  13. Yakima Hatchery Experimental Design : Annual Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Busack, Craig; Knudsen, Curtis; Marshall, Anne

    1991-08-01

    This progress report details the results and status of the Washington Department of Fisheries' (WDF) pre-facility monitoring, research, and evaluation efforts, through May 1991, designed to support the development of an Experimental Design Plan (EDP) for the Yakima/Klickitat Fisheries Project (YKFP), previously termed the Yakima/Klickitat Production Project (YKPP or Y/KPP). This pre-facility work has been guided by the planning efforts of various research and quality control teams of the project that are annually captured as revisions to the experimental design and pre-facility work plans. The current objectives are as follows: to develop a genetic monitoring and evaluation approach for the Y/KPP; to evaluate stock identification monitoring tools, approaches, and opportunities available to meet specific objectives of the experimental plan; and to evaluate adult and juvenile enumeration and sampling/collection capabilities in the Y/KPP necessary to measure experimental response variables.

  14. Statistical Performance Analysis and Modeling Techniques for Nanometer VLSI Designs

    CERN Document Server

    Shen, Ruijing; Yu, Hao

    2012-01-01

    Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level has become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits. Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and ...

  15. Experimental design optimization for screening relevant free ...

    African Journals Online (AJOL)

    Experimental design methodology was used to optimize the HPLC separation of various relevant phenolic acids from an artificial mixture. The effect of four characteristic factors of the HPLC procedure on the Chromatographic Response Function was investigated by a Central Composite Face-Centred Design and Multi ...

  16. [Statistical Process Control applied to viral genome screening: experimental approach].

    Science.gov (United States)

    Reifenberg, J M; Navarro, P; Coste, J

    2001-10-01

    During the National Multicentric Study concerning the introduction of NAT for HCV and HIV-1 viruses in blood donation screening, which was supervised by the Medical and Scientific departments of the French Blood Establishment (Etablissement français du sang--EFS), Transcription-Mediated Amplification (TMA) technology (Chiron/Gen Probe) was trialled in the Molecular Biology Laboratory of Montpellier, EFS Pyrénées-Méditerranée. After a preliminary phase of qualification of the material and training of the technicians, routine screening of homologous blood and apheresis donations using this technology was applied for two months. In order to evaluate the different NAT systems, exhaustive daily operations and data were registered. Among these, the luminescence results, expressed as RLU, of the positive and negative calibrators and the associated internal controls were analysed using Control Charts, Statistical Process Control methods, which make it possible to detect process drift rapidly and to anticipate the appearance of incidents. This study demonstrated the value of these quality control methods, mainly used for industrial purposes, for monitoring and improving the quality of any transfusion process. It also showed the difficulty of retrospectively investigating uncontrolled sources of variation in a process that was experimental. Such tools are fully consistent with the new version of the ISO 9000 norms, which are particularly focused on the use of adapted indicators for process control, and could be extended to other transfusion activities, such as blood collection and component preparation.

  17. Optimal Bayesian Experimental Design for Combustion Kinetics

    KAUST Repository

    Huan, Xun

    2011-01-04

    Experimental diagnostics play an essential role in the development and refinement of chemical kinetic models, whether for the combustion of common complex hydrocarbons or of emerging alternative fuels. Questions of experimental design—e.g., which variables or species to interrogate, at what resolution and under what conditions—are extremely important in this context, particularly when experimental resources are limited. This paper attempts to answer such questions in a rigorous and systematic way. We propose a Bayesian framework for optimal experimental design with nonlinear simulation-based models. While the framework is broadly applicable, we use it to infer rate parameters in a combustion system with detailed kinetics. The framework introduces a utility function that reflects the expected information gain from a particular experiment. Straightforward evaluation (and maximization) of this utility function requires Monte Carlo sampling, which is infeasible with computationally intensive models. Instead, we construct a polynomial surrogate for the dependence of experimental observables on model parameters and design conditions, with the help of dimension-adaptive sparse quadrature. Results demonstrate the efficiency and accuracy of the surrogate, as well as the considerable effectiveness of the experimental design framework in choosing informative experimental conditions.
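
    The expected-information-gain utility described above is easy to prototype with a nested Monte Carlo estimator. The sketch below is a minimal illustration, not the paper's combustion model: the quadratic forward map, Gaussian prior and noise level, and the one-dimensional design grid are all placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def logsumexp_rows(a):
    # Numerically stable row-wise log-sum-exp.
    m = a.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(a - m).sum(axis=1, keepdims=True))).ravel()

def forward(theta, d):
    # Toy observable; a real application would call the simulation model here.
    return theta * d + theta**2

def expected_information_gain(d, n_outer=500, n_inner=500, sigma=0.1):
    """Nested Monte Carlo estimate of the expected information gain at design d."""
    theta = rng.normal(1.0, 0.5, n_outer)                    # prior draws
    y = forward(theta, d) + rng.normal(0.0, sigma, n_outer)  # synthetic data
    log_like = -0.5 * ((y - forward(theta, d)) / sigma) ** 2
    theta_in = rng.normal(1.0, 0.5, n_inner)                 # fresh prior draws
    resid = (y[:, None] - forward(theta_in[None, :], d)) / sigma
    # log evidence: log p(y|d) ~= log (1/N) sum_j p(y|theta_j, d); the Gaussian
    # normalizing constants cancel between log_like and log_ev.
    log_ev = logsumexp_rows(-0.5 * resid**2) - np.log(n_inner)
    return np.mean(log_like - log_ev)

designs = np.linspace(0.0, 2.0, 21)
eig = [expected_information_gain(d) for d in designs]
print("most informative design:", designs[int(np.argmax(eig))])
```

    The inner Monte Carlo loop over fresh prior draws is exactly the cost the paper avoids by substituting a polynomial surrogate for the forward model.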

  18. Integrating statistical predictions and experimental verifications for enhancing protein-chemical interaction predictions in virtual screening.

    Directory of Open Access Journals (Sweden)

    Nobuyoshi Nagamine

    2009-06-01

    Predictions of interactions between target proteins and potential leads are of great benefit in the drug discovery process. We present a comprehensively applicable statistical prediction method for interactions between any proteins and chemical compounds, which requires only protein sequence data and chemical structure data and utilizes the statistical learning method of support vector machines. In order to realize reasonable comprehensive predictions which can involve many false positives, we propose two approaches for reduction of false positives: (i) efficient use of multiple statistical prediction models in the framework of two-layer SVM and (ii) reasonable design of the negative data to construct statistical prediction models. In two-layer SVM, outputs produced by the first-layer SVM models, which are constructed with different negative samples and reflect different aspects of classifications, are utilized as inputs to the second-layer SVM. In order to design negative data which produce fewer false positive predictions, we iteratively construct SVM models or classification boundaries from positive and tentative negative samples and select additional negative sample candidates according to pre-determined rules. Moreover, in order to fully utilize the advantages of statistical learning methods, we propose a strategy to effectively feed back experimental results to computational predictions with consideration of biological effects of interest. We show the usefulness of our approach in predicting potential ligands binding to human androgen receptors from more than 19 million chemical compounds and verifying these predictions by in vitro binding. Moreover, we utilize this experimental validation as feedback to enhance subsequent computational predictions, and experimentally validate these predictions again. This efficient procedure of the iteration of the in silico prediction and in vitro or in vivo experimental verifications with the sufficient ...
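
    The two-layer SVM idea lends itself to a compact sketch: several first-layer SVMs are trained against different negative subsamples, and their decision values become the inputs of a second-layer SVM. A minimal illustration with scikit-learn follows; the synthetic Gaussian features stand in for real protein-chemical descriptors, and all parameter choices are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-ins for protein-chemical feature vectors (positives vs negatives).
X_pos = rng.normal(1.0, 1.0, (100, 20))
X_neg = rng.normal(-1.0, 1.0, (300, 20))

# First layer: every model sees all positives but a different negative subsample.
first_layer = []
for seed in range(5):
    idx = np.random.default_rng(seed).choice(len(X_neg), size=100, replace=False)
    X = np.vstack([X_pos, X_neg[idx]])
    y = np.r_[np.ones(len(X_pos)), np.zeros(len(idx))]
    first_layer.append(SVC(kernel="rbf", gamma="scale").fit(X, y))

def meta_features(X):
    # Decision values of the first-layer SVMs become second-layer inputs.
    return np.column_stack([m.decision_function(X) for m in first_layer])

# Second layer combines the first-layer views. (For a fair evaluation the
# second layer should be trained on held-out decision values.)
X_all = np.vstack([X_pos, X_neg])
y_all = np.r_[np.ones(len(X_pos)), np.zeros(len(X_neg))]
second = SVC(kernel="linear").fit(meta_features(X_all), y_all)
print("training accuracy:", second.score(meta_features(X_all), y_all))
```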

  19. Statistical Design Model (SDM) of satellite thermal control subsystem

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Thermal control is the satellite subsystem whose main task is to keep the satellite components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the scarcity of information provided by companies and designers, this fundamental subsystem still lacks a specific design process. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, and then extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. In the next part, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified through a case study: comparisons between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results showed the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  20. Effects of different building blocks designs on the statistical ...

    African Journals Online (AJOL)

    Tholang T. Mokhele

    Mokhele T, Mutanga O and Ahmed F. Effects of different building block designs on the statistical characteristics of Automated Zone-design Tool output areas. South African Journal of Geomatics, Vol. 6, No. 2, Geomatics Indaba 2017 Special Edition, August 2017. ...

  1. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics, and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics, such as outlier detection and regression. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
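
    For statistics that are monotone in each data point, such as the mean and the percentiles, the interval bounds described in the report are straightforward to compute: apply the statistic to all lower endpoints and to all upper endpoints. A minimal sketch with made-up intervals (interval variance, which is much harder in general, is not attempted):

```python
import numpy as np

# Each measurement is an interval [lo, hi] encoding epistemic uncertainty.
data = np.array([[1.0, 1.4], [2.1, 2.2], [0.8, 1.5], [1.9, 2.6]])
lo, hi = data[:, 0], data[:, 1]

# The sample mean of interval data is itself an interval: the tightest
# bounds come from the means of the endpoints.
mean_interval = (lo.mean(), hi.mean())

# Percentiles are also monotone in each data point, so the same trick works.
median_interval = (np.median(lo), np.median(hi))

print("mean   in", mean_interval)
print("median in", median_interval)
```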

  2. Experimental design research approaches, perspectives, applications

    CERN Document Server

    Stanković, Tino; Štorga, Mario

    2016-01-01

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current research practice where methods are diverging and integration between individual, team and organizational under...

  3. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the ...

  4. Fully Bayesian Experimental Design for Pharmacokinetic Studies

    Directory of Open Access Journals (Sweden)

    Elizabeth G. Ryan

    2015-03-01

    Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future dataset drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature, which rapidly obtains samples from the posterior, is importance sampling, using the prior as the importance distribution. However, importance sampling from the prior will tend to break down if there is a reasonable number of experimental observations. In this paper, we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study, which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times that produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
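
    The core idea, using a Laplace approximation of the posterior as the importance distribution, can be sketched in a few lines for a one-parameter toy model (the model, prior and data below are illustrative assumptions, not the paper's pharmacokinetic model):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
y = rng.normal(1.5, 0.3, 20)                      # one simulated future dataset

def neg_log_post(theta):
    prior = 0.5 * theta**2                        # N(0, 1) prior
    like = 0.5 * np.sum((y - theta) ** 2) / 0.3**2
    return prior + like

# Laplace approximation: mode and curvature of the log posterior.
mode = minimize_scalar(neg_log_post).x
h = 1e-4
hess = (neg_log_post(mode + h) - 2 * neg_log_post(mode) + neg_log_post(mode - h)) / h**2
sd = 1.0 / np.sqrt(hess)

# Importance sampling with the Laplace Gaussian rather than the prior.
theta_s = rng.normal(mode, sd, 2000)
log_q = -0.5 * ((theta_s - mode) / sd) ** 2 - np.log(sd)
log_w = -np.array([neg_log_post(t) for t in theta_s]) - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()                                      # self-normalized weights
print("posterior mean estimate:", np.sum(w * theta_s))
```

    Because the importance distribution already sits on the posterior, the weights stay well behaved even as the number of observations grows, which is exactly where prior-based importance sampling degrades.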

  5. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...

  6. Chemical-Based Formulation Design: Virtual Experimentation

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul

    This paper presents a software, the virtual Product-Process Design laboratory (virtual PPD-lab), and the virtual experimental scenarios for design/verification of consumer oriented liquid formulated products where the software can be used. For example, the software can be employed for the design ...; the addition of the missing chemicals to an incomplete formulation and the verification of the final product. The software is based on a framework that allows quick implementation of different design/verification work-flows and their associated models, methods, tools and data. The software contains a suite of databases with data of AIs used in different products (such as insect repellents), solvents classified in terms of special characteristics (such as solubility in water), and additives classified in terms of their application (such as aroma agents, wetting agents and preservatives). In addition, the software ...

  7. Statistical fatigue experiment design in medium density fiberboard

    Directory of Open Access Journals (Sweden)

    Martínez Espinosa Mariano

    2000-01-01

    Medium Density Fiberboard (MDF) is a wood-based composite widely employed in several industrial applications, in addition to its use in structures subjected to dynamic loads. Its fatigue-related aspects, however, have been consistently ignored. This work proposes to study fatigue in MDF, including the following factors: the basic concepts of MDF and fatigue, and the statistical design of fatigue experiments in MDF, with the purpose of obtaining accurate information for analysis by means of statistical methods. The results of our tests revealed that the statistical model is suitable to fit the number of cycles at intermediate S and f levels and to determine the levels of the factors that maximize the total number of cycles to failure. It was also found that the proposed design is of great practical interest for the fatigue strength in tension of wood and wood derivatives.

  8. Statistical Applications and Quantitative Design for Injury Prevention ...

    African Journals Online (AJOL)

    ... editor of the International Journal of Injury Control and Safety Promotion, conducted a five-day workshop on “Statistical applications and quantitative design for injury prevention research” from 18–21 August 2008 at the MRC in Cape Town, South Africa. The target audience for this workshop was researchers (with some ...

  9. Designing Statistical Language Learners: Experiments on Noun Compounds

    Science.gov (United States)

    Lauer, Mark

    1996-09-01

    The goal of this thesis is to advance the exploration of the statistical language learning design space. In pursuit of that goal, the thesis makes two main theoretical contributions: (i) it identifies a new class of designs by specifying an architecture for natural language analysis in which probabilities are given to semantic forms rather than to more superficial linguistic elements; and (ii) it explores the development of a mathematical theory to predict the expected accuracy of statistical language learning systems in terms of the volume of data used to train them. The theoretical work is illustrated by applying statistical language learning designs to the analysis of noun compounds. Both syntactic and semantic analysis of noun compounds are attempted using the proposed architecture. Empirical comparisons demonstrate that the proposed syntactic model is significantly better than those previously suggested, approaching the performance of human judges on the same task, and that the proposed semantic model, the first statistical approach to this problem, exhibits significantly better accuracy than the baseline strategy. These results suggest that the new class of designs identified is a promising one. The experiments also serve to highlight the need for a widely applicable theory of data requirements.

  10. Design and Statistics in Quantitative Translation (Process) Research

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

    2015-01-01

    Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars since this way of thinking is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product.

  11. Accelerating experimental high-order spatial statistics calculations using GPUs

    Science.gov (United States)

    Li, Xue; Huang, Tao; Lu, De-Tang; Niu, Cong

    2014-09-01

    High-order spatial statistics have been widely used to describe spatial phenomena in the field of geology science. Spatial statistics are subject to an extremely heavy computational burden for large geostatistical models. To improve the computational efficiency, a parallel approach based on the GPU (Graphics Processing Unit) is proposed for the calculation of high-order spatial statistics. The parallel scheme is achieved through a two-stage method: calculating the replicates of a moment for a given template simultaneously, termed node-stage parallelism, and transforming the spatial moments to cumulants for all lags of a template simultaneously, termed template-stage parallelism. Also, a series of optimization strategies are proposed to take full advantage of the computational capabilities of GPUs, including the appropriate task allocation to the CUDA (Compute Unified Device Architecture) threads, proper organization of the GPU physical memory, and optimal improvement of the existing parallel routines. Tests are carried out on two training images to compare the performance of the GPU-based method with that of the serial implementation. Error analysis results indicate that the proposed parallel method can generate accurate cumulant maps, and the performance comparisons on various examples show that all the speedups for third-order, fourth-order and fifth-order cumulant calculations are over 17 times.

  12. Study designs and statistical methods in the Journal of Family and Community Medicine: 1994-2010.

    Science.gov (United States)

    Aljoudi, Abdullah S

    2013-01-01

    The Journal of Family and Community Medicine (JFCM) is the official peer-reviewed scientific publication of the Saudi Society of Family and Community Medicine. Unlike those of many peer medical journals, the contents of the JFCM have never been analyzed. The objective of this study was to perform an analysis of the contents of the JFCM over a 16-year period to discern the study designs and statistical methods used, with a view to improving future contents of the journal. All volumes of the JFCM, from 1 January 1994 to 31 December 2010, were hand searched for research articles. All papers identified as original articles were selected. For every article, the study designs and the statistical methods used were recorded. Articles were then classified according to their statistical methods and study designs. The frequency of study designs was calculated as a simple percentage of the total number of articles, while the frequency of statistical methods was calculated as a percentage of articles that used those statistical methods. A total of 229 articles were analyzed. Of these, 66 (28.8%) either reported no statistics or reported simple summaries. The cross-sectional design was used in 175 (76.4%) of all analyzed articles. Statistical methods were used in 163 (71.2%) articles. The chi-squared test was used in 111 (68.1%) articles, and the t-test in 48 (29.4%) articles. Other common statistical tests were: regression, used in 35 (21.5%) articles; ANOVA, used in 23 (14.1%) articles; and odds ratio and relative risk tests, used in 22 (13.5%) articles. The JFCM has a wide range of study designs and statistical methods. However, no article on experimental studies has been published in the JFCM since its inception.

  13. Enhanced surrogate models for statistical design exploiting space mapping technology

    DEFF Research Database (Denmark)

    Koziel, Slawek; Bandler, John W.; Mohamed, Achmed S.

    2005-01-01

    We present advances in microwave and RF device modeling exploiting Space Mapping (SM) technology. We propose new SM modeling formulations utilizing input mappings, output mappings, frequency scaling and quadratic approximations. Our aim is to enhance circuit models for statistical analysis...... and yield-driven design. We illustrate our results using a capacitively-loaded two-section impedance transformer, a single-resonator waveguide filter and a six-section H-plane waveguide filter....

  14. Fast Bayesian optimal experimental design and its applications

    KAUST Repository

    Long, Quan

    2015-01-07

    We summarize our Laplace method and multilevel method for accelerating the computation of the expected information gain in a Bayesian Optimal Experimental Design (OED). The Laplace method is a widely used method to approximate an integral in statistics. We analyze this method in the context of optimal Bayesian experimental design and extend it from the classical scenario, where a single dominant mode of the parameters can be completely determined by the experiment, to scenarios where a non-informative parametric manifold exists. We show that by carrying out this approximation the estimation of the expected Kullback-Leibler divergence can be significantly accelerated. While the Laplace method requires a concentration of measure, the multilevel Monte Carlo method can be used to tackle the problem when there is a lack of measure concentration. We show some initial results on this approach. The developed methodologies have been applied to various sensor deployment problems, e.g., impedance tomography and seismic source inversion.

  15. Statistical issues in the design and planning of proteomic profiling experiments.

    Science.gov (United States)

    Cairns, David A

    2015-01-01

    The statistical design of a clinical proteomics experiment is a critical part of well-undertaken investigation. Standard concepts from experimental design such as randomization, replication and blocking should be applied in all experiments, and this is possible when the experimental conditions are well understood by the investigator. The large number of proteins simultaneously considered in proteomic discovery experiments means that determining the number of required replicates to perform a powerful experiment is more complicated than in simple experiments. However, by using information about the nature of an experiment and making simple assumptions this is achievable for a variety of experiments useful for biomarker discovery and initial validation.
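
    As a concrete illustration of the replicate calculation, a per-protein two-group comparison with a Bonferroni-adjusted significance level can be sized with the standard normal-approximation power formula. The effect size and protein count below are assumptions for the example, not values from the text:

```python
import numpy as np
from scipy.stats import norm

def replicates_per_group(effect_sd_units, n_proteins, alpha=0.05, power=0.8):
    """Approximate n per group for a two-sample z-test, Bonferroni-corrected."""
    a = alpha / n_proteins                 # family-wise error control
    z_a = norm.ppf(1 - a / 2)              # two-sided critical value
    z_b = norm.ppf(power)
    return int(np.ceil(2 * ((z_a + z_b) / effect_sd_units) ** 2))

# e.g. detecting a one-SD abundance difference screened across 1,000 proteins
print(replicates_per_group(1.0, 1000))     # roughly 48 per group
```

    The multiple-testing correction is what drives the sample size far above the handful of replicates adequate for a single-protein experiment.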

  16. Fish Attraction Devices (FADs and experimental designs

    Directory of Open Access Journals (Sweden)

    Michael J. Kingsford

    1999-12-01

    There is widespread use of fish attraction devices (FADs) in commercial fisheries and research. Investigations on the utility of FADs to catch fishes, and factors influencing fishes associated with FADs, require careful consideration of experimental designs. Appropriate models, from observations and the literature, should be developed before hypotheses can be tested with robust sampling designs. Robust sampling designs may only be possible if investigators have some role in the planning stage of deploying FADs. If the objective of the study is to determine the influence of FADs on assemblages of fishes, then experimenters need to consider that a 'FAD effect' (= impact) cannot be demonstrated without controls. Some preliminary studies may be required to determine the spatial extent of a FAD effect before suitable sites can be chosen for controls. Other controls may also be necessary, depending on the method used to estimate numbers of fishes (e.g. controls for disturbance). Recent advances in sampling designs that are applicable to impact studies are discussed. Beyond-BACI (Before After Control Impact) and MBACI (Multiple BACI) designs are recommended because they cater for temporal and spatial variation in the abundance of organisms, which is generally great for pelagic fishes. The utility of orthogonal sampling designs is emphasised as a means of elucidating the influence of multiple factors and, importantly, interactions between them. Further, nested analyses are suggested to deal with multiple temporal and/or spatial time scales in sampling designs. The independence of replicate FADs should also be considered. Problems of independence include: FADs that are connected, thus providing potential routes of movement of associated fishes; temporal dependence, where the number of fish at a time influences the number at the next time due to fish becoming residents; and the fact that the proximity of other FADs may influence numbers of ...

  17. Design and statistical analysis of oral medicine studies: common pitfalls.

    Science.gov (United States)

    Baccaglini, L; Shuster, J J; Cheng, J; Theriaque, D W; Schoenbach, V J; Tomar, S L; Poole, C

    2010-04-01

    A growing number of articles are emerging in the medical and statistics literature that describe epidemiologic and statistical flaws of research studies. Many examples of these deficiencies are encountered in the oral, craniofacial, and dental literature. However, only a handful of methodologic articles have been published in the oral literature warning investigators of potential errors that may arise early in the study and that can irreparably bias the final results. In this study, we briefly review some of the most common pitfalls that our team of epidemiologists and statisticians has identified during the review of submitted or published manuscripts and research grant applications. We use practical examples from the oral medicine and dental literature to illustrate potential shortcomings in the design and analysis of research studies, and how these deficiencies may affect the results and their interpretation. A good study design is essential, because errors in the analysis can be corrected if the design was sound, but flaws in study design can lead to data that are not salvageable. We recommend consultation with an epidemiologist or a statistician during the planning phase of a research study to optimize study efficiency, minimize potential sources of bias, and document the analytic plan.

  18. Set membership experimental design for biological systems

    Directory of Open Access Journals (Sweden)

    Marvel Skylar W

    2012-03-01

    Background: Experimental design approaches for biological systems are needed to help conserve the limited resources that are allocated for performing experiments. The assumptions used when assigning probability density functions to characterize uncertainty in biological systems are unwarranted when only a small number of measurements can be obtained. In these situations, the uncertainty in biological systems is more appropriately characterized in a bounded-error context. Additionally, effort must be made to improve the connection between modelers and experimentalists by relating design metrics to biologically relevant information. Bounded-error experimental design approaches that can assess the impact of additional measurements on model uncertainty are needed to identify the most appropriate balance between the collection of data and the availability of resources. Results: In this work we develop a bounded-error experimental design framework for nonlinear continuous-time systems when few data measurements are available. This approach leverages many of the recent advances in bounded-error parameter and state estimation methods that use interval analysis to generate parameter sets and state bounds consistent with uncertain data measurements. We devise a novel approach using set-based uncertainty propagation to estimate measurement ranges at candidate time points. We then use these estimated measurements at the candidate time points to evaluate which candidate measurements furthest reduce model uncertainty. A method for quickly combining multiple candidate time points is presented and allows for determining the effect of adding multiple measurements. Biologically relevant metrics are developed and used to predict when new data measurements should be acquired, which system components should be measured and how many additional measurements should be obtained. Conclusions: The practicability of our approach is illustrated with a case study. This ...

  19. Applying Statistical Design to Control the Risk of Over-Design with Stochastic Simulation

    Directory of Open Access Journals (Sweden)

    Yi Wu

    2010-02-01

    By comparing a hard real-time system and a soft real-time system, this article elicits the risk of over-design in soft real-time system design. To deal with this risk, a novel concept of statistical design is proposed. Statistical design is the process of accurately accounting for and mitigating the effects of variation in part geometry and other environmental conditions, while at the same time optimizing a target performance factor. However, statistical design can be a very difficult and complex task when using classical mathematical methods. Thus, a simulation methodology to optimize the design is proposed in order to bridge the gap between real-time analysis and optimization for robust and reliable system design.

  20. Statistical Design of Genetic Algorithms for Combinatorial Optimization Problems

    Directory of Open Access Journals (Sweden)

    Moslem Shahsavar

    2011-01-01

    Many genetic algorithms (GAs) have been applied to solve different NP-complete combinatorial optimization problems so far. The striking point of using a GA is selecting a combination of appropriate patterns in crossover, mutation, and so forth, and fine tuning of some parameters such as crossover probability, mutation probability, and so forth. One way to design a robust GA is to select an optimal pattern and then to search for its parameter values using a tuning procedure. This paper addresses a methodology for both the optimal pattern selection and the tuning phases by taking advantage of design of experiments and response surface methodology. To show the performance of the proposed procedure and demonstrate its applications, it is employed to design a robust GA to solve a project scheduling problem. Through statistical comparison analyses between the performances of the proposed method and an existing GA, the effectiveness of the methodology is shown.

  1. The Impact of Statistical Leakage Models on Design Yield Estimation

    Directory of Open Access Journals (Sweden)

    Rouwaida Kanj

    2011-01-01

    Device mismatch and process variation models play a key role in determining the functionality and yield of sub-100 nm designs. Average characteristics are often of interest, such as the average leakage current or the average read delay. However, detecting rare functional fails is critical for memory design, and designers often seek techniques that enable such events to be modeled accurately. Extremely leaky devices can inflict functionality fails. The plurality of leaky devices on a bitline increases the dimensionality of the yield estimation problem. Simplified models are possible by adopting approximations to the underlying sum of lognormals. The implications of such approximations on tail probabilities may in turn bias the yield estimate. We review different closed-form approximations and compare them against the CDF matching method, which is shown to be the most effective method for accurate statistical leakage modeling.
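
    One classical closed-form approximation in this family is Fenton-Wilkinson moment matching: the sum of lognormals is replaced by a single lognormal with the same first two moments, whose tail can then be read off directly. A sketch comparing it against a Monte Carlo reference, with illustrative leakage parameters:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
mu, sigma, n_dev = 0.0, 0.8, 8     # per-device log-leakage parameters (illustrative)

# Fenton-Wilkinson: match the mean and variance of the sum of n iid lognormals.
m1 = n_dev * np.exp(mu + sigma**2 / 2)
v = n_dev * (np.exp(sigma**2) - 1) * np.exp(2 * mu + sigma**2)
sigma_fw = np.sqrt(np.log(1 + v / m1**2))
mu_fw = np.log(m1) - sigma_fw**2 / 2

# Monte Carlo reference for a tail probability of the total (bitline) leakage.
total = np.exp(rng.normal(mu, sigma, (200_000, n_dev))).sum(axis=1)
t = np.quantile(total, 0.999)
p_mc = np.mean(total > t)
p_fw = 1 - norm.cdf((np.log(t) - mu_fw) / sigma_fw)
print(f"P(total > {t:.2f}): MC = {p_mc:.1e}, Fenton-Wilkinson = {p_fw:.1e}")
```

    The gap between the two tail estimates is exactly the kind of approximation-induced yield bias the paper analyzes.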

  2. Design sensitivity and statistical power in acceptability judgment experiments

    Directory of Open Access Journals (Sweden)

    Jon Sprouse

    2017-02-01

    Previous investigations into the validity of acceptability judgment data have focused almost exclusively on type I errors (or false positives) because of the consequences of such errors for syntactic theories (Sprouse & Almeida 2012; Sprouse et al. 2013). The current study complements these previous studies by systematically investigating the type II error rate (false negatives), or equivalently, the statistical power, of a wide cross-section of possible acceptability judgment experiments. Though type II errors have historically been assumed to be less costly than type I errors, the dynamics of scientific publishing mean that high type II error rates (i.e., studies with low statistical power) can lead to increases in type I error rates in a given field of study. We present a set of experiments and resampling simulations to estimate statistical power for four tasks (forced-choice, Likert scale, magnitude estimation, and yes-no), 50 effect sizes instantiated by real phenomena, sample sizes from 5 to 100 participants, and two approaches to statistical analysis (null hypothesis and Bayesian). Our goals are twofold: (i) to provide a fuller picture of the status of acceptability judgment data in syntax, and (ii) to provide detailed information that syntacticians can use to design and evaluate the sensitivity of acceptability judgment experiments in their own research.
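
    The resampling logic behind such power estimates is simple to reproduce: simulate many experiments at a given effect size and sample size, and count how often the test reaches significance. In the sketch below a normal generator stands in for resampling from real judgment data, and the effect size and choice of test are assumptions:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)

def power(effect, n_participants, n_sims=2000, alpha=0.05):
    """Fraction of simulated experiments that detect the effect."""
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_participants)      # condition A ratings
        b = rng.normal(effect, 1.0, n_participants)   # condition B ratings
        if ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_sims

for n in (10, 25, 50, 100):
    print(n, power(0.5, n))
```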

  3. Statistical designs and response surface techniques for the optimization of chromatographic systems.

    Science.gov (United States)

    Ferreira, Sergio Luis Costa; Bruns, Roy Edward; da Silva, Erik Galvão Paranhos; Dos Santos, Walter Nei Lopes; Quintella, Cristina Maria; David, Jorge Mauricio; de Andrade, Jailson Bittencourt; Breitkreitz, Marcia Cristina; Jardim, Isabel Cristina Sales Fontes; Neto, Benicio Barros

    2007-07-27

    This paper describes the fundamentals and applications of multivariate statistical techniques for the optimization of chromatographic systems. The response surface methodologies central composite design, Doehlert matrix and Box-Behnken design are discussed, and applications of these techniques for optimization of sample preparation steps (extractions) and determination of experimental conditions for chromatographic separations are presented. The use of mixture designs for optimization of mobile phases is also reviewed. An optimization example involving a real separation process is exhaustively described. A discussion about model validation is presented. Some applications of other multivariate techniques for optimization of chromatographic methods are also summarized.
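
    As an illustration of one of these designs, a two-factor central composite design can be built directly from its three point classes (factorial, axial and center points) and used to fit the quadratic response-surface model. The response values below are synthetic; only the design geometry follows the standard construction:

```python
import numpy as np
from itertools import product

alpha = np.sqrt(2)                     # rotatable axial distance for k = 2 factors
factorial = np.array(list(product([-1, 1], repeat=2)), float)
axial = np.array([[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]])
center = np.zeros((3, 2))              # replicated center points
X = np.vstack([factorial, axial, center])

# Synthetic response with a known quadratic optimum, plus noise.
rng = np.random.default_rng(5)
y = 10 - (X[:, 0] - 0.3) ** 2 - 2 * (X[:, 1] + 0.2) ** 2 + rng.normal(0, 0.1, len(X))

# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted response-surface coefficients:", np.round(coef, 2))
```

    Eleven runs (4 factorial, 4 axial, 3 center) suffice to estimate all six quadratic coefficients with some replication at the center for pure error.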

  4. A statistical design for testing apomictic diversification through linkage analysis.

    Science.gov (United States)

    Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling

    2014-03-01

    The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels, from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and test its difference between the two plant populations or species used as the parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.

  5. Experimental design data for the biosynthesis of citric acid using Central Composite Design method.

    Science.gov (United States)

    Kola, Anand Kishore; Mekala, Mallaiah; Goli, Venkat Reddy

    2017-06-01

    In the present investigation, we report the statistical design and optimization of significant variables for the microbial production of citric acid from sucrose in the presence of the filamentous fungus A. niger NCIM 705. Various combinations of experiments were designed with the Central Composite Design (CCD) of Response Surface Methodology (RSM) for the production of citric acid as a function of six variables: initial sucrose concentration, initial pH of the medium, fermentation temperature, incubation time, stirrer rotational speed, and oxygen flow rate. From the experimental data, a statistical model for this process has been developed. The optimum conditions reported in the present article are an initial sucrose concentration of 163.6 g/L, initial medium pH of 5.26, stirrer rotational speed of 247.78 rpm, incubation time of 8.18 days, fermentation temperature of 30.06 °C and oxygen flow rate of 1.35 lpm. Under optimum conditions the predicted maximum citric acid is 86.42 g/L. The experimental validation carried out under the optimal values reported the citric acid to be 82.0 g/L. The model is able to represent the experimental data, and the agreement between the model and the experimental data is good.

  6. Experimental design data for the biosynthesis of citric acid using Central Composite Design method

    Directory of Open Access Journals (Sweden)

    Anand Kishore Kola

    2017-06-01

    In the present investigation, we report the statistical design and optimization of significant variables for the microbial production of citric acid from sucrose in the presence of the filamentous fungus A. niger NCIM 705. Various combinations of experiments were designed with the Central Composite Design (CCD) of Response Surface Methodology (RSM) for the production of citric acid as a function of six variables: initial sucrose concentration, initial pH of the medium, fermentation temperature, incubation time, stirrer rotational speed, and oxygen flow rate. From the experimental data, a statistical model for this process has been developed. The optimum conditions reported in the present article are an initial sucrose concentration of 163.6 g/L, initial medium pH of 5.26, stirrer rotational speed of 247.78 rpm, incubation time of 8.18 days, fermentation temperature of 30.06 °C and oxygen flow rate of 1.35 lpm. Under optimum conditions the predicted maximum citric acid is 86.42 g/L. The experimental validation carried out under the optimal values reported the citric acid to be 82.0 g/L. The model is able to represent the experimental data, and the agreement between the model and the experimental data is good.

  7. Plant metabolomics: from experimental design to knowledge extraction.

    Science.gov (United States)

    Rai, Amit; Umashankar, Shivshankar; Swarup, Sanjay

    2013-01-01

    Metabolomics is one of the most recent additions to the functional genomics approaches. It involves the use of analytical chemistry techniques to provide high-density data of metabolic profiles. Data are then analyzed using advanced statistics and databases to extract biological information, thus providing the metabolic phenotype of an organism. The large variety of metabolites produced by plants through complex metabolic networks, and their dynamic changes in response to various perturbations, can be studied using metabolomics. Here, we describe the basic features of plant metabolic diversity and analytical methods to characterize this diversity, including experimental workflows starting from experimental design, sample preparation, hardware and software choices, combined with knowledge extraction methods. Finally, we describe a scenario for using these workflows to identify differential metabolites and their pathways from complex biological samples.

  8. D-Optimal Experimental Design for Contaminant Source Identification

    Science.gov (United States)

    Sai Baba, A. K.; Alexanderian, A.

    2016-12-01

    Contaminant source identification seeks to estimate the release history of a conservative solute given point concentration measurements at some time after the release. This can be mathematically expressed as an inverse problem, with a linear observation operator or a parameter-to-observation map, which we tackle using a Bayesian approach. Acquisition of experimental data can be laborious and expensive. The goal is to control the experimental parameters, in our case the sparsity of the sensors, to maximize the information gain subject to some physical or budget constraints. This is known as optimal experimental design (OED). D-optimal experimental design seeks to maximize the expected information gain, and has long been considered the gold standard in the statistics community. Our goal is to develop scalable methods for D-optimal experimental designs involving large-scale PDE-constrained problems with high-dimensional parameter fields. A major challenge for OED is that a nonlinear optimization algorithm for the D-optimality criterion requires repeated evaluation of the objective function and gradient, involving the determinant of large and dense matrices; this cost can be prohibitively expensive for applications of interest. We propose novel randomized matrix techniques that bring down the computational costs of the objective function and gradient evaluations by several orders of magnitude compared to the naive approach. The effect of randomized estimators on the accuracy and the convergence of the optimization solver will be discussed. The features and benefits of our new approach will be demonstrated on a challenging model problem from contaminant source identification involving the inference of the initial condition from spatio-temporal observations in a time-dependent advection-diffusion problem.
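
    For a linear Gaussian inverse problem, the D-optimality criterion reduces to maximizing the log-determinant of the posterior precision matrix, which already makes a naive greedy sensor selection easy to write down. The dense toy forward operator below is an assumption; the paper's point is precisely that these exact determinant evaluations become infeasible at scale and call for randomized estimators:

```python
import numpy as np

rng = np.random.default_rng(6)
G = rng.normal(size=(60, 10))          # candidate-sensor rows of the forward map
sigma2, n_pick = 0.1, 8                # noise variance, sensor budget

def log_det_posterior_precision(rows):
    # Gaussian prior N(0, I): posterior precision is I + G_s' G_s / sigma2.
    Gs = G[list(rows)]
    H = np.eye(G.shape[1]) + Gs.T @ Gs / sigma2
    return np.linalg.slogdet(H)[1]

chosen = []
for _ in range(n_pick):
    # Add the sensor that most increases the D-optimality objective.
    best = max((r for r in range(len(G)) if r not in chosen),
               key=lambda r: log_det_posterior_precision(chosen + [r]))
    chosen.append(best)
print("greedy D-optimal sensor set:", sorted(chosen))
```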

  9. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research

  10. Two-stage microbial community experimental design.

    Science.gov (United States)

    Tickle, Timothy L; Segata, Nicola; Waldron, Levi; Weingart, Uri; Huttenhower, Curtis

    2013-12-01

    Microbial community samples can be efficiently surveyed in high throughput by sequencing markers such as the 16S ribosomal RNA gene. Often, a collection of samples is then selected for subsequent metagenomic, metabolomic or other follow-up. Two-stage study design has long been used in ecology but has not yet been studied in-depth for high-throughput microbial community investigations. To avoid ad hoc sample selection, we developed and validated several purposive sample selection methods for two-stage studies (that is, biological criteria) targeting differing types of microbial communities. These methods select follow-up samples from large community surveys, with criteria including samples typical of the initially surveyed population, targeting specific microbial clades or rare species, maximizing diversity, representing extreme or deviant communities, or identifying communities distinct or discriminating among environment or host phenotypes. The accuracies of each sampling technique and their influences on the characteristics of the resulting selected microbial community were evaluated using both simulated and experimental data. Specifically, all criteria were able to identify samples whose properties were accurately retained in 318 paired 16S amplicon and whole-community metagenomic (follow-up) samples from the Human Microbiome Project. Some selection criteria resulted in follow-up samples that were strongly non-representative of the original survey population; diversity maximization particularly undersampled community configurations. Only selection of intentionally representative samples minimized differences in the selected sample set from the original microbial survey. An implementation is provided as the microPITA (Microbiomes: Picking Interesting Taxa for Analysis) software for two-stage study design of microbial communities.
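
    Of the selection criteria listed, diversity maximization is the simplest to sketch: greedily pick each follow-up sample to be as far as possible from those already chosen under a beta-diversity metric. The random abundance table below is a stand-in for a real 16S survey, and the greedy max-min rule is an illustrative choice, not necessarily microPITA's exact algorithm:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(7)
abundances = rng.random((50, 200))                  # 50 samples x 200 taxa
D = squareform(pdist(abundances, metric="braycurtis"))

def max_diversity_subset(D, k):
    """Greedy max-min selection: each pick is farthest from those already chosen."""
    chosen = [int(np.argmax(D.sum(axis=1)))]        # seed with the most distinct sample
    while len(chosen) < k:
        rest = [i for i in range(len(D)) if i not in chosen]
        chosen.append(max(rest, key=lambda i: D[i, chosen].min()))
    return chosen

print("follow-up samples:", max_diversity_subset(D, 6))
```

    As the abstract notes, such a subset deliberately oversamples extreme community configurations, so it is the wrong choice when the follow-up set must be representative of the survey population.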

  11. A new method to determine the number of experimental data using statistical modeling methods

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jung-Ho; Kang, Young-Jin; Lim, O-Kaung; Noh, Yoojeong [Pusan National University, Busan (Korea, Republic of)

    2017-06-15

    For analyzing the statistical performance of physical systems, statistical characteristics of physical parameters such as material properties need to be estimated by collecting experimental data. For accurate statistical modeling, many such experiments may be required, but data are usually quite limited owing to the cost and time constraints of experiments. In this study, a new method for determining a reasonable number of experimental data is proposed using an area metric, after obtaining statistical models using the information on the underlying distribution, the Sequential statistical modeling (SSM) approach, and the Kernel density estimation (KDE) approach. The area metric is used as a convergence criterion to determine the necessary and sufficient number of experimental data to be acquired. The proposed method is validated in simulations, using different statistical modeling methods, different true models, and different convergence criteria. An example data set with 29 data points describing the fatigue strength coefficient of SAE 950X is used for demonstrating the performance of the obtained statistical models, which use a pre-determined number of experimental data in predicting the probability of failure for a target fatigue life.

  12. Autonomous entropy-based intelligent experimental design

    Science.gov (United States)

    Malakar, Nabin Kumar

    2011-07-01

    The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. This is a major achievement of this thesis, as it allows the information-based collaboration between two robotic units towards a same
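
    The inquiry-engine rule, choose the experiment whose distribution of predicted outcomes has maximum entropy, can be sketched directly from posterior samples. The sinusoidal forward model and the histogram-based entropy estimate below are toy assumptions, not the thesis's robotic-arm setup:

```python
import numpy as np

rng = np.random.default_rng(8)
thetas = rng.normal(1.0, 0.4, 1000)    # posterior samples of a model parameter

def outcome_entropy(d, bins=20, sigma=0.2):
    """Entropy of the predictive distribution of the measurement at design d."""
    y = np.sin(thetas * d) + rng.normal(0, sigma, len(thetas))  # toy predictions
    counts, _ = np.histogram(y, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

designs = np.linspace(0.1, 3.0, 30)
best = designs[int(np.argmax([outcome_entropy(d) for d in designs]))]
print("most informative next experiment:", best)
```

    Experiments whose outcome is already predictable score low, so the rule steers data collection toward regions where the model is most uncertain.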

  13. Web Based Learning Support for Experimental Design in Molecular Biology.

    Science.gov (United States)

    Wilmsen, Tinri; Bisseling, Ton; Hartog, Rob

    An important learning goal of a molecular biology curriculum is a certain proficiency level in experimental design. Currently students are confronted with experimental approaches in textbooks, in lectures and in the laboratory. However, most students do not reach a satisfactory level of competence in the design of experimental approaches. This…

  14. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.

  15. Simulated statistics of polydisperse sedimenting inertial particles in a turbulent flow under experimental conditions

    Science.gov (United States)

    Wang, Lian-Ping; Parishani, Hossein; Rosa, Bogdan; Bateson, Colin; Aliseda, Alberto; Grabowski, Wojciech

    2009-11-01

    In recent years, point-particle based or hybrid direct numerical simulations (DNS) have increasingly been used to study pair statistics of inertial particles relevant to turbulent collision of cloud droplets. Equivalent experimental data are rare but are slowly becoming available. In this talk, we will discuss simulated statistics of sedimenting inertial particles under conditions similar to our parallel wind-tunnel experiment (to be reported here by Bateson et al.). The key parameters to be matched are flow Reynolds number, dissipation rate, particle Stokes number, and dimensionless settling velocity. A prescribed droplet size distribution will be used in the simulation to reproduce the polydisperse condition in the experiment. High-resolution DNS will be used to maximize the computational domain size. Single-particle and particle-pair statistics (e.g., fluctuation velocities, radial distribution function, relative velocity statistics) will be compared to the experimental data. Statistics obtained from lower dimensions will be linked to statistics in three dimensions.
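
    The radial distribution function mentioned among the pair statistics can be estimated by histogramming pairwise separations and normalizing by the expected pair count for uniformly placed particles. A minimal sketch, where random positions stand in for DNS droplet data:

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(9)
L, n = 1.0, 2000
pos = rng.random((n, 3)) * L               # particle positions in a cubic box

r = pdist(pos)                             # all pairwise separations
edges = np.linspace(0.01, 0.25, 25)
counts, _ = np.histogram(r, bins=edges)

# Normalize by the expected pair count for an ideal (uniform) gas; boundary
# effects near the box walls are ignored in this sketch.
shell_vol = 4 / 3 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
n_pairs = n * (n - 1) / 2
expected = n_pairs * shell_vol / L**3
g_r = counts / expected
print(np.round(g_r, 2))                    # close to 1 for uniform positions
```

    For inertial droplets in turbulence, g(r) rises well above 1 at small separations, which is the clustering signal relevant to collision rates.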

  16. The rat bone marrow micronucleus test--study design and statistical power.

    Science.gov (United States)

    Hayes, Julie; Doherty, Ann T; Adkins, Deborah J; Oldman, Karen; O'Donovan, Michael R

    2009-09-01

    Although the rodent bone marrow micronucleus test has been in routine use for over 20 years, little work has been published to support its experimental design and all this has used the mouse rather than the rat. When it was decided to change the strain of rat routinely used in this laboratory to the Han Wistar, a preliminary study was performed to investigate the possible factors influencing experimental variability and to use statistical tools to examine possible study designs. Subsequently, a historical database comprising of vehicle controls accumulated from 65 studies was used to establish test acceptance criteria and a strategy for analysing equivocal results. The following conclusions were made: (i) no statistically significant differences were observed in experimental variability within or between control animals; although not statistically significant, the majority of experimental variability seen was found to be between separate counts on the same slide, with minimal differences found between duplicate slides from the same rat or between individual rats; (ii) power analyses showed that, if an equivocal result is obtained after scoring 2000 immature erythrocytes (IE), it is appropriate to re-code the slides and score an additional 4000 IE, i.e. analysing a total of 6000 IE; no meaningful increase in statistical power is gained by scoring >6000 IE; this is consistent with the variability observed between separate counts on the same slide; (iii) there was no significant difference between the control micronucleated immature erythrocyte (MIE) values at 24 and 48 h after dosing or between males and females; therefore, if an unusually low control value at either time point results in apparent small increases in MIE in a treated group, it is valid to pool control values from both time points for clarification and (iv) similar statistical power can be achieved by scoring 2000 IE from seven rats or 4000 IE from five rats, respectively. However, this is based only

  17. Developing Statistical Knowledge for Teaching during Design-Based Research

    Science.gov (United States)

    Groth, Randall E.

    2017-01-01

    Statistical knowledge for teaching is not precisely equivalent to statistics subject matter knowledge. Teachers must know how to make statistics understandable to others as well as understand the subject matter themselves. This dual demand on teachers calls for the development of viable teacher education models. This paper offers one such model,…

  18. Fast Bayesian Optimal Experimental Design for Seismic Source Inversion

    KAUST Repository

    Long, Quan

    2016-01-06

    We develop a fast method for optimally designing experiments [1] in the context of statistical seismic source inversion [2]. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by the elastic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the true parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.

  19. Fast Bayesian optimal experimental design for seismic source inversion

    KAUST Repository

    Long, Quan

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem. © 2015 Elsevier B.V.
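
    As a rough illustration of the Laplace step in the two entries above: once the misfit Hessian H at the parameter estimate is available, the posterior is approximated as a Gaussian with covariance proportional to the inverse Hessian, and the information gained over a Gaussian prior has a closed form. The 4-parameter Hessian below is a synthetic stand-in; only the scaling of the gain with the number of receivers mirrors the papers' argument:

    import numpy as np

    def gaussian_kl(mu_post, cov_post, mu_pr, cov_pr):
        # KL( N(mu_post, cov_post) || N(mu_pr, cov_pr) ): the information
        # gained when a Gaussian prior is updated to a Laplace posterior.
        k = len(mu_pr)
        ipr = np.linalg.inv(cov_pr)
        d = mu_post - mu_pr
        return 0.5 * (np.trace(ipr @ cov_post) + d @ ipr @ d - k
                      + np.log(np.linalg.det(cov_pr) / np.linalg.det(cov_post)))

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    H1 = A @ A.T + 4 * np.eye(4)       # synthetic misfit Hessian, one receiver
    mu = np.zeros(4)
    for n_rec in (1, 5, 25):           # Hessian grows linearly with receivers,
        cov_post = np.linalg.inv(n_rec * H1)   # so the posterior concentrates
        print(n_rec, round(gaussian_kl(mu, cov_post, mu, np.eye(4)), 2))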

  20. Development of the Biological Experimental Design Concept Inventory (BEDCI)

    OpenAIRE

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2014-01-01

    Interest in student conception of experimentation inspired the development of a fully validated 14-question inventory on experimental design in biology (BEDCI) by following established best practices in concept inventory (CI) design. This CI can be used to diagnose specific examples of non–expert-like thinking in students and to evaluate the success of teaching strategies that target conceptual changes. We used BEDCI to diagnose non–expert-like student thinking in experimental design at the p...

  1. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  2. Conceptual design report, CEBAF basic experimental equipment

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-04-13

    The Continuous Electron Beam Accelerator Facility (CEBAF) will be dedicated to basic research in Nuclear Physics using electrons and photons as projectiles. The accelerator configuration allows three nearly continuous beams to be delivered simultaneously in three experimental halls, which will be equipped with complementary sets of instruments: Hall A--two high resolution magnetic spectrometers; Hall B--a large acceptance magnetic spectrometer; Hall C--a high-momentum, moderate resolution, magnetic spectrometer and a variety of more dedicated instruments. This report contains a short description of the initial complement of experimental equipment to be installed in each of the three halls.

  3. Procedure for statistical analysis of one-parameter discrepant experimental data.

    Science.gov (United States)

    Badikov, Sergey A; Chechev, Valery P

    2012-09-01

    A new, Mandel-Paule-type procedure for statistical processing of one-parameter discrepant experimental data is described. The procedure enables one to estimate the contribution of unrecognized experimental errors to the total experimental uncertainty, as well as to include it in the analysis. A definition of discrepant experimental data for an arbitrary number of measurements is introduced as an accompanying result. In the case of negligible unrecognized experimental errors, the procedure simply reduces to the calculation of the weighted average and its internal uncertainty. The procedure was applied to the statistical analysis of half-life experimental data: mean half-lives for 20 actinides were calculated and the results were compared to the ENSDF and DDEP evaluations. On the whole, the calculated half-lives are consistent with the ENSDF and DDEP evaluations. However, the uncertainties calculated in this work essentially exceed those of the ENSDF and DDEP evaluations for discrepant experimental data. This effect can be explained by adequately taking into account unrecognized experimental errors. Copyright © 2012 Elsevier Ltd. All rights reserved.
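
    The Mandel-Paule idea can be sketched in a few lines: inflate each measurement's variance by a common "unrecognized error" term s² chosen so that the weighted sum of squared deviations equals its expected value n - 1, then take the weighted mean. The half-life values and uncertainties below are invented for illustration:

    import numpy as np

    def mandel_paule(x, u, tol=1e-10):
        # Mandel-Paule estimate of the unrecognized (between-experiment)
        # variance s2: choose s2 so the weighted chi-square equals n - 1.
        def chi2(s2):
            w = 1.0 / (u**2 + s2)
            xbar = np.sum(w * x) / np.sum(w)
            return np.sum(w * (x - xbar)**2)
        if chi2(0.0) <= len(x) - 1:      # data consistent: plain weighted mean
            s2 = 0.0
        else:
            lo, hi = 0.0, 10.0 * np.var(x) + 1.0
            while hi - lo > tol:         # chi2 decreases in s2, so bisect
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if chi2(mid) > len(x) - 1 else (lo, mid)
            s2 = 0.5 * (lo + hi)
        w = 1.0 / (u**2 + s2)
        xbar = np.sum(w * x) / np.sum(w)
        return xbar, np.sqrt(1.0 / np.sum(w)), s2

    x = np.array([432.0, 433.5, 430.1, 435.8])   # illustrative half-lives (y)
    u = np.array([0.5, 0.8, 0.6, 0.7])           # stated uncertainties
    print(mandel_paule(x, u))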

  4. Design of Formulated Products: Experimental Component

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Cheng, Y. S.

    2012-01-01

    A systematic methodology for the design and verification of chemical-based products is proposed. By integrating modeling, and experiments, the search space is efficiently scanned to identify the feasible product candidates. The product design (or verification) problem consists of three stages: co...... and a sunscreen lotion....

  5. The Application of Statistical Design of Experiments to Study the In-Plane Shear Behaviour of Hybrid Composite Sandwich Panel

    Directory of Open Access Journals (Sweden)

    Fajrin J.

    2016-03-01

    This paper presents the statistical aspects of an experimental study on the in-plane shear behaviour of a hybrid composite sandwich panel with an intermediate layer. The study was aimed at providing information on how significant the contribution of the intermediate layer is to the in-plane shear behaviour of the newly developed sandwich panel. The investigation was designed as a single-factor experiment and the results were thoroughly analysed with the statistics software Minitab 15. The panels were tested by applying a tensile force along the diagonal of the test frame, simulating pure shear, using a 100 kN MTS servo-hydraulic UTM. The results show that the incorporation of the intermediate layer significantly enhanced the in-plane shear behaviour of the hybrid composite sandwich panel. The statistical analysis shows that the value of F0 is much higher than the value of Ftable, which means that the improvement provided by the incorporation of the intermediate layer is statistically significant.
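
    A single-factor comparison like the one described reduces to a one-way ANOVA; the sketch below shows the F0-versus-Ftable check with made-up shear strength values (the numbers are illustrative, not from the paper):

    import numpy as np
    from scipy import stats

    # Hypothetical in-plane shear strengths (MPa) for panels without and
    # with an intermediate layer.
    without_layer = np.array([1.8, 2.0, 1.9, 2.1, 1.7])
    with_layer    = np.array([2.6, 2.8, 2.5, 2.9, 2.7])

    f0, p = stats.f_oneway(without_layer, with_layer)
    f_crit = stats.f.ppf(0.95, dfn=1, dfd=8)    # F_table at alpha = 0.05
    print(f"F0 = {f0:.2f}, F_crit = {f_crit:.2f}, p = {p:.4f}")
    # The treatment effect is declared significant when F0 > F_crit.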

  6. Experimental Validation of an Integrated Controls-Structures Design Methodology

    Science.gov (United States)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large-order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, the amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.

  7. Simulation-based optimal Bayesian experimental design for nonlinear systems

    KAUST Repository

    Huan, Xun

    2013-01-01

    The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics. © 2012 Elsevier Inc.
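
    A toy version of the expected-information-gain objective, estimated by the kind of two-stage (nested) Monte Carlo sampling described above. The one-parameter model y = d·θ² + noise and the candidate designs d are invented for illustration; a real application would use the simulation model and polynomial chaos surrogate of the paper:

    import numpy as np
    from scipy.special import logsumexp

    rng = np.random.default_rng(5)

    def eig(d, n_outer=500, n_inner=500, sigma=0.3):
        # Outer loop: draw (theta, y) from the prior and likelihood.
        th_out = rng.standard_normal(n_outer)
        y = d * th_out**2 + sigma * rng.standard_normal(n_outer)
        # Inner loop: fresh prior draws to estimate the evidence p(y | d).
        th_in = rng.standard_normal(n_inner)
        ll = -0.5 * ((y[:, None] - d * th_in[None, :]**2) / sigma) ** 2
        log_evidence = logsumexp(ll, axis=1) - np.log(n_inner)
        log_like = -0.5 * ((y - d * th_out**2) / sigma) ** 2
        # EIG = E[ log p(y|theta,d) - log p(y|d) ]; Gaussian constants cancel.
        return float(np.mean(log_like - log_evidence))

    for d in (0.1, 1.0, 2.0):   # more sensitive designs yield a larger gain
        print(d, round(eig(d), 3))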

  8. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  9. Autism genetics: Methodological issues and experimental design.

    Science.gov (United States)

    Sacco, Roberto; Lintas, Carla; Persico, Antonio M

    2015-10-01

    Autism is a complex neuropsychiatric disorder of developmental origin, where multiple genetic and environmental factors likely interact resulting in a clinical continuum between "affected" and "unaffected" individuals in the general population. During the last two decades, relevant progress has been made in identifying chromosomal regions and genes in linkage or association with autism, but no single gene has emerged as a major cause of disease in a large number of patients. The purpose of this paper is to discuss specific methodological issues and experimental strategies in autism genetic research, based on fourteen years of experience in patient recruitment and association studies of autism spectrum disorder in Italy.

  10. Human Factors Experimental Design and Analysis Reference

    Science.gov (United States)

    2007-07-01

    [Extraction fragment: the report illustrates a within-subjects t-test with observed means Y_PD = 53.75 and Y_SD = 62.00 and accompanying SAS calculations, then an example that expands computer experience into three levels (high, medium, and low), compared with the original example that used only two levels; each of the 80 subjects who used the experimental text editor rated their computer experience as high, medium, or low.]

  11. Study on Development of Geopolymer Binder from Terracotta Roof Tile Waste by Experimental and Statistical Method

    Science.gov (United States)

    Usha, S.; Nair, Deepa G.; Vishnudas, Subha

    2017-08-01

    Alkali activation of aluminosilicate materials produces geopolymers at ambient or slightly elevated temperatures through the geopolymerization process. The reaction product is an amorphous aluminosilicate gel with a structure similar to that of zeolitic precursors. In this paper, a study on the development of a geopolymer binder from terracotta tile waste was carried out through experimental and statistical analysis. The compressive strength test was conducted on sixteen combinations of specimens obtained by varying four identified parameters. A 2^4 full factorial design, with each parameter at two levels, was adopted for the experimental study, and the significance of each parameter's effect on geopolymerization was investigated using this design. The influence of the parameters molarity of sodium hydroxide, alkali activator to binder ratio, and elevated curing temperature on geopolymerization was found significant, whereas the influence of the sodium silicate to sodium hydroxide solution ratio was insignificant at the 5% level of significance. The interaction effects among the parameters were studied and found to be negligible.
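
    For reference, a 2^4 full factorial in coded units and the usual main-effect calculation look like this; the strength responses are synthetic, chosen only so that the silicate/hydroxide ratio comes out negligible, as the study found:

    import itertools
    import numpy as np

    # 16-run 2^4 full factorial in coded (-1, +1) units for the four factors
    # named in the abstract: NaOH molarity, activator/binder ratio, curing
    # temperature, and sodium silicate / sodium hydroxide (SS/SH) ratio.
    X = np.array(list(itertools.product([-1, 1], repeat=4)))

    # Hypothetical compressive strengths (MPa) for the 16 combinations.
    rng = np.random.default_rng(1)
    y = 30 + 4*X[:, 0] + 3*X[:, 1] + 5*X[:, 2] + 0.2*X[:, 3] \
        + rng.normal(0, 1, 16)

    # A factor's main effect: mean response at +1 minus mean response at -1.
    for name, col in zip(["molarity", "act/binder", "cure temp", "SS/SH"], X.T):
        print(name, round(y[col == 1].mean() - y[col == -1].mean(), 2))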

  12. Split-plot designs for multistage experimentation

    DEFF Research Database (Denmark)

    Kulahci, Murat; Tyssedal, John

    2016-01-01

    at the same time will be more efficient. However, there have been only a few attempts in the literature to provide an adequate and easy-to-use approach for this problem. In this paper, we present a novel methodology for constructing two-level split-plot and multistage experiments. The methodology is based...... be accommodated in each stage. Furthermore, split-plot designs for multistage experiments with good projective properties are also provided....

  13. Analyzing Data from a Pretest-Posttest Control Group Design: The Importance of Statistical Assumptions

    Science.gov (United States)

    Zientek, Linda; Nimon, Kim; Hammack-Brown, Bryn

    2016-01-01

    Purpose: Among the gold standards in human resource development (HRD) research are studies that test theoretically developed hypotheses and use experimental designs. A somewhat typical experimental design would involve collecting pretest and posttest data on individuals assigned to a control or experimental group. Data from such a design that…

  14. Independence and statistical inference in clinical trial designs: a tutorial review.

    Science.gov (United States)

    Bolton, S

    1998-05-01

    The requirements for statistical approaches to the design, analysis, and interpretation of experimental data are now accepted by the scientific community. This is of particular importance in medical studies, where public health consequences are of concern. Investigators in the clinical sciences should be cognizant of statistical principles in general, but should always be wary of pursuing their own analyses and should engage statisticians for data analysis whenever possible. Examples of circumstances that require statistical evaluation not found in textbooks and not always obvious to the lay person are pervasive. Incorrect statistical evaluation and analysis in such situations will result in erroneous and potentially seriously misleading interpretation of clinical data. Although a statistician may not be responsible for any misinterpretations in such unfortunate circumstances, the quote often cited about statisticians and "damned liars" may appear to be more truth than fable. This article is a tutorial review and describes a common misuse of clinical data resulting in an apparently large sample size derived from a small number of patients. This mistake is a consequence of ignoring the dependency of results, treating multiple observations from a single patient as independent observations.

  15. Statistical design and evaluation of a propranolol HCl gastric floating tablet

    Directory of Open Access Journals (Sweden)

    Meka Venkata Srikanth

    2012-02-01

    The purpose of this research was to apply statistical design to the preparation of a gastric floating tablet (GFT) of propranolol HCl and to investigate the effect of formulation variables on drug release and the buoyancy properties of the delivery system. The contents of polyethylene oxide (PEO WSR coagulant) and sodium bicarbonate were used as independent variables in a central composite design to identify the best formulation. Main effects and interaction terms of the formulation variables were evaluated quantitatively using a mathematical model approach, showing that both independent variables have significant effects on floating lag time, % drug release at 1 h (D1h) and the time required to release 90% of the drug (t90). The desirability function was used to optimize the response variables, each with a different target, and the observed responses were in good agreement with the experimental values. FTIR and DSC studies of the statistically optimized formulation revealed no chemical interaction between drug and polymer. The statistically optimized formulation released drug according to first-order kinetics with a non-Fickian diffusion mechanism. Evaluation of the optimized formulation in vivo in human volunteers showed that the GFT was buoyant in gastric fluid and that its gastric residence time was enhanced in the fed but not the fasted state.
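
    A two-factor central composite design of the kind used here is just the 2² factorial augmented with axial and center points; in coded units it can be written down directly (the five center replicates are a common but arbitrary choice):

    import numpy as np

    # Rotatable CCD for two factors (e.g. PEO content and sodium bicarbonate
    # content, per the abstract), in coded units; alpha = sqrt(2) makes the
    # design rotatable.
    alpha = np.sqrt(2)
    factorial = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]])
    axial = np.array([[-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha]])
    center = np.zeros((5, 2))              # replicated center points
    ccd = np.vstack([factorial, axial, center])
    print(ccd.round(3))                    # 13 runs in coded units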

  16. On experimentation across science, design and society

    DEFF Research Database (Denmark)

    Boris, Stefan Darlan

    2016-01-01

    The article describes how the principal idea behind the landscape laboratories has been to develop a 1:1 platform where researchers, practitioners and lay people can meet and cooperate on the development and testing of new design concepts for establishing and managing urban landscapes. This is becoming increasingly relevant, as landscape architects and urban planners today have to address the challenges confronting urbanism due to the continued entanglement of urbanisation and anthropogenic processes. These are challenges where the act of destabilizing dichotomies (inside/outside, natural/manmade, etc.) is one of several reasons for not only continuing but also strengthening the landscape laboratories as testing grounds for future urban landscapes and green spaces in the Anthropocene.

  17. Spectral-statistics properties of the experimental and theoretical light meson spectra

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, L., E-mail: laura@nuc5.fis.ucm.es [European Centre for Theoretical Studies in Nuclear Physics and Related Areas (ECT), Villa Tambosi, Strada delle Tabarelle 286, Villazzano, I-38123 (Italy); Fernandez-Ramirez, C., E-mail: cesar@nuc2.fis.ucm.es [European Centre for Theoretical Studies in Nuclear Physics and Related Areas (ECT), Villa Tambosi, Strada delle Tabarelle 286, Villazzano, I-38123 (Italy); Grupo de Fisica Nuclear, Departamento de Fisica Atomica, Molecular y Nuclear, Facultad de Ciencias Fisicas, Universidad Complutense de Madrid, Avda. Complutense s/n, E-28040 Madrid (Spain); Relano, A., E-mail: armando.relano@gmail.com [Instituto de Estructura de la Materia, IEM-CSIC, Serrano 123, E-28006 Madrid (Spain); Departamento de Fisica Aplicada I, Facultad de Ciencias Fisicas, Universidad Complutense de Madrid, Avda. Complutense s/n, E-28040 Madrid (Spain); Retamosa, J. [Grupo de Fisica Nuclear, Departamento de Fisica Atomica, Molecular y Nuclear, Facultad de Ciencias Fisicas, Universidad Complutense de Madrid, Avda. Complutense s/n, E-28040 Madrid (Spain)

    2012-03-29

    We present a robust analysis of the spectral fluctuations exhibited by the light meson spectrum. This analysis provides information about the degree of chaos in light mesons and may be useful to get some insight on the underlying interactions. Our analysis unveils that the statistical properties of the light meson spectrum are close, but not exactly equal, to those of chaotic systems. In addition, we have analyzed several theoretical spectra including the latest lattice QCD calculation. With a single exception, their statistical properties are close to those of a generic integrable system, and thus incompatible with the experimental spectrum.

  18. Microarray gene expression studies: experimental design, statistical data analysis, and applications in livestock research

    Directory of Open Access Journals (Sweden)

    Guilherme Jordão de Magalhães Rosa

    2007-07-01

    Microarray experiments allow the expression of thousands of genes to be assessed simultaneously, and they have been increasingly used in different areas of livestock research, such as growth and metabolism, reproduction, immune response to diseases and parasites, response to non-infectious stress factors (such as dietary restriction, exposure to toxic elements and other unfavorable environmental conditions), as well as animal breeding. Such experiments, however, are still considerably expensive and time consuming and, consequently, they are performed with relatively small sample sizes. Nonetheless, microarray experiments are extremely complex, as they involve a number of laboratory procedures such as sample collection, RNA extraction, reverse transcription and labeling, and the final hybridization. Hence, microarray assays require careful experimental planning and statistical data analysis. In this manuscript, basic principles of experimental design for microarray studies are reviewed, as well as the most common statistical and computational tools used for their analysis. In addition, some examples of the application of microarray technology in animal science are discussed, and some concluding remarks are presented.

  19. ANIMATED DOCUMENTARY: EXPERIMENTATION, TECHNOLOGY AND DESIGN

    OpenAIRE

    INDIA MARA MARTINS

    2009-01-01

    The objective of this thesis is to reflect on the animated documentary: an audiovisual product that mixes documentary and animation and is redefining the role of design in the production of new media. We also show that the animated documentary rekindles a series of debates and reflections concerning documentary and animation theory in relation to conceptions of realism. Our main premise is that documentary has always appropriated technology in a way that favors experimentation ...

  20. Experimental observation of fractional Fourier transform for a partially coherent optical beam with Gaussian statistics.

    Science.gov (United States)

    Wang, Fei; Cai, Yangjian

    2007-07-01

    We report the experimental observation of the fractional Fourier transform (FRT) for a partially coherent optical beam with Gaussian statistics [i.e., partially coherent Gaussian Schell-model (GSM) beam]. The intensity distribution (or beam width) and the modulus of the square of the spectral degree of coherence (or coherence width) of a partially coherent GSM beam in the FRT plane are measured, and the experimental results are analyzed and agree well with the theoretical results. The FRT optical system provides a convenient way to control the properties, e.g., the intensity distribution, beam width, spectral degree of coherence, and coherence width, of a partially coherent beam.

  1. Experimental design in analytical chemistry--part I: theory.

    Science.gov (United States)

    Ebrahimi-Najafabadi, Heshmatollah; Leardi, Riccardo; Jalali-Heravi, Mehdi

    2014-01-01

    This paper reviews the main concepts of experimental design applicable to the optimization of analytical chemistry techniques. The critical steps and tools for screening, including Plackett-Burman, factorial and fractional factorial designs, and response surface methodology such as central composite, Box-Behnken, and Doehlert designs, are discussed. Some useful routines are also presented for performing the procedures.

  2. Experimental investigation of local properties and statistics of optical vortices in random wave fields

    DEFF Research Database (Denmark)

    Wang, W.; Hanson, Steen Grüner; Miyamoto, Y.

    2005-01-01

    We present the first direct experimental evidence of the local properties of optical vortices in a random laser speckle field. We have observed the Berry anisotropy ellipse describing the anisotropic squeezing of phase lines close to vortex cores and quantitatively verified the Dennis angular momentum rule for its phase. Some statistics associated with vortices, such as density, anisotropy ellipse eccentricity, and its relation to zero crossings of real and imaginary parts of the random field, are also investigated by experiments.

  3. Research design and statistics in biomechanics and motor control.

    Science.gov (United States)

    Mullineaux, D R; Bartlett, R M; Bennett, S

    2001-10-01

    Biomechanics and motor control researchers measure how the body moves and interacts with its environment. The aim of this review paper is to consider some key issues in research methods in biomechanics and motor control. The review is organized into four sections: proposing, conducting, analysing and reporting research. In the first of these, we emphasize the importance of defining a worthy research question and of planning the study before its implementation to prevent later difficulties in the analysis and interpretation of data. In the second section, we cover selection of trial sizes and suggest that using three trials or more may be beneficial to provide more 'representative' and valid data. The third section on analysis of data concentrates on effect size statistics, qualitative and numerical trend analysis and cross-correlations. As sample sizes are often small, the use of effect size is recommended to support the results of statistical significance testing. In using cross-correlations, we recommend that scatterplots of one variable against the other, with the identified time lag included, be inspected to confirm that the linear relationship assumption underpinning this statistic is met and, if appropriate, that a linearity transformation be applied. Finally, we consider important information related to the issues above that should be included when reporting research. We recommend reporting checks or corrections for violations of underpinning assumptions, and the effect of these checks or corrections, to assist in advancing knowledge in biomechanics and motor control.
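
    The lag-identification step recommended for cross-correlations can be sketched as follows; the two signals are synthetic, with a known 7-sample delay:

    import numpy as np

    rng = np.random.default_rng(4)
    t = np.arange(200)
    a = np.sin(0.1 * t) + 0.1 * rng.standard_normal(200)
    b = np.roll(a, 7) + 0.1 * rng.standard_normal(200)   # b trails a by 7 samples

    # Cross-correlate over candidate lags; the peak recovers the delay, after
    # which a scatterplot of the lag-aligned series should be inspected for
    # linearity, as the review recommends.
    lags = np.arange(-20, 21)
    r = [np.corrcoef(b[20 + l:180 + l], a[20:180])[0, 1] for l in lags]
    print("best lag:", lags[int(np.argmax(r))])          # prints 7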

  4. Psychometric evaluation and experimental validation of the statistics anxiety rating scale.

    Science.gov (United States)

    Papousek, Ilona; Ruggeri, Kai; Macher, Daniel; Paechter, Manuela; Heene, Moritz; Weiss, Elisabeth M; Schulter, Günter; Freudenthaler, H Harald

    2012-01-01

    The Statistics Anxiety Rating Scale (STARS) was adapted into German to examine its psychometric properties (n = 400). Two validation studies (n = 66, n = 96) were conducted to examine its criterion-related validity. The psychometric properties of the questionnaire were very similar to those previously reported for the original English version in various countries and other language versions. Confirmatory factor analysis indicated 2 second-order factors: One was more closely related to anxiety and the other was more closely related to negative attitudes toward statistics. Predictive validity of the STARS was shown both in an experimental exam-like situation in the laboratory and during a real examination situation. Taken together, the findings indicate that statistics anxiety as assessed by the STARS is a useful construct that is more than just an expression of a more general disposition to anxiety.

  5. Model-robust experimental designs for the fractional polynomial ...

    African Journals Online (AJOL)

    can give a good fit to the data and much more plausible behavior between design points than the polynomial models. In this ... often resorted to the theory of optimal designs to get practical designs for those models. However the complexity of these nonlinear ... on Statistics in Industry and Technology, Dayton, Ohio. ...

  6. Statistical theory of designed quantum transport across disordered networks

    Science.gov (United States)

    Walschaers, Mattia; Mulet, Roberto; Wellens, Thomas; Buchleitner, Andreas

    2015-04-01

    We explain how centrosymmetry, together with a dominant doublet of energy eigenstates in the local density of states, can guarantee interference-assisted, strongly enhanced, strictly coherent quantum excitation transport between two predefined sites of a random network of two-level systems. Starting from a generalization of the chaos-assisted tunnelling mechanism, we formulate a random matrix theoretical framework for the analytical prediction of the transfer time distribution, of lower bounds of the transfer efficiency, and of the scaling behavior of characteristic statistical properties with the size of the network. We show that these analytical predictions compare well to numerical simulations, using Hamiltonians sampled from the Gaussian orthogonal ensemble.

  7. Design Issues and Inference in Experimental L2 Research

    Science.gov (United States)

    Hudson, Thom; Llosa, Lorena

    2015-01-01

    Explicit attention to research design issues is essential in experimental second language (L2) research. Too often, however, such careful attention is not paid. This article examines some of the issues surrounding experimental L2 research and its relationships to causal inferences. It discusses the place of research questions and hypotheses,…

  8. Statistical aspects of quantitative real-time PCR experiment design

    Czech Academy of Sciences Publication Activity Database

    Kitchen, R.R.; Kubista, Mikael; Tichopád, Aleš

    2010-01-01

    Vol. 50, No. 4 (2010), pp. 231-236. ISSN 1046-2023. R&D Projects: GA AV ČR IAA500520809. Institutional research plan: CEZ:AV0Z50520701. Keywords: Real-time PCR * Experiment design * Nested analysis of variance. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 4.527, year: 2010

  9. Statistical methods for unidirectional switch designs : Past, present, and future

    NARCIS (Netherlands)

    Zhan, Zhuozhao; de Bock, Geertruida H; van den Heuvel, Edwin R

    2017-01-01

    Clinical trials may apply or use a sequential introduction of a new treatment to determine its efficacy or effectiveness with respect to a control treatment. The reasons for choosing a particular switch design have different origins. For instance, they may be implemented for ethical or logistic

  10. A Bayesian experimental design approach to structural health monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Flynn, Eric [UCSD; Todd, Michael [UCSD

    2010-01-01

    Optimal system design for SHM involves two primary challenges. The first is the derivation of a proper performance function for a given system design. The second is the development of an efficient optimization algorithm for choosing a design that maximizes, or nearly maximizes, the performance function. In this paper we outline how an SHM practitioner can construct the proper performance function by casting the entire design problem into a framework of Bayesian experimental design. The approach demonstrates how the design problem necessarily ties together all steps of the SHM process.

  11. Experimental Design and Data Analysis of In Vivo Fluorescence Imaging Studies.

    Science.gov (United States)

    Ding, Ying; Lin, Hui-Min

    2016-01-01

    The objective of this chapter is to provide researchers who conduct in vivo fluorescence imaging studies with guidance in statistical aspects in the experimental design and data analysis of such studies. In the first half of this chapter, we introduce the key statistical components for designing a sound in vivo experiment. Particular emphasis is placed on the issues and designs that pertain to fluorescence imaging studies. Examples representing several popular types of fluorescence imaging experiments are provided as case studies to demonstrate how to appropriately design such studies. In the second half of this chapter, we explain the fundamental statistical concepts and methods used in the data analysis of typical in vivo experiments. We also provide specific examples in in vivo imaging studies to illustrate the key steps of analysis procedure.

  12. Statistical and methodological issues in microbicide trial design.

    Science.gov (United States)

    Crook, Angela M; Nunn, Andrew J

    2012-08-01

    Microbicide trials aim to measure the effect of a microbicide in reducing the risk of acquiring human immunodeficiency virus. Such trials present a number of challenging issues from design and conduct through to analysis and reporting. This begins with the initial identification of the target trial population. Prevention trials need to identify those at risk of human immunodeficiency virus infection. This can be more difficult in the general population compared with treatment trials that can target specific patient groups who have a confirmed diagnosis of the disease of interest. Consequently, microbicide trial participants will inevitably be recruited who are never at risk of HIV infection. In this chapter we outline the main features of microbicide trial design, key issues during conduct and analysis, and discuss the challenges specific to these types of clinical trials. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  13. Use of Experimental Design for Peuhl Cheese Process Optimization ...

    African Journals Online (AJOL)

    This work, using a central composite design, enabled the determination of optimal process conditions: leaf extract volume added (7 mL), heating temperature ...

  14. Experimental investigation and statistical optimisation of the selective laser melting process of a maraging steel

    Science.gov (United States)

    Casalino, G.; Campanelli, S. L.; Contuzzi, N.; Ludovico, A. D.

    2015-01-01

    Selective Laser Melting (SLM) is an Additive Manufacturing (AM) process that builds parts from powder using a layer-by-layer deposition technique. Control of the parameters that influence the melting and of the amount of energy density involved in the process is paramount in order to obtain valuable parts. The objective of this paper is to perform an experimental investigation and a subsequent statistical optimization of the parameters of the selective laser melting process for the 18Ni300 maraging steel. The experimental investigation covered the microstructure and the mechanical and surface properties of the laser-processed maraging powder. The outcomes of the experimental study demonstrated that the hardness, the mechanical strength and the surface roughness correlated positively with the part density. Parts with relative density higher than 99% had very low porosity, with closed and regularly shaped pores. The statistical optimization determined that the best part properties were produced with a laser power greater than 90 W and a scan velocity below 220 mm/s.

  15. Treatment noncompliance in randomized experiments: statistical approaches and design issues.

    Science.gov (United States)

    Sagarin, Brad J; West, Stephen G; Ratnikov, Alexander; Homan, William K; Ritchie, Timothy D; Hansen, Edward J

    2014-09-01

    Treatment noncompliance in randomized experiments threatens the validity of causal inference and the interpretability of treatment effects. This article provides a nontechnical review of 7 approaches: 3 traditional and 4 newer statistical analysis strategies. Traditional approaches include (a) intention-to-treat analysis (which estimates the effects of treatment assignment irrespective of treatment received), (b) as-treated analysis (which reassigns participants to groups reflecting the treatment they actually received), and (c) per-protocol analysis (which drops participants who did not comply with their assigned treatment). Newer approaches include (d) the complier average causal effect (which estimates the effect of treatment on the subpopulation of those who would comply with their assigned treatment), (e) dose-response estimation (which uses degree of compliance to stratify participants, producing an estimate of a dose-response relationship), (f) propensity score analysis (which uses covariates to estimate the probability that individual participants will comply, enabling estimates of treatment effects at different propensities), and (g) treatment effect bounding (which calculates a range of possible treatment effects applicable to both compliers and noncompliers). The discussion considers the areas of application, the quantity estimated, the underlying assumptions, and the strengths and weaknesses of each approach. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  16. Application of Statistical Design for the Production of Cellulase by Trichoderma reesei Using Mango Peel

    Science.gov (United States)

    Saravanan, P.; Muthuvelayudham, R.; Viruthagiri, T.

    2012-01-01

    Optimization of the culture medium for cellulase production using Trichoderma reesei was carried out. The optimization of cellulase production using mango peel as substrate was performed with a statistical methodology based on experimental designs. The screening of nine nutrients for their influence on cellulase production was achieved using a Plackett-Burman design. Avicel, soybean cake flour, KH2PO4, and CoCl2·6H2O were selected based on their positive influence on cellulase production. The composition of the selected components was optimized using Response Surface Methodology (RSM). The optimum conditions are as follows: Avicel: 25.30 g/L, soybean cake flour: 23.53 g/L, KH2PO4: 4.90 g/L, and CoCl2·6H2O: 0.95 g/L. These conditions were validated experimentally and revealed an enhanced cellulase activity of 7.8 IU/mL. PMID:23304453
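
    The 12-run Plackett-Burman matrix used for this kind of nine-nutrient screen can be generated from the standard first row by cyclic shifts, with a final all-low run; assigning nutrients to columns is then up to the experimenter:

    import numpy as np

    # Standard Plackett-Burman generating row for 12 runs (11 columns).
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    design = np.vstack([np.roll(gen, i) for i in range(11)]
                       + [-np.ones(11, dtype=int)])   # 12th run: all factors low
    print(design.shape)        # (12, 11): screen up to 11 two-level factors
    print(design.T @ design)   # 12 * identity: the columns are orthogonal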

  17. Experimental Methodology in English Teaching and Learning: Method Features, Validity Issues, and Embedded Experimental Design

    Science.gov (United States)

    Lee, Jang Ho

    2012-01-01

    Experimental methods have played a significant role in the growth of English teaching and learning studies. The paper presented here outlines basic features of experimental design, including the manipulation of independent variables, the role and practicality of randomised controlled trials (RCTs) in educational research, and alternative methods…

  18. Design of an experimental set up for convective drying: experimental studies at different drying temperature

    Science.gov (United States)

    Mohan, V. P. Chandra; Talukdar, Prabal

    2013-01-01

    An experimental setup is designed to investigate convective drying of a moist object. All the design data, components of the setup, materials and specifications are presented. The transient moisture content of a rectangular potato slice (4 × 2 × 2 cm) is measured at air temperatures of 40, 50, 60 and 70 °C with an air velocity of 2 m/s. Two distinct drying rate periods are observed. Results are compared with available results from the literature.

  19. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    Science.gov (United States)

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional study design was the most common study design.

  20. STATISTICAL ANALYSIS FOR OBJECT ORIENTED DESIGN SOFTWARE SECURITY METRICS

    OpenAIRE

    Amjan.Shaik; Dr.C.R.K.Reddy; Dr.A.Damodaran

    2010-01-01

    In the last decade, empirical studies on object-oriented design metrics have shown some of them to be useful for predicting the fault-proneness of classes in object-oriented software systems. In the era of computerization, the object-oriented paradigm is becoming more and more pronounced. This has provoked the need for high-quality object-oriented software, as the traditional metrics cannot be applied to object-oriented systems. This paper gives an evaluation of the CK suite of metrics. There are q...

  1. A statistical method for evaluation of the experimental phase equilibrium data of simple clathrate hydrates

    DEFF Research Database (Denmark)

    Eslamimanesh, Ali; Gharagheizi, Farhad; Mohammadi, Amir H.

    2012-01-01

    We present a statistical method for diagnostics of the outliers in phase equilibrium data (dissociation data) of simple clathrate hydrates. The applied algorithm is performed on the basis of the Leverage mathematical approach, in which the statistical Hat matrix, Williams plot, and the residuals of a selected correlation's results lead to the definition of probable outliers. This method not only contributes to outlier diagnostics but also identifies the range of applicability of the applied model and the quality of the existing experimental data. The available correlation in the literature in exponential form is used to represent/predict the hydrate dissociation pressures for three-phase equilibrium conditions (liquid water/ice-vapor-hydrate). The investigated hydrate formers are methane, ethane, propane, carbon dioxide, nitrogen, and hydrogen sulfide. It is interpreted from the obtained results
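
    The Leverage ingredients named above (Hat matrix, standardized residuals, Williams plot cut-offs) take only a few lines; the regression data here are synthetic stand-ins for dissociation-pressure correlations:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 40
    X = np.column_stack([np.ones(n),               # intercept
                         rng.uniform(270, 300, n), # e.g. temperature term
                         rng.uniform(1, 50, n)])   # e.g. pressure-related term
    y = X @ np.array([-50.0, 0.2, 0.01]) + rng.normal(0, 0.05, n)
    y[5] += 1.0                                    # plant one outlier

    H = X @ np.linalg.inv(X.T @ X) @ X.T           # Hat matrix
    h = np.diag(H)                                 # leverages
    resid = y - H @ y
    r_std = resid / (resid.std(ddof=X.shape[1]) * np.sqrt(1 - h))
    h_star = 3 * X.shape[1] / n                    # a common warning leverage
    # Williams plot logic: flag high leverage or |standardized residual| > 3.
    print("suspect points:", np.where((h > h_star) | (np.abs(r_std) > 3))[0])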

  2. Robust statistical methods: A primer for clinical psychology and experimental psychopathology researchers.

    Science.gov (United States)

    Field, Andy P; Wilcox, Rand R

    2017-11-01

    This paper reviews and offers tutorials on robust statistical methods relevant to clinical and experimental psychopathology researchers. We review the assumptions of one of the most commonly applied models in this journal (the general linear model, GLM) and the effects of violating them. We then present evidence that psychological data are more likely than not to violate these assumptions. Next, we overview some methods for correcting for violations of model assumptions. The final part of the paper presents 8 tutorials of robust statistical methods using R that cover a range of variants of the GLM (t-tests, ANOVA, multiple regression, multilevel models, latent growth models). We conclude with recommendations that set the expectations for what methods researchers submitting to the journal should apply and what they should report. Copyright © 2017 Elsevier Ltd. All rights reserved.
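
    As a small taste of the methods covered, a 20% trimmed mean (one of the robust estimators such tutorials typically start with) shrugs off the outliers that drag the ordinary mean; the data below are simulated:

    import numpy as np
    from scipy import stats

    # Skewed sample: mostly well-behaved values plus a handful of outliers
    # of the kind that violate GLM normality assumptions.
    rng = np.random.default_rng(3)
    x = np.concatenate([rng.normal(10, 1, 95), rng.normal(25, 1, 5)])

    print("mean:", round(x.mean(), 2))
    print("20% trimmed mean:", round(stats.trim_mean(x, 0.2), 2))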

  3. Statistical modeling of the ultra wide band propagation channel through the analysis of experimental measurements

    Science.gov (United States)

    Pagani, Pascal; Pajusco, Patrice

    2006-09-01

    For the development of future Ultra Wide Band (UWB) communication systems, realistic modeling of the propagation channel is necessary. This article presents an experimental study of the UWB radio channel, based on an extensive sounding campaign covering the indoor office environment. We consider the main characteristics of the UWB channel by studying the propagation loss and wide band parameters, such as the delay spread and the power delay profile decay. From this analysis, we propose a statistical channel model reproducing the UWB channel effects over the frequency bandwidth 3.1-10.6 GHz. To cite this article: P. Pagani, P. Pajusco, C. R. Physique 7 (2006).

  4. Statistical Design for Formulation Optimization of Hydrocortisone Butyrate-Loaded PLGA Nanoparticles

    National Research Council Canada - National Science Library

    Yang, Xiaoyan; Patel, Sulabh; Sheng, Ye; Pal, Dhananjay; Mitra, Ashim K

    2014-01-01

    .... Experimental designs were used to investigate specific effects of independent variables during preparation of HB-loaded PLGA NP and corresponding responses in optimizing the formulation. Plackett...

  5. Topology of Intermetallic Structures: From Statistics to Rational Design.

    Science.gov (United States)

    Akhmetshina, Tatiana G; Blatov, Vladislav A; Proserpio, Davide M; Shevchenko, Alexander P

    2018-01-16

    coordination sphere and looks for structural units as multishell clusters that assemble the whole structure. This approach shows that 41% of intermetallics can be assembled with a single nanocluster and that 22.4% of these are packed according to the face-centered cubic motif of the closest packing of spheres in three-dimensional space. We have shown that our approach can easily adopt any other building model and hence could become a platform for a universal predictive scheme. Within this scheme, all of the structural descriptors can be related to experimental data and theoretical modeling results and then can be used to synthesize new intermetallic compounds and to foresee novel materials.

  6. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    Science.gov (United States)

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…

  7. Optimal Experimental Design of Furan Shock Tube Kinetic Experiments

    KAUST Repository

    Kim, Daesang

    2015-01-07

    A Bayesian optimal experimental design methodology has been developed and applied to refine the rate coefficients of elementary reactions in Furan combustion. Furans are considered potential renewable fuels. We focus on the Arrhenius rates of Furan + OH ↔ Furyl-2 + H2O and Furan + OH ↔ Furyl-3 + H2O, and rely on the OH consumption rate as the experimental observable. A polynomial chaos surrogate is first constructed using an adaptive pseudo-spectral projection algorithm. The PC surrogate is then exploited in conjunction with a fast estimation of the expected information gain in order to determine the optimal design in the space of initial temperatures and OH concentrations.

  8. Design of U-Geometry Parameters Using Statistical Analysis Techniques in the U-Bending Process

    Directory of Open Access Journals (Sweden)

    Wiriyakorn Phanitwong

    2017-06-01

    The various U-geometry parameters in the U-bending process result in processing difficulties in the control of the spring-back characteristic. In this study, the effects of U-geometry parameters, including channel width, bend angle, material thickness, tool radius, as well as workpiece length, and their design, were investigated using a combination of finite element method (FEM) simulation and statistical analysis techniques. Based on stress distribution analyses, the FEM simulation results clearly identified the different bending mechanisms and effects of U-geometry parameters on the spring-back characteristic in the U-bending process, with and without pressure pads. The statistical analyses elucidated that the bend angle and channel width have a major influence in cases with and without pressure pads, respectively. The experiments were carried out to validate the FEM simulation results. Additionally, the FEM simulation results were in agreement with the experimental results, in terms of the bending forces and bending angles.

  9. Robust optimization of the output voltage of nanogenerators by statistical design of experiments

    KAUST Repository

    Song, Jinhui

    2010-09-01

    Nanogenerators were first demonstrated by deflecting aligned ZnO nanowires using a conductive atomic force microscopy (AFM) tip. The output of a nanogenerator is affected by three parameters: tip normal force, tip scanning speed, and tip abrasion. In this work, systematic experimental studies have been carried out to examine the combined effects of these three parameters on the output, using statistical design of experiments. A statistical model has been built to analyze the data and predict the optimal parameter settings. For an AFM tip of cone angle 70° coated with Pt, and ZnO nanowires with a diameter of 50 nm and lengths of 600 nm to 1 μm, the optimized parameters for the nanogenerator were found to be a normal force of 137 nN and scanning speed of 40 μm/s, rather than the conventional settings of 120 nN for the normal force and 30 μm/s for the scanning speed. A nanogenerator with the optimized settings has three times the average output voltage of one with the conventional settings. © 2010 Tsinghua University Press and Springer-Verlag Berlin Heidelberg.

  10. Clustering statistics, roughness feedbacks, and randomness in experimental step-pool morphodynamics

    Science.gov (United States)

    Johnson, Joel P. L.

    2017-04-01

    Step pools are a common bed morphology in boulder-rich gravel streams, but predicting how mountainous landscapes will respond to environmental perturbations such as climate-related hydrological changes requires a better understanding of channel morphodynamics and factors that influence bed stability. Flume experiments exploring bed stabilization demonstrate feedbacks among surface roughness, coarse grain clustering, and surface grain size. Clustering is quantified by using a novel normalization of Ripley's K statistic designed for use in power law functions. At 95% confidence, many but not all beds stabilized with coarse grains becoming more clustered than complete spatial randomness. The clustering statistic predicts hydraulic roughness better than D84 does (the diameter at which 84% of grains are smaller), suggesting that the spatial organization of the bed can be a stronger control than grain size on flow hydraulics. Initial conditions affect the degree of clustering at stability, indicating sensitivity to history.

  11. Exploring the statistical and clinical impact of two interim analyses on the Phase II design with option for direct assignment.

    Science.gov (United States)

    An, Ming-Wen; Mandrekar, Sumithra J; Edelman, Martin J; Sargent, Daniel J

    2014-07-01

    The primary goal of Phase II clinical trials is to better understand a treatment's safety and efficacy to inform a Phase III go/no-go decision. Many Phase II designs have been proposed, incorporating randomization, interim analyses, adaptation, and patient selection. The Phase II design with an option for direct assignment (i.e. stop randomization and assign all patients to the experimental arm based on a single interim analysis (IA) at 50% accrual) was recently proposed [An et al., 2012]. We discuss this design in the context of existing designs, and extend it from a single-IA to a two-IA design. We compared the statistical properties and clinical relevance of the direct assignment design with two IA (DAD-2) versus a balanced randomized design with two IA (BRD-2) and a direct assignment design with one IA (DAD-1), over a range of response rate ratios (2.0-3.0). The DAD-2 has minimal loss in power relative to these designs; the direct assignment design, especially with two IA, provides a middle ground with desirable statistical properties and likely appeal to both clinicians and patients. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Principles of Experimental Design for Big Data Analysis.

    Science.gov (United States)

    Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G

    2017-08-01

    Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis.

  14. Nine-year change in statistical design, profile, and success rates of Phase II oncology trials.

    Science.gov (United States)

    Ivanova, Anastasia; Paul, Barry; Marchenko, Olga; Song, Guochen; Patel, Neerali; Moschos, Stergios J

    2016-01-01

    We investigated nine-year trends in statistical design and other features of Phase II oncology clinical trials published in 2005, 2010, and 2014 in five leading oncology journals: Cancer, Clinical Cancer Research, Journal of Clinical Oncology, Annals of Oncology, and Lancet Oncology. The features analyzed included cancer type, multicenter vs. single-institution, statistical design, primary endpoint, number of treatment arms, number of patients per treatment arm, whether or not statistical methods were well described, whether the drug was found effective based on rigorous statistical testing of the null hypothesis, and whether the drug was recommended for future studies.

  15. Development and experimental validation of a protein design software

    OpenAIRE

    Stiebritz, Martin

    2008-01-01

    Computational protein design, understood as the prediction of an amino acid sequence that will adopt a given 3-dimensional structure associated with a desired function, offers the promising perspective of tailoring proteins according to special needs. From a more fundamental point of view, it improves our understanding of protein architecture, folding and dynamics. Here, the development and experimental validation of a protein design software package, MUMBO, are described. The current implementati...

  16. Neural Network Assisted Experimental Designs for Food Research

    Directory of Open Access Journals (Sweden)

    H.S. Ramaswamy

    2000-06-01

    The ability of artificial neural networks (ANNs) to predict full factorial data from the fractional data corresponding to some commonly used experimental designs is explored in this paper. Factorial and fractional factorial designs such as L8, L9, L18, and Box-Behnken schemes were considered, both in their original form and with some variations (L8+6, L15 and L9+1). Full factorial (3 factors x 5 levels) and fractional data were generated employing sixteen different mathematical equations (four in each category: linear, with and without interactions, and non-linear, with and without interactions). Different ANN models were trained and the best model was chosen for each equation based on its ability to predict the fractional data. The best experimental design was then chosen based on its ability to simulate the full-factorial data for each equation. In several cases, the mean relative errors with the L18 design (which had more input data than the other models) were even higher than with smaller fractional designs. In general, the ANN-assisted Lm, Box-Behnken, L15 and L18 designs were found to predict the full factorial data reasonably well, with errors less than 5%. The L8+6 model performed well with several experimental datasets reported in the literature.
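
    As a hedged illustration of the workflow this record describes, the sketch below trains a small neural network on a subset of runs and predicts the remaining runs of a 3-factor x 5-level full factorial. The response function, the 15-run subset, and the network architecture are illustrative assumptions, not the paper's sixteen equations or its L8/L9/L18 designs.

        # Sketch: predict a full factorial from a fractional subset with an ANN.
        import numpy as np
        from itertools import product
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        levels = np.linspace(-1.0, 1.0, 5)
        full = np.array(list(product(levels, repeat=3)))               # 125 runs
        y_full = 2*full[:, 0] - full[:, 1]**2 + full[:, 0]*full[:, 2]  # assumed response

        idx = rng.choice(len(full), size=15, replace=False)            # stand-in fractional design
        model = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0)
        model.fit(full[idx], y_full[idx])

        pred = model.predict(full)
        err = np.mean(np.abs(pred - y_full)) / np.ptp(y_full) * 100    # error as % of response range
        print(f"mean absolute error over the full factorial: {err:.1f}% of range")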

  17. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    Directory of Open Access Journals (Sweden)

    Chaeyoung Lee

    2012-11-01

    Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.
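
    A minimal sketch of the power-by-simulation idea in this record, with a frequentist two-sample test standing in for the paper's Gibbs-sampler posterior inference; the effect size, variance, and unbalanced cell counts are assumed values.

        # Estimate empirical power under an unbalanced design by Monte Carlo.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def empirical_power(n1, n2, effect, sigma, n_sim=2000, alpha=0.05):
            hits = 0
            for _ in range(n_sim):
                g1 = rng.normal(0.0, sigma, n1)        # genotype cell 1
                g2 = rng.normal(effect, sigma, n2)     # genotype cell 2 (shifted mean)
                if stats.ttest_ind(g1, g2).pvalue < alpha:
                    hits += 1
            return hits / n_sim

        # Unbalanced cells: 80 vs 20 observations, effect = 0.5 within-genotype SD
        print(empirical_power(n1=80, n2=20, effect=0.5, sigma=1.0))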

  18. Experimental Design to Evaluate Directed Adaptive Mutation in Mammalian Cells

    Science.gov (United States)

    Chiaro, Christopher R; May, Tobias

    2014-01-01

    Background We describe the experimental design for a methodological approach to determine whether directed adaptive mutation occurs in mammalian cells. Identification of directed adaptive mutation would have profound practical significance for a wide variety of biomedical problems, including disease development and resistance to treatment. In adaptive mutation, the genetic or epigenetic change is not random; instead, the presence and type of selection influences the frequency and character of the mutation event. Adaptive mutation can contribute to the evolution of microbial pathogenesis, cancer, and drug resistance, and may become a focus of novel therapeutic interventions. Objective Our experimental approach was designed to distinguish between 3 types of mutation: (1) random mutations that are independent of selective pressure, (2) undirected adaptive mutations that arise when selective pressure induces a general increase in the mutation rate, and (3) directed adaptive mutations that arise when selective pressure induces targeted mutations that specifically influence the adaptive response. The purpose of this report is to introduce an experimental design and describe limited pilot experiment data (not to describe a complete set of experiments); hence, it is an early report. Methods An experimental design based on immortalization of mouse embryonic fibroblast cells is presented that links clonal cell growth to reversal of an inactivating polyadenylation site mutation. Thus, cells exhibit growth only in the presence of both the countermutation and an inducing agent (doxycycline). The type and frequency of mutation in the presence or absence of doxycycline will be evaluated. Additional experimental approaches would determine whether the cells exhibit a generalized increase in mutation rate and/or whether the cells show altered expression of error-prone DNA polymerases or of mismatch repair proteins. Results We performed the initial stages of characterizing our system

  19. Experimental design to evaluate directed adaptive mutation in Mammalian cells.

    Science.gov (United States)

    Bordonaro, Michael; Chiaro, Christopher R; May, Tobias

    2014-12-09

    We describe the experimental design for a methodological approach to determine whether directed adaptive mutation occurs in mammalian cells. Identification of directed adaptive mutation would have profound practical significance for a wide variety of biomedical problems, including disease development and resistance to treatment. In adaptive mutation, the genetic or epigenetic change is not random; instead, the presence and type of selection influences the frequency and character of the mutation event. Adaptive mutation can contribute to the evolution of microbial pathogenesis, cancer, and drug resistance, and may become a focus of novel therapeutic interventions. Our experimental approach was designed to distinguish between 3 types of mutation: (1) random mutations that are independent of selective pressure, (2) undirected adaptive mutations that arise when selective pressure induces a general increase in the mutation rate, and (3) directed adaptive mutations that arise when selective pressure induces targeted mutations that specifically influence the adaptive response. The purpose of this report is to introduce an experimental design and describe limited pilot experiment data (not to describe a complete set of experiments); hence, it is an early report. An experimental design based on immortalization of mouse embryonic fibroblast cells is presented that links clonal cell growth to reversal of an inactivating polyadenylation site mutation. Thus, cells exhibit growth only in the presence of both the countermutation and an inducing agent (doxycycline). The type and frequency of mutation in the presence or absence of doxycycline will be evaluated. Additional experimental approaches would determine whether the cells exhibit a generalized increase in mutation rate and/or whether the cells show altered expression of error-prone DNA polymerases or of mismatch repair proteins. We performed the initial stages of characterizing our system and have limited preliminary data

  20. Squeeze-film damper design with air channels : Experimental verification

    NARCIS (Netherlands)

    Dias, R.A.; Wolffenbuttel, R.F.; Cretu, E.; Rocha, L.A.

    2011-01-01

    The experimental evaluation of damping-improved parallel-plate geometries is reported in this paper. An improved damper geometry with air channels was developed to address contradictory design constraints: large sensing parallel-plate area is desirable for a significant readout capacitance as well

  1. Experimental design of natural and accellerated bone and wood ageing

    DEFF Research Database (Denmark)

    Facorellis, Y.; Pournou, A.; Richter, Jane

    2015-01-01

    This paper presents the experimental design for natural and accelerated ageing of bone and wood samples found in museum conditions that was conceived as part of the INVENVORG (Thales Research Funding Program – NRSF) investigating the effects of the environmental factors on natural organic materials....

  2. Experimental and numerical analysis for optimal design parameters ...

    Indian Academy of Sciences (India)

    Later, the response surface curves are studied using ANOVA. Finally, the relations established are confirmed experimentally to validate the models. The relations thus established are beneficial for the design of evaporators. Additionally, the present study is among the first attempts to reveal the effect of humidity ...

  3. Single-Subject Experimental Design for Evidence-Based Practice

    Science.gov (United States)

    Byiers, Breanne J.; Reichle, Joe; Symons, Frank J.

    2012-01-01

    Purpose: Single-subject experimental designs (SSEDs) represent an important tool in the development and implementation of evidence-based practice in communication sciences and disorders. The purpose of this article is to review the strategies and tactics of SSEDs and their application in speech-language pathology research. Method: The authors…

  4. Taguchi method of experimental design in materials education

    Science.gov (United States)

    Weiser, Martin W.

    1993-01-01

    Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
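
    The reproducibility point above can be made concrete with the Taguchi quadratic loss, L(y) = k(y - m)^2; averaged over parts, the expected loss is k(s^2 + (mean - m)^2), so variance and mean offset contribute on equal footing. A minimal sketch, with k and the data chosen arbitrarily:

        # Taguchi expected loss: penalizes spread as much as mean offset.
        import numpy as np

        def taguchi_expected_loss(y, target, k=1.0):
            y = np.asarray(y, dtype=float)
            return k * (np.var(y) + (y.mean() - target) ** 2)

        centered_but_noisy = [9.0, 11.0, 10.5, 9.5, 10.0]      # mean on target, high spread
        off_target_but_tight = [10.4, 10.5, 10.5, 10.6, 10.5]  # slight offset, low spread
        print(taguchi_expected_loss(centered_but_noisy, target=10.0))
        print(taguchi_expected_loss(off_target_but_tight, target=10.0))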

  5. Optimal Experimental Design for Large-Scale Bayesian Inverse Problems

    KAUST Repository

    Ghattas, Omar

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial-based surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP-based approach is used to estimate the expected information gain in the proposed experiments, and to select the best experimental set-ups yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and for extending the experimental design methodology to cases where the control parameters are noisy.
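
    A hedged sketch of the surrogate idea described above: replace an expensive forward model with a cheap polynomial fit, then sample the surrogate freely when estimating design utilities. The forward model, polynomial degree, and training grid here are assumptions, not the KAUST polynomial chaos surrogate.

        # Fit a cheap polynomial surrogate and sample it in place of the model.
        import numpy as np

        def forward_model(theta):                 # assumed stand-in for the observable
            return np.exp(-1.5 * theta) + 0.1 * theta**2

        theta_train = np.linspace(0.0, 2.0, 20)
        coeffs = np.polyfit(theta_train, forward_model(theta_train), deg=4)
        surrogate = np.poly1d(coeffs)

        theta_mc = np.random.default_rng(2).uniform(0.0, 2.0, 100_000)
        y_mc = surrogate(theta_mc)                # 1e5 cheap evaluations
        print(y_mc.mean(), y_mc.std())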

  6. Constructing experimental designs for discrete-choice experiments: report of the ISPOR Conjoint Analysis Experimental Design Good Research Practices Task Force.

    Science.gov (United States)

    Reed Johnson, F; Lancsar, Emily; Marshall, Deborah; Kilambi, Vikram; Mühlbacher, Axel; Regier, Dean A; Bresnahan, Brian W; Kanninen, Barbara; Bridges, John F P

    2013-01-01

    Stated-preference methods are a class of evaluation techniques for studying the preferences of patients and other stakeholders. While these methods span a variety of techniques, conjoint-analysis methods, and particularly discrete-choice experiments (DCEs), have become the most frequently applied approach in health care in recent years. Experimental design is an important stage in the development of such methods, but establishing a consensus on standards is hampered by lack of understanding of available techniques and software. This report builds on the previous ISPOR Conjoint Analysis Task Force Report: Conjoint Analysis Applications in Health-A Checklist: A Report of the ISPOR Good Research Practices for Conjoint Analysis Task Force. This report aims to assist researchers specifically in evaluating alternative approaches to experimental design, a difficult and important element of successful DCEs. While this report does not endorse any specific approach, it does provide a guide for choosing an approach that is appropriate for a particular study. In particular, it provides an overview of the role of experimental designs for the successful implementation of the DCE approach in health care studies, and it provides researchers with an introduction to constructing experimental designs on the basis of study objectives and the statistical model researchers have selected for the study. The report outlines the theoretical requirements for designs that identify choice-model preference parameters and summarizes and compares a number of available approaches for constructing experimental designs. The task-force leadership group met via bimonthly teleconferences and in person at ISPOR meetings in the United States and Europe. An international group of experimental-design experts was consulted during this process to discuss existing approaches for experimental design and to review the task force's draft reports. In addition, ISPOR members contributed to developing a consensus

  7. Optimal statistical damage detection and classification in an experimental wind turbine blade using minimum instrumentation

    Science.gov (United States)

    Hoell, Simon; Omenzetter, Piotr

    2017-04-01

    The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples representing the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses, and principal component analysis scores from PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring in WTBs.
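
    A sketch of the damage-sensitive feature extraction named above: partial autocorrelation coefficients (PACCs) of each vibration record, optionally compressed to principal component scores. The AR(2)-style signal is synthetic stand-in data, not the wind turbine blade measurements.

        # PACC features from vibration-like records, compressed with PCA.
        import numpy as np
        from statsmodels.tsa.stattools import pacf
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        records = []
        for _ in range(40):                        # 40 synthetic response records
            x = np.zeros(1024)
            e = rng.normal(size=1024)
            for t in range(2, 1024):
                x[t] = 1.5 * x[t-1] - 0.75 * x[t-2] + e[t]
            records.append(pacf(x, nlags=20)[1:])  # drop the trivial lag-0 term

        features = np.vstack(records)
        scores = PCA(n_components=3).fit_transform(features)
        print(scores.shape)                        # (40, 3) feature vectors for classification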

  8. Design of Pump as Turbine Experimental Test Facility

    Directory of Open Access Journals (Sweden)

    Zariatin D. L.

    2017-01-01

    This paper presents the design process of an experimental test facility for a pump-as-turbine (PAT) hydropower system. Three design variants relating to the PAT operating conditions were developed and analyzed using CFD software. It was found that the first variant, with straight flow into the PAT, produces a higher velocity, which is needed to drive faster rotation of the generator shaft and thus generate more electric power. The strength of the PAT construction was analyzed using FEM software. The maximum stress was found to be 6 MPa, so the construction meets the design requirements.

  9. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .1. DESIGN CONSTRUCTION AND THEORETICAL EVALUATION

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The combination of process variables and mixture variables in experimental design is a problem which has not yet been solved. It is examined here whether a set of designs can be found which can be used for a series of models of reasonable complexity. The proposed designs are compared with known

  10. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .1. DESIGN CONSTRUCTION AND THEORETICAL EVALUATION

    NARCIS (Netherlands)

    DUINEVELD, CAA; SMILDE, AK; DOORNBOS, DA

    The combination of process variables and mixture variables in experimental design is a problem which has not yet been solved. It is examined here whether a set of designs can be found which can be used for a series of models of reasonable complexity. The proposed designs are compared with known

  11. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2017-03-01

    This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.
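
    A minimal sketch of the selection rule implied above, assuming a table of per-building indicators (e.g. yearly heat demand, peak load, hot water demand): cluster with k-means, then instrument the building nearest each centroid as the representative of its cluster.

        # Pick representative buildings as the nearest neighbors of k-means centroids.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(4)
        X = rng.normal(size=(200, 3))   # assumed indicators: heat, peak load, DHW demand

        km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
        representatives = [
            int(np.argmin(np.linalg.norm(X - c, axis=1))) for c in km.cluster_centers_
        ]
        print(representatives)          # indices of buildings to measure experimentally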

  12. Experimental Study of the Nuclear Rotational Motion with Statistical Analysis Methods

    CERN Document Server

    Leoni, S; Frattini, S; Montingelli, G; Vigezzi, E; Døssing, T; Herskind, B; Matsuo, M

    1999-01-01

    The gamma-decay of excited rotating nuclei in the high level density region a few MeV above the yrast line has been studied through a statistical analysis of the fluctuation of counts in gamma-gamma coincident spectra. In particular, making use of the covariance technique between spectra gated by different intrinsic configurations, one can measure how similar the cascades feeding into different selected bands are. The aim is to learn about the transition from order to chaos in the nuclear many-body system, in terms of the validity of selection rules associated with the quantum numbers of the intrinsic structure. Experimental results on the nucleus 164Yb are presented and compared to model predictions based on cranked shell model calculations including a two-body residual interaction.

  13. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Science.gov (United States)

    Malinowski, Paweł; Ziembicki, Piotr

    2017-03-01

    This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.

  14. Computational design and experimental validation of new thermal barrier systems

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Shengmin [Louisiana State Univ., Baton Rouge, LA (United States)

    2015-03-31

    The focus of this project is the development of a reliable and efficient ab initio based computational high-temperature material design method which can be used to assist Thermal Barrier Coating (TBC) bond-coat and top-coat design. Experimental evaluations of the new TBCs are conducted to confirm their properties. Southern University is the subcontractor on this project, with a focus on developing the computational simulation method. We have applied ab initio density functional theory (DFT) and molecular dynamics simulations to screening top coats and bond coats for gas turbine thermal barrier coating design and validation applications. For experimental validation, our focus is on the hot corrosion performance of different TBC systems. For example, for one of the top coatings studied, we examined the thermal stability of TaZr2.75O8 and confirmed its hot corrosion performance.

  15. Experimental research and statistic analysis of polymer composite adhesive joints strength

    Science.gov (United States)

    Rudawska, Anna; Miturska, Izabela; Szabelski, Jakub; Skoczylas, Agnieszka; Droździel, Paweł; Bociąga, Elżbieta; Madleňák, Radovan; Kasperek, Dariusz

    2017-05-01

    The aim of this paper is to determine the effect of the arrangement of fibreglass fabric plies in a polymer composite on adhesive joint strength. Based on the experimental results, the real effect of ply arrangement and the most favourable configuration with respect to strength are determined. The experiments were performed on 3 types of composites which had different fibre orientations. The composites had three plies of fabric. The ply arrangement in Composite I was unchanged, in Composite II the central ply had a 45° orientation, while in Composite III the outside ply (tangential to the adhesive layer) was oriented at 45°. Composite plates were first cut into smaller specimens and then adhesive-bonded in different combinations with Epidian 61/Z1/100:10 epoxy adhesive. After stabilizing, the single-lap adhesive joints were subjected to shear strength tests. It was noted that the ply arrangement in the composite materials affects the strength of adhesive joints made of these composites (at the 0.95 confidence level). The statistical analysis of the results also showed that there are no statistically significant differences in the average values of surface free energy (0.95 confidence level).

  16. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  17. Left-handedness is statistically linked to lifetime experimentation with illicit drugs.

    Science.gov (United States)

    Preti, Antonio; Usai, Ileana; Pintus, Elisa; Sardu, Cinzia; Petretto, Donatella Rita; Masala, Carmelo

    2012-01-01

    Handedness has been linked to an enhanced risk of alcohol abuse, while less is known about other drugs. A convenience sample of 1004 male and female Italian participants (females=58%) from the general community (18 to 65 years old: average age = 30; standard deviation = 10, median = 25) was asked about: handedness (preference in writing); lifetime use of alcohol, tobacco, and illicit drugs; levels of psychological distress, as measured by the General Health Questionnaire (GHQ); and levels of delusion proneness, as measured by the Peters et al. Delusions Inventory (PDI). Overall, 92 individuals (9.2%) were classified as left-handed, with no significant difference reported among genders. Lifetime use of illicit drugs, primarily cannabis, was reported by 20% of the sample. In a multiple logistic regression analysis, after taking into account sex, age, and caseness on GHQ and PDI, left-handed people in the sample were statistically more likely to report lifetime experimentation with heroin, ecstasy/amphetamine, and, marginally, hallucinogens, but not alcohol or tobacco. Different mechanisms might contribute to an explanation of greater lifetime experimentation with some illicit drugs among left-handed people as compared to right-handed people. However, replications with clinical samples are necessary before any definitive statements can be made.

  18. Experimental investigations and statistical analysis of pulsed laser bending of AISI 304 stainless steel sheet

    Science.gov (United States)

    Maji, Kuntal; Pratihar, D. K.; Nath, A. K.

    2013-07-01

    This paper presents experimental investigations on pulsed laser bending of sheet metal and statistical analysis to study the effects of process parameters. Laser power, scan speed, spot diameter and pulsed duration were taken as input variables and bending angle was considered as the output. Response surface methodology was used for modeling and optimization of the pulsed laser bending process. The performance of the developed model was validated through the experiments. All the input variables were found to have significant influence on the bending angle. Bending angle increased with the increase of laser power and pulse duration and decreased with the increase of scan speed and spot diameter. The optimum process parameters for the maximum bending angle were also found and verified with experimental data. The effects of pulse frequency, pulse width and pulse energy on bending angle were also investigated through experiments. Bending angle was found to be the maximum for a certain value of pulse frequency. With the increase of pulse width, bending angle increased at constant laser power but decreased at constant pulse energy. Bending angle was seen to increase with the increase of spatial overlapping and decrease with the increase of gap at constant laser power, but it showed optimal values for both the cases at constant line energy. A comparative study between continuous and pulsed laser bending was carried out to study the process efficiency in terms of energy input and produced deformation.
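
    A hedged sketch of the response-surface step this record relies on: fit a second-order model of bending angle in two coded factors by least squares and locate the maximizing settings on a grid. The data-generating response and noise level are assumptions, not the AISI 304 measurements.

        # Second-order response surface fit and grid search for the optimum.
        import numpy as np

        rng = np.random.default_rng(5)
        x1, x2 = rng.uniform(-1, 1, (2, 30))                 # coded factors, 30 runs
        y = (1.0 + 0.8*x1 - 0.5*x2 - 0.6*x1**2 - 0.2*x2**2 + 0.3*x1*x2
             + rng.normal(0, 0.05, 30))                      # assumed bending-angle response

        A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)

        g1, g2 = np.meshgrid(np.linspace(-1, 1, 201), np.linspace(-1, 1, 201))
        G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                             g1.ravel()**2, g2.ravel()**2, (g1 * g2).ravel()])
        i = int(np.argmax(G @ beta))
        print("optimum (coded factors):", g1.ravel()[i], g2.ravel()[i])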

  19. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but still a lot of questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.
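
    As a concrete instance of the design-matrix choices discussed above, the sketch below builds one common specification for a single AB phase sequence: intercept, baseline trend, immediate level change, and slope change. These columns are one reasonable option, not the article's only recommendation.

        # Design matrix for a single-subject AB design with level and slope change.
        import numpy as np

        n_obs, intervention_at = 20, 10
        t = np.arange(n_obs)
        phase = (t >= intervention_at).astype(float)          # 0 = baseline, 1 = treatment
        time_since = np.where(phase == 1, t - intervention_at, 0.0)

        X = np.column_stack([np.ones(n_obs), t, phase, time_since])
        # Columns: [intercept, baseline slope, immediate level change, slope change]
        print(X[intervention_at - 1:intervention_at + 2])     # rows around the phase change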

  20. Design and Implementation of an Experimental Segway Model

    Science.gov (United States)

    Younis, Wael; Abdelati, Mohammed

    2009-03-01

    The segway is the first transportation product to stand, balance, and move in the same way we do. It is a truly 21st-century idea. The aim of this research is to study the theory behind building segway vehicles based on the stabilization of an inverted pendulum. An experimental model has been designed and implemented through this study. The model has been tested for its balance by running a Proportional Derivative (PD) algorithm on a microprocessor chip. The model has been identified in order to serve as an educational experimental platform for segways.
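
    A minimal sketch of the PD balance law on a crude inverted-pendulum model: the control command opposes the tilt error and its rate of change. The gains, time step, and dynamics are illustrative assumptions, not the identified model of the authors' platform.

        # PD stabilization of a simplified inverted pendulum (Euler integration).
        import numpy as np

        kp, kd, dt = 25.0, 3.0, 0.01
        theta, omega = 0.1, 0.0                    # initial tilt (rad) and tilt rate

        for step in range(500):                    # simulate 5 seconds
            u = -(kp * theta + kd * omega)         # PD command
            alpha = 9.81 * np.sin(theta) + u       # crude pendulum dynamics + actuation
            omega += alpha * dt
            theta += omega * dt

        print(f"tilt after 5 s: {theta:.4f} rad")  # near zero if the gains stabilize it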

  1. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
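
    A minimal sketch of the a priori calculation these components feed into, for the simplest case of two independent groups and a continuous outcome; the effect size, alpha, and power targets below are conventional placeholder choices.

        # A priori sample size for a two-sample t-test.
        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(
            effect_size=0.5,   # magnitude relative to variance (Cohen's d)
            alpha=0.05,        # type I error rate
            power=0.80,        # target statistical power
            ratio=1.0,         # balanced two-group design
        )
        print(round(n_per_group))   # about 64 per group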

  2. Optimization of Metal Removal Factors Using Experimental Design

    Directory of Open Access Journals (Sweden)

    Burcu Çağlar GENÇOSMAN

    2015-03-01

    Experimental design methodology has been used in various research areas, including industrial and chemical engineering. In this paper, factor analysis and response surface optimization approaches were used for cadmium removal from aqueous solutions. The factors affecting the removal of Cd ions from aqueous solutions were investigated: pH, initial metal concentration, and solution temperature. The activated carbon used in the experiments was produced from Tunçbilek lignite by a physical activation method. The analysis of important factors is established using the design of experiments method. The effects of and interactions among the investigated factors are evaluated using the analysis of variance method. Together with regression analysis, response surface optimization is also utilized to obtain optimum conditions for the best cadmium removal within the experimental limits.

  3. Multi-criteria optimization of the flesh melons skin separation process by experimental and statistical analysis methods

    Directory of Open Access Journals (Sweden)

    Y. B. Medvedkov

    2016-01-01

    Research and innovation aimed at creating energy-efficient processes in melon processing is a significant task. Separating the skin from the melon flesh, for subsequent use in new food products, is one of the most labor-intensive operations in this technology, and the lack of a scientific and experimental basis for this operation has held back the development of high-performance machines to implement it. We therefore present an experimental procedure for separating melon skins in a pilot plant, together with a search for optimal operating regimes by statistical modeling. The late-ripening melon varieties Kalaysan, Thorlami, and Gulab-sary were the objects of study. The interactions among the factors influencing the skin-separation process were analyzed using a central composite rotatable design and a fractional factorial experiment. Experimental design with the treatment planning template in Design Expert v.10 software yielded regression equations that adequately describe the actual process. Rational ranges for the input factor values were established: the ratio of the rotational speed of the drum to the rotational frequency of the abrasive supply roll; the gap between the supply drum and the shearing knife; the shearing blade sharpening angle; the number of feed drum spikes; and the abrading drum orifice diameter. The mean square error does not exceed 12.4%. The regression equations are interpreted graphically with scatter plots and engineering nomograms that support the choice of rational input-factor values for three optimization criteria: minimal specific energy consumption in cutting, maximal specific throughput of pulp, and pulp extraction ratio. The data obtained can be used for operational management of the process parameters, taking into account the geometrical dimensions of the melon and its inhomogeneous structure.

  4. TIBER: Tokamak Ignition/Burn Experimental Research. Final design report

    Energy Technology Data Exchange (ETDEWEB)

    Henning, C.D.; Logan, B.G.; Barr, W.L.; Bulmer, R.H.; Doggett, J.N.; Johnson, B.M.; Lee, J.D.; Hoard, R.W.; Miller, J.R.; Slack, D.S.

    1985-11-01

    The Tokamak Ignition/Burn Experimental Research (TIBER) device is the smallest superconducting tokamak designed to date. In the design, plasma shaping is used to achieve a high plasma beta. Neutron shielding is minimized to achieve the desired small device size, but the superconducting magnets must be shielded sufficiently to reduce the neutron heat load and the gamma-ray dose to various components of the device. Specifications of the plasma-shaping coil, the shielding, cooling requirements, and heating modes are given. 61 refs., 92 figs., 30 tabs. (WRF)

  5. Model-Based Optimal Experimental Design for Complex Physical Systems

    Science.gov (United States)

    2015-12-03

    Beyond the optimality aspect of sOED, another major difficulty remains: to numerically represent non-Gaussian, continuous random variable posteriors. The constructions involved solve a convex optimization problem that can be easily separated into independent sub-problems for each dimension.

  6. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. Comprehensive study of the details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g. the Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazards model for handling competing risks were sometimes used for more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis, in light of the information found in some publications. Use of adaptive designs with interim analyses is increasing
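
    For readers unfamiliar with the standard survival toolkit named above, here is a hedged sketch using the lifelines package on synthetic data; the competing-risks extensions (Gray test, Fine-Gray model) require other tools and are not shown.

        # Kaplan-Meier estimate and Cox regression on synthetic survival data.
        import numpy as np
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        rng = np.random.default_rng(6)
        df = pd.DataFrame({
            "time": rng.exponential(10, 200),        # follow-up times
            "event": rng.integers(0, 2, 200),        # 1 = event observed, 0 = censored
            "treatment": rng.integers(0, 2, 200),    # binary covariate
        })

        km = KaplanMeierFitter().fit(df["time"], df["event"])
        print(km.median_survival_time_)

        cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
        cox.print_summary()                          # hazard ratio for treatment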

  7. Statistical methods for genetic association studies with response-selective sampling designs

    NARCIS (Netherlands)

    Balliu, Brunilda

    2015-01-01

    This dissertation describes new statistical methods designed to improve the power of genetic association studies. Of particular interest are studies with a response-selective sampling design, i.e. case-control studies of unrelated individuals and case-control studies of family members. The

  8. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .2. DESIGN EVALUATION ON MEASURED DATA

    NARCIS (Netherlands)

    DUINEVELD, CAA; SMILDE, AK; DOORNBOS, DA

    The construction of a small experimental design for a combination of process and mixture variables is a problem which has not been solved completely by now. In a previous paper we evaluated some designs with theoretical measures. This second paper evaluates the capabilities of the best of these

  9. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .2. DESIGN EVALUATION ON MEASURED DATA

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The construction of a small experimental design for a combination of process and mixture variables is a problem which has not been solved completely by now. In a previous paper we evaluated some designs with theoretical measures. This second paper evaluates the capabilities of the best of these

  10. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki

    2015-01-07

    Experimental design is very important, since experiments are often resource-exhaustive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information which can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of this nested high-dimensional integral. The advantages are twofold. First, Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among the different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as the concentration of measure required by the Laplace method. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by the one-dimensional Laplace equation. We also compare the performance of Multilevel Monte Carlo, the Laplace approximation, and direct double-loop Monte Carlo.
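
    A sketch of the direct double-loop Monte Carlo estimator that the multilevel method is designed to accelerate: an outer average over prior draws and simulated data of log p(y|theta) - log p(y), with the evidence p(y) estimated by an inner Monte Carlo loop. The forward model, prior, and noise level are assumptions; the Gaussian normalizing constants cancel in the log ratio.

        # Double-loop Monte Carlo estimate of expected information gain (EIG).
        import numpy as np

        rng = np.random.default_rng(7)
        sigma = 0.2                                   # observation noise std

        def model(theta, design):                     # assumed 1-D forward model
            return np.exp(-design * theta)

        def eig(design, n_outer=500, n_inner=500):
            total = 0.0
            for _ in range(n_outer):
                theta = rng.uniform(0, 1)                                   # prior draw
                y = model(theta, design) + rng.normal(0, sigma)             # simulated datum
                like = np.exp(-0.5 * ((y - model(theta, design)) / sigma) ** 2)
                inner = rng.uniform(0, 1, n_inner)                          # inner prior draws
                evidence = np.mean(
                    np.exp(-0.5 * ((y - model(inner, design)) / sigma) ** 2)
                )
                total += np.log(like / evidence)
            return total / n_outer

        print(eig(design=1.0))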

  11. Experimental Investigations on Performance and Design Parameters of Solar Chimney

    Directory of Open Access Journals (Sweden)

    İbrahim ÜÇGÜL

    2010-03-01

    In this study, a solar chimney system suitable for the climate conditions of Isparta and its surroundings was designed theoretically. To study the design experimentally, a prototype solar chimney was constructed on the campus of Süleyman Demirel University at RACRER (Research and Application Center for Renewable Energy Resources). After the experimental studies, the system was modelled theoretically based on the design; this model formed the basis of a computer programme from which the performance parameters of the system were obtained. The findings showed that the approach is sufficient for determining the design and performance parameters of a solar chimney suited to the climate conditions of Isparta and its surroundings. The results showed that electricity generation with a solar chimney is suitable for areas with high solar irradiance, long sunshine duration, and climate conditions similar to those of Isparta and its surroundings. Evaluation of the results shows that the electric power generated by a solar chimney depends on the regional solar data, the chimney height, and the size of the greenhouse area.

  12. Experimental and statistical investigation of thermally induced failure in reactor fuel particles

    Energy Technology Data Exchange (ETDEWEB)

    Lunsford, J.L.; Imprescia, R.J.; Bowman, A.L.; Radosevich, C.E.

    1980-10-01

    An incomplete experimental study of the failure statistics of fuel particles for the high-temperature gas-cooled reactor (HTGR) is described. Fuel particle failure was induced by thermal ramping from room temperature to temperatures in the vicinity of 2273 K to 2773 K over 2 to 30 h, and detected by the appearance of 85Kr in the helium carrier gas used to sweep the furnace. The concentration of krypton, a beta emitter, was detected by measuring the current that resulted when the helium sweep gas was passed through an ionization chamber. TRISO fuel particles gave a krypton concentration profile as a function of time that built up in several minutes and decayed in a fraction of an hour. This profile, which was temperature independent, was similar to the impulse response of the ionization chamber, suggesting that the TRISO particles failed instantaneously and completely. BISO fuel particles gave a krypton concentration profile as a function of time that built up in a fraction of an hour and decayed in a fraction of a day. This profile was strongly temperature dependent, suggesting that krypton release was diffusion controlled, i.e., that the krypton was diffusing through a sound coat, or that the BISO coating failed but the krypton was unable to escape the kernel without diffusion, or that a combination of pre- and post-failure diffusion accompanied partial or complete failure.

  13. Flood volume estimation in Switzerland using synthetic design hydrographs - a multivariate statistical approach

    OpenAIRE

    Brunner, Manuela Irene; Vannier, Olivier; Favre, Anne-Catherine; Viviroli, Daniel; Meylan, Paul; Sikorska, Anna; Seibert, Jan

    2016-01-01

    Accurate estimations of flood peaks, volumes and hydrographs are needed to design safe and cost-effective hydraulic structures. In this study, we propose a statistical approach for the estimation of the design variables peak and volume by constructing a synthetic design hydrograph. Our approach is based on fitting probability density functions to observed flood hydrographs and takes the dependence between the two variables peak and volume into account. The method consists of the following six...

  14. Theoretical and experimental performance analysis for cold trap design

    Energy Technology Data Exchange (ETDEWEB)

    Hemanath, M.G., E-mail: hemanath@igcar.gov.i [Fast Reactor Technology Group, Indira Gandhi Center for Atomic Research, Kalpakkam (India); Meikandamurthy, C.; Kumar, A. Ashok; Chandramouli, S.; Rajan, K.K.; Rajan, M.; Vaidyanathan, G.; Padmakumar, G.; Kalyanasundaram, P.; Raj, Baldev [Fast Reactor Technology Group, Indira Gandhi Center for Atomic Research, Kalpakkam (India)

    2010-10-15

    A cold trap is a purification unit used in the sodium systems of FBRs to maintain the oxygen/hydrogen level in sodium within acceptable limits. It works on the principle of crystallization and precipitation of oxides/hydrides of sodium in a wire mesh when the temperature of the sodium is reduced below the saturation temperature. The cold traps presently used have lower effectiveness and get plugged prematurely. The plugged cold traps are cleaned and then put back into service. Frequent cleaning of cold traps results in long downtime of the sodium system. A new cold trap design has been conceived to overcome the above problems. Mathematical modeling of the new design was carried out and its effectiveness was validated against experimental test results. This paper shares the experience gained with the new cold trap design for FBRs.

  15. Single-subject experimental design for evidence-based practice.

    Science.gov (United States)

    Byiers, Breanne J; Reichle, Joe; Symons, Frank J

    2012-11-01

    Single-subject experimental designs (SSEDs) represent an important tool in the development and implementation of evidence-based practice in communication sciences and disorders. The purpose of this article is to review the strategies and tactics of SSEDs and their application in speech-language pathology research. The authors discuss the requirements of each design, followed by advantages and disadvantages. The logic and methods for evaluating effects in SSED are reviewed as well as contemporary issues regarding data analysis with SSED data sets. Examples of challenges in executing SSEDs are included. Specific exemplars of how SSEDs have been used in speech-language pathology research are provided throughout. SSED studies provide a flexible alternative to traditional group designs in the development and identification of evidence-based practice in the field of communication sciences and disorders.

  16. [Again review of research design and statistical methods of Chinese Journal of Cardiology].

    Science.gov (United States)

    Kong, Qun-yu; Yu, Jin-ming; Jia, Gong-xian; Lin, Fan-li

    2012-11-01

    To re-evaluate and compare the research designs and the use of statistical methods in the Chinese Journal of Cardiology, we summarized the research designs and statistical methods in all original papers published in the journal in 2011 and compared the results with the evaluation of 2008. (1) There was no difference between the two volumes in the distribution of research designs. Compared with the earlier volume, the use of survival regression and non-parametric tests increased, while the proportion of articles with no statistical analysis decreased. (2) The proportions of flawed articles in the later volume were significantly lower than in the former: 6 (4%) with flaws in design, 5 (3%) with flaws in expression, and 9 (5%) with incomplete analysis. (3) The rate of correct use of analysis of variance increased, as did multi-group comparisons and tests of normality. The error rate of usage changed from 17% to 25%, without statistical significance, owing to neglect of the test of homogeneity of variance. The Chinese Journal of Cardiology showed many improvements, such as the regulation of design and statistics. The homogeneity of variance should receive more attention in future applications.

  17. Amplified energy harvester from footsteps: design, modeling, and experimental analysis

    Science.gov (United States)

    Wang, Ya; Chen, Wusi; Guzman, Plinio; Zuo, Lei

    2014-04-01

    This paper presents the design, modeling and experimental analysis of an amplified footstep energy harvester. With the unique design of the amplified piezoelectric stack harvester, the kinetic energy generated by footsteps can be effectively captured and converted into usable DC power that could potentially power many electric devices, such as smart phones, sensors, and monitoring cameras. This doormat-like energy harvester can be used in crowded places such as train stations, malls, concerts, airport escalator/elevator/stairs entrances, or anywhere large groups of people walk. The harvested energy provides an alternative renewable green power source to reduce demand on grids, which run on highly polluting and global-warming-inducing fossil fuels. In this paper, two modeling approaches are compared to calculate power output. The first method is derived from the single-degree-of-freedom (SDOF) constitutive equations, with a correction factor then applied to the resulting electromechanically coupled equations of motion. The second approach is to derive the coupled equations of motion with Hamilton's principle and the constitutive equations, and then formulate them with the finite element method (FEM). Experimental testing results are presented to validate the modeling approaches. Simulation results from both approaches agree very well with experimental results, with percentage errors of 2.09% for FEM and 4.31% for SDOF.

  18. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Directory of Open Access Journals (Sweden)

    Patrick Wessa

    BACKGROUND: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. OBJECTIVES: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. METHODS: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. RESULTS: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student

  19. LOGICAL AND EXPERIMENTAL DESIGN FOR PHENOL DEGRADATION USING IMMOBILIZED ACINETOBACTER SP. CULTURE

    Directory of Open Access Journals (Sweden)

    Amro Abd Al Fattah Amara

    2010-05-01

    Phenol degradation proceeds through a series of enzymatic reactions and is affected by different components of the microbial metabolic flux. Optimization strategies such as mutagenesis can lead to successful optimization, but may also cause the loss of important microbial features or the release of new virulence or other unexpected characteristics. Plackett-Burman designs close much of the gap between optimization, safety, time, cost, man-hours, and the complexity of the metabolic flux. Using a Plackett-Burman experimental design allows the factors affecting the optimization process to be mapped by understanding the nutrient requirements and the best environmental conditions. In this study, nine variables, pH (X1), temperature (X2), glucose (X3), yeast extract (X4), meat extract (X5), NH4NO3 (X6), K-salt (X7), Mg-salt (X8) and trace elements (X9), were optimized during phenol degradation by Acinetobacter sp. using the Plackett-Burman design method. The Plackett-Burman design comprised 16 experiments, with each variable used at two levels: low [-1] and high [+1]. According to the Plackett-Burman design experiments, the maximum degradation rate was 31.25 mg/l/h. Logical and statistical analysis of the data led to the selection of pH, temperature and meat extract as the three factors affecting the phenol degradation rate. These three variables were then used in a Box-Behnken experimental design for further optimization. Meat extract, which was not statistically recommended for optimization, was retained because it can substitute for trace elements, which were statistically significant. Glucose, although statistically significant, was not included because it had a negative effect and gave the best result at 0 g/l; it was therefore completely omitted from the medium. pH, temperature and meat extract were used in fifteen experiments, each at three levels (-1, 0, and +1) according to the Box-Behnken design. The Microsoft Excel 2002 Solver tool was used to optimize the model created from the Box-Behnken design. The
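
    A hedged sketch of constructing a 16-run two-level screening design of the kind used above, via the Sylvester Hadamard construction; assigning the nine variables (pH, temperature, glucose, ...) to nine of the fifteen available columns is left to the analyst, and this is an illustration rather than the authors' exact design.

        # 16-run two-level screening design from a Sylvester Hadamard matrix.
        import numpy as np

        H2 = np.array([[1, 1], [1, -1]])
        H = H2
        for _ in range(3):
            H = np.kron(H, H2)          # Sylvester construction -> 16 x 16 Hadamard

        design = H[:, 1:]               # drop the constant column: 16 runs, 15 columns
        # assign the nine factors (pH, temperature, glucose, ...) to nine columns
        print(design.shape)             # (16, 15)
        print(design.T @ design)        # 16 * I -> orthogonal main-effect estimates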

  20. Understanding the FDA guidance on adaptive designs: historical, legal, and statistical perspectives.

    Science.gov (United States)

    Liu, Qing; Chi, George Y H

    2010-11-01

    The recent Food and Drug Administration (FDA) guidance for industry on adaptive designs is perhaps one of the important undertakings by CDER/CBER Office of Biostatistics. Undoubtedly, adaptive designs may affect almost all phases of clinical development and impact nearly all aspects of clinical trial planning, execution and statistical inference. Thus, it is a significant accomplishment for the Office of Biostatistics to develop this well-thought-out and all-encompassing guidance document. In this paper, we discuss some critical topical issues of adaptive designs with supporting methodological work from either existing literature, additional technical notes, or accompanying papers. In particular, we provide numerous sources of design, conduct, analysis, and interpretation bias that arise from statistical procedures. We illustrate, as a result, and caution that substantial research is necessary for many adaptive designs to meet required scientific standards prior to their applications in clinical trials.

  1. Stimulated Brillouin scattering materials, experimental design and applications: A review

    Science.gov (United States)

    Bai, Zhenxu; Yuan, Hang; Liu, Zhaohong; Xu, Pengbai; Gao, Qilin; Williams, Robert J.; Kitzler, Ondrej; Mildren, Richard P.; Wang, Yulei; Lu, Zhiwei

    2018-01-01

    Stimulated Brillouin scattering (SBS), a third-order nonlinear-optics effect, has been extensively exploited and rapidly developed in the field of lasers and optoelectronics. A large number of theoretical and experimental studies on SBS have been carried out in the past decades. In particular, the exploration of new SBS materials and new types of SBS modulation methods has proceeded in parallel, because the properties of different materials strongly influence SBS performance measures such as the generation threshold, Brillouin amplification efficiency, frequency shift, and breakdown threshold. This article provides a comprehensive review of the characteristics of different types of SBS materials, SBS applications, experimental design methods, and parameter optimization methods, and is expected to provide reference and guidance for SBS-related experiments.

  2. Computational Design and Experimental Validation of New Thermal Barrier Systems

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Shengmin; Yang, Shizhong; Khosravi, Ebrahim

    2014-04-01

    This project (10/01/2010-9/30/2014), “Computational Design and Experimental Validation of New Thermal Barrier Systems”, originates from the Louisiana State University (LSU) Mechanical Engineering Department and the Southern University (SU) Department of Computer Science. This project will directly support the technical goals specified in DE-FOA-0000248, Topic Area 3: Turbine Materials, by addressing key technologies needed to enable the development of advanced turbines and turbine-based systems that will operate safely and efficiently using coal-derived synthesis gases. In this project, the focus is to develop and implement a novel molecular dynamics method to improve the efficiency of simulations of novel TBC materials; perform high performance computing (HPC) on complex TBC structures to screen the most promising TBC compositions; perform material characterizations and oxidation/corrosion tests; and demonstrate our new thermal barrier coating (TBC) systems experimentally under integrated gasification combined cycle (IGCC) environments.

  3. Computational Design and Experimental Validation of New Thermal Barrier Systems

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Shengmin; Yang, Shizhong; Khosravi, Ebrahim

    2012-10-01

    This project (10/01/2010-9/30/2013), “Computational Design and Experimental Validation of New Thermal Barrier Systems”, originates from the Louisiana State University (LSU) Mechanical Engineering Department and the Southern University (SU) Department of Computer Science. This project will directly support the technical goals specified in DE-FOA-0000248, Topic Area 3: Turbine Materials, by addressing key technologies needed to enable the development of advanced turbines and turbine-based systems that will operate safely and efficiently using coal-derived synthesis gases. We will develop and implement a novel molecular dynamics method to improve the efficiency of simulations of novel TBC materials; perform high performance computing (HPC) on complex TBC structures to screen the most promising TBC compositions; perform material characterizations and oxidation/corrosion tests; and demonstrate our new thermal barrier coating (TBC) systems experimentally under integrated gasification combined cycle (IGCC) environments. The durability of the coating will be examined using the proposed Durability Test Rig.

  4. Designing Solutions by a Student Centred Approach: Integration of Chemical Process Simulation with Statistical Tools to Improve Distillation Systems

    Directory of Open Access Journals (Sweden)

    Isabel M. Joao

    2017-09-01

    Full Text Available Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of such projects to improving a student-centred approach and boosting higher-order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master students to model distillation systems, together with statistical experimental design techniques to optimize those systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students to model steady-state processes, model dynamic processes and optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students' and teachers' perspectives.

  5. Longitudinal Assessment Design and Statistical Power for Detecting an Intervention Impact.

    Science.gov (United States)

    Petras, Hanno

    2016-10-01

    In evaluating randomized control trials (RCTs), statistical power analyses are necessary to choose a sample size which strikes the balance between an insufficient and an excessive design, with the latter leading to misspent resources. With the growing popularity of using longitudinal data to evaluate RCTs, statistical power calculations have become more complex. Specifically, with repeated measures, the number and frequency of measurements per person additionally influence statistical power by determining the precision with which intra-individual change can be measured, as well as the reliability with which inter-individual differences in change can be assessed. The application of growth mixture models has shown that the impact of universal interventions is often concentrated among a small group of individuals at the highest level of risk. General sample size calculations are consequently not sufficient to determine whether statistical power is adequate to detect the desired effect. Currently, little guidance exists to recommend a sufficient assessment design for evaluating intervention impact. To this end, Monte Carlo simulations are conducted to assess the statistical power and precision when manipulating study duration and assessment frequency. Estimates were extracted from a published evaluation of the proximal impact of the Good Behavior Game (GBG) on the developmental course of aggressive behavior. Results indicated that the number of time points and the frequency of assessments influence statistical power and precision. Recommendations for the assessment design of longitudinal studies are discussed.
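
    The simulation idea can be made concrete with a hedged, minimal sketch (not the authors' actual GBG growth mixture model): it estimates power for a treatment effect on linear growth as a function of the number of assessment waves, with effect size, variance components and sample size invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_power(n_per_arm=100, waves=5, effect=-0.15,
                   sd_slope=0.3, sd_noise=0.5, n_sims=2000):
    """Monte Carlo power for a treatment-by-time interaction: each subject
    follows y_t = (b1 + tx * effect + u) * t + e, with random slope u and
    residual e; arm slope means are compared with a two-sample z-test."""
    t = np.arange(waves, dtype=float)
    tc = t - t.mean()
    hits = 0
    for _ in range(n_sims):
        slopes = []
        for tx in (0, 1):
            b1 = 0.5 + tx * effect + rng.normal(0, sd_slope, n_per_arm)
            y = b1[:, None] * t + rng.normal(0, sd_noise, (n_per_arm, waves))
            # Per-subject OLS slope over time, then pool within the arm.
            s = (tc * (y - y.mean(axis=1, keepdims=True))).sum(axis=1) / (tc ** 2).sum()
            slopes.append(s)
        a, b = slopes
        se = np.sqrt(a.var(ddof=1) / n_per_arm + b.var(ddof=1) / n_per_arm)
        hits += abs((b.mean() - a.mean()) / se) > 1.96
    return hits / n_sims

for w in (3, 5, 7):
    print(f"{w} waves: power ~ {simulate_power(waves=w):.2f}")
```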

  6. Statistical design for recycling kaolin processing waste in the manufacturing of mullite-based ceramics

    Directory of Open Access Journals (Sweden)

    Romualdo Rodrigues Menezes

    2009-06-01

    Full Text Available Mineral extraction and processing industries have been cited as sources of environmental contamination and pollution. However, waste recycling represents an alternative recovery option, which is interesting from an environmental and economic standpoint. In this work, recycling of kaolin processing waste in the manufacture of mullite-based ceramics was investigated based on the statistical design of mixture experiments methodology. Ten formulations using kaolin processing waste, alumina and ball clay were used in the experimental design. Test specimens were fired and characterized to determine their water absorption and modulus of rupture. Regression models relating the properties to the composition were calculated. The significance and validity of the models were confirmed through statistical analysis and verification experiments. The regression models were used to analyze the influence of the waste content on the properties of the fired bodies. The results indicated that the statistical design of mixture experiments methodology can be successfully used to optimize formulations containing large amounts of waste.
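
    The regression step can be illustrated with a hedged sketch: fitting a second-order Scheffé mixture model (no intercept, as is standard for mixture experiments) to invented data for three components; the compositions and property values below are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative mixture proportions (waste, alumina, ball clay); rows sum to 1.
X = np.array([
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
    [1/3, 1/3, 1/3], [0.6, 0.2, 0.2], [0.2, 0.6, 0.2], [0.2, 0.2, 0.6],
])
y = np.array([12.1, 4.3, 8.0, 7.2, 9.5, 5.9, 7.0, 9.1, 5.6, 7.4])  # e.g. water absorption, %

def scheffe_quadratic(X):
    """Design matrix for the Scheffe quadratic mixture model:
    terms x1, x2, x3, x1*x2, x1*x3, x2*x3 (no intercept)."""
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

Z = scheffe_quadratic(X)
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(dict(zip(["b1", "b2", "b3", "b12", "b13", "b23"], np.round(coef, 2))))

# Predict a new candidate formulation (40% waste, 30% alumina, 30% clay).
print(scheffe_quadratic(np.array([[0.4, 0.3, 0.3]])) @ coef)
```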

  7. Theoretical and experimental investigations on fracture statistics carried out at the IDIEM (Chile)

    Directory of Open Access Journals (Sweden)

    Kittl, P.

    1986-09-01

    Full Text Available The high demands placed on some structures, owing to their responsibility or their high cost, have given rise to a new discipline that can be called Reliability Engineering, whose main aim is to determine the probability with which a device fulfils a requirement. This work contains a brief description of the topics studied at the IDIEM in recent years within this field. Among them is Fracture Statistics, which studies the probability that a structure undergoes plastic deformation, and the probability of occurrence of its causes, taking materials fatigue into account. The work also includes a theoretical development of fracture statistics, describing the specific-risk-of-fracture functions by means of integral equations, and the determination of their parameters and uncertainties when the functions have a known analytical form. Experimental research ranges from very brittle bodies, such as glass, through almost brittle ones, such as cement paste, to others that admit plastic deformation, such as certain weldings, and extends the study to fibre composites and natural materials such as granite.

    The high demands required for some structures of special responsibility or very high cost have given rise to a new discipline that may be called Reliability Engineering, whose main objective is to determine the probability with which a device can fulfil a requirement. This paper presents a brief description of the topics treated by the IDIEM in recent years within this discipline. Among them is the Statistical Mechanics of Fracture, which studies the probability that a structure deforms plastically, and the probability of occurrence of the causes, taking into account the fatigue of materials. A theoretical development of the statistical mechanics of fracture is included, describing the specific-risk functions of
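
    Fracture statistics of brittle materials is most often summarized by a Weibull distribution for strength. As a hedged illustration (the specific-risk formulation via integral equations in the paper is more general), the sketch below fits a two-parameter Weibull to synthetic strength data by maximum likelihood using scipy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Made-up fracture strengths (MPa) for, say, glass specimens.
strengths = stats.weibull_min.rvs(c=8.0, scale=70.0, size=60, random_state=rng)

# Fit the shape (Weibull modulus m) and scale; location pinned at 0.
m, loc, s0 = stats.weibull_min.fit(strengths, floc=0)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {s0:.1f} MPa")

# Probability of fracture below a service stress of 40 MPa.
print(f"P(fracture at 40 MPa) = {stats.weibull_min.cdf(40, m, 0, s0):.3%}")
```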

  8. Trials in primary care: statistical issues in the design, conduct and evaluation of complex interventions.

    Science.gov (United States)

    Lancaster, G A; Campbell, M J; Eldridge, S; Farrin, A; Marchant, M; Muller, S; Perera, R; Peters, T J; Prevost, A T; Rait, G

    2010-08-01

    Trials carried out in primary care typically involve complex interventions that require considerable planning if they are to be implemented successfully. The role of the statistician in promoting both robust study design and appropriate statistical analysis is an important contribution to a multi-disciplinary primary care research group. Issues in the design of complex interventions have been addressed in the Medical Research Council's new guidance document and over the past 7 years by the Royal Statistical Society's Primary Health Care Study Group. With the aim of raising the profile of statistics and building research capability in this area, particularly with respect to methodological issues, the study group meetings have covered a wide range of topics that have been of interest to statisticians and non-statisticians alike. The aim of this article is to provide an overview of the statistical issues that have arisen over the years related to the design and evaluation of trials in primary care, to provide useful examples and references for further study and ultimately to promote good practice in the conduct of complex interventions carried out in primary care and other health care settings. Throughout we have given particular emphasis to statistical issues related to the design of cluster randomised trials.
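
    One recurring design issue in cluster randomised trials is inflation of the required sample size by the design effect DEFF = 1 + (m - 1) x ICC for clusters of size m. A minimal sketch of that standard calculation, with an illustrative ICC and cluster size rather than values from the article:

```python
import math

def cluster_sample_size(n_individual: int, cluster_size: int, icc: float) -> int:
    """Inflate an individually randomised sample size by the design
    effect DEFF = 1 + (m - 1) * ICC for clusters of size m."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * deff)

# E.g. 260 patients needed under individual randomisation, practices of
# 20 patients each, ICC = 0.05 (a typical magnitude in primary care).
n = cluster_sample_size(260, 20, 0.05)
print(n, "patients ->", math.ceil(n / 20), "practices in total")
```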

  9. Optimization of formulation variables of benzocaine liposomes using experimental design.

    Science.gov (United States)

    Mura, Paola; Capasso, Gaetano; Maestrelli, Francesca; Furlanetto, Sandra

    2008-01-01

    This study aimed to optimize, by means of an experimental design multivariate strategy, a liposomal formulation for topical delivery of the local anaesthetic agent benzocaine. The formulation variables for the vesicle lipid phase were the use of potassium glycyrrhizinate (KG) as an alternative to cholesterol and the addition of a cationic (stearylamine) or anionic (dicethylphosphate) surfactant (qualitative factors); the percentage of ethanol and the total volume of the hydration phase (quantitative factors) were the variables for the hydrophilic phase. The combined influence of these factors on the considered responses, encapsulation efficiency (EE%) and percent drug permeated at 180 min (P%), was evaluated by means of a D-optimal design strategy. Graphic analysis of the effects indicated that maximizing the selected responses required opposite levels of the considered factors: for example, KG and stearylamine were better for increasing EE%, and cholesterol and dicethylphosphate for increasing P%. In the second step, the Doehlert design, applied for the response-surface study of the quantitative factors, pointed out a negative interaction between the percentage of ethanol and the volume of the hydration phase and allowed prediction of the best formulation for maximizing the drug permeation rate. Experimental P% data of the optimized formulation were inside the confidence interval (P < 0.05) calculated around the predicted value of the response. This proved the suitability of the proposed approach for optimizing the composition of liposomal formulations and predicting the effects of formulation variables on the considered experimental response. Moreover, the optimized formulation enabled a significant improvement (P < 0.05) of the drug's anaesthetic effect with respect to the starting reference liposomal formulation, demonstrating its genuinely better therapeutic effectiveness.
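
    As a hedged illustration of the D-optimal idea, maximizing the determinant of the information matrix X'X over a candidate set of runs, the sketch below greedily assembles a design for a two-factor quadratic model. It is a toy exchange-style procedure in coded units, not the software the authors used.

```python
import numpy as np
from itertools import product

def model_row(e, v):
    """Quadratic response-surface model in two coded factors,
    e.g. ethanol % (e) and hydration volume (v)."""
    return np.array([1, e, v, e * v, e ** 2, v ** 2])

# Candidate grid of runs (coded levels -1..1 for the two factors).
cands = np.array([model_row(e, v)
                  for e, v in product(np.linspace(-1, 1, 5), repeat=2)])

def greedy_d_optimal(cands, n_runs):
    """Greedily add the candidate that maximizes det(X'X + ridge);
    the tiny ridge keeps the determinant defined for small designs."""
    chosen = []
    for _ in range(n_runs):
        best, best_det = 0, -np.inf
        for i in range(len(cands)):
            X = np.array(chosen + [cands[i]])
            d = np.linalg.det(X.T @ X + 1e-8 * np.eye(X.shape[1]))
            if d > best_det:
                best, best_det = i, d
        chosen.append(cands[best])
    return np.array(chosen)

design = greedy_d_optimal(cands, n_runs=9)
print(np.round(design[:, 1:3], 2))   # the selected (e, v) levels
```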

  10. Optimal experimental design to position transducers in ultrasound breast imaging

    Science.gov (United States)

    Korta Martiartu, Naiara; Boehm, Christian; Vinard, Nicolas; Jovanović Balic, Ivana; Fichtner, Andreas

    2017-03-01

    We present methods to optimize the setup of a 3D ultrasound tomography scanner for breast cancer detection. This approach provides a systematic and quantitative tool to evaluate different designs and to optimize the configuration with respect to predefined design parameters. We consider both time-of-flight inversion using straight rays and time-domain waveform inversion governed by the acoustic wave equation for imaging the sound speed. In order to compare different designs, we measure their quality by extracting properties from the Hessian operator of the time-of-flight or waveform differences defined in the inverse problem, i.e., the second derivatives with respect to the sound speed. Spatial uncertainties and resolution can be related to the eigenvalues of the Hessian, which provide a good indication of the information contained in the data acquired with a given design. However, the complete spectrum is often prohibitively expensive to compute, so suitable approximations have to be developed and analyzed. We use the trace of the Hessian operator as the design criterion, which is equivalent to the sum of all eigenvalues and requires less computational effort. In addition, we suggest taking advantage of the spatial symmetry to extrapolate the 3D experimental design from a set of 2D configurations. In order to maximize the quality criterion, we use a genetic algorithm to explore the space of possible design configurations. Numerical results show that the proposed strategies are capable of improving an initial configuration with uniformly distributed transducers, clustering them around regions with poor illumination and improving the ray coverage of the domain of interest.
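
    A hedged toy version of the trace criterion: for straight-ray time-of-flight over a small pixel grid, the Gauss-Newton Hessian is J'J, where each row of J holds a ray's path length per pixel, and candidate transducer layouts are ranked by trace(J'J). The geometry, grid size, and genetic-algorithm settings below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
GRID = 4                                     # 4x4 pixel domain on [0, 1]^2

def jacobian(transducers):
    """One row per transducer pair: approximate ray path length in each
    pixel, obtained by dense sampling along the straight ray."""
    pts = np.asarray(transducers)
    rows = []
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            seg = np.linspace(pts[i], pts[j], 200)
            step = np.linalg.norm(pts[j] - pts[i]) / 199
            cells = np.clip((seg * GRID).astype(int), 0, GRID - 1)
            row = np.zeros(GRID * GRID)
            np.add.at(row, cells[:, 0] * GRID + cells[:, 1], step)
            rows.append(row)
    return np.array(rows)

def trace_criterion(transducers):
    J = jacobian(transducers)
    return np.trace(J.T @ J)                 # sum of Hessian eigenvalues

def on_circle(angles):
    """Transducer positions on a circle inside the unit square."""
    return 0.5 + 0.48 * np.column_stack([np.cos(angles), np.sin(angles)])

# Tiny genetic algorithm over the angles of 8 transducers.
pop = rng.uniform(0, 2 * np.pi, (16, 8))
for gen in range(20):
    scores = np.array([trace_criterion(on_circle(a)) for a in pop])
    parents = pop[np.argsort(scores)[-8:]]               # keep best half
    children = parents[rng.integers(0, 8, 8)] + rng.normal(0, 0.1, (8, 8))
    pop = np.vstack([parents, children % (2 * np.pi)])

best = pop[np.argmax([trace_criterion(on_circle(a)) for a in pop])]
print("best angles (deg):", np.round(np.degrees(np.sort(best)), 1))
```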

  11. Toxic responses to defined chemical mixtures: mathematical models and experimental designs.

    Science.gov (United States)

    Michaud, J P; Gandolfi, A J; Brendel, K

    1994-01-01

    The problem and relevance of assessing biological responses to chemical mixtures are presented with reference to the literature on this problem and its possible solutions. This review is intended for a general audience as an introduction to, and comment on, assessing the interactions of defined mixtures of xenobiotics. The focus is on experimental toxicology; however, the methods are also applicable to pharmacology. Much of the literature on this topic is quite specialized in statistics, theory, or specific applications. This may deter a significant portion of the growing number of investigators in this field from using this literature, and may partially account for the persistent use of methods which have been shown to permit precarious conclusions. References are given for some of the most comprehensive and recent work and reviews on the subject. The reader is given some familiarity with the topic's basic problems and ideas, and with the controversy over terminology. One example is presented of a popular experimental design and data analysis method which, while applicable in some situations, has been shown to lead to precarious and even erroneous conclusions. Eight other methods of data analysis are briefly presented, and some of their advantages, disadvantages, assumptions, and limitations are discussed. These methods were selected to illustrate similarities and differences in the various approaches taken in addressing this problem. Three basic types of experimental design appropriate to these kinds of studies are outlined. General considerations, suggested guidelines, and possible pitfalls in the experimental design and data analysis of biological responses to chemical mixtures are discussed.

  12. Entropy-Based Experimental Design for Optimal Model Discrimination in the Geosciences

    Directory of Open Access Journals (Sweden)

    Wolfgang Nowak

    2016-11-01

    Full Text Available Choosing between competing models lies at the heart of scientific work, and is a frequent motivation for experimentation. Optimal experimental design (OD) methods maximize the benefit of experiments towards a specified goal. We advance and demonstrate an OD approach to maximize the information gained towards model selection. We make use of so-called model choice indicators, which are random variables with an expected value equal to Bayesian model weights. Their uncertainty can be measured with Shannon entropy. Since the experimental data are still random variables in the planning phase of an experiment, we use mutual information (the expected reduction in Shannon entropy) to quantify the information gained from a proposed experimental design. For implementation, we use the Preposterior Data Impact Assessor framework (PreDIA), because it is free of the lower-order approximations of mutual information often found in the geosciences. In comparison to other studies in statistics, our framework is not restricted to sequential design or to discrete-valued data, and it can handle measurement errors. As an application example, we optimize an experiment about the transport of contaminants in clay, featuring the problem of choosing between competing isotherms to describe sorption. We compare the results of optimizing towards maximum model discrimination with an alternative OD approach that minimizes the overall predictive uncertainty under model choice uncertainty.
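
    To make the mutual-information criterion concrete, here is a hedged toy example: two competing sorption models (a linear and a Freundlich-type isotherm), a candidate measurement concentration, and a Monte Carlo estimate of the expected reduction in Shannon entropy of the model indicator. The model forms, parameter values, and noise level are invented and are much simpler than the PreDIA framework.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 0.05                                   # assumed measurement noise

def m_linear(c):      return 0.8 * c           # linear isotherm (toy)
def m_freundlich(c):  return 0.6 * c ** 0.7    # Freundlich-type isotherm (toy)

MODELS = [m_linear, m_freundlich]

def expected_info_gain(c, n_mc=20000):
    """Monte Carlo estimate of the mutual information between the model
    indicator (uniform prior) and a noisy observation at concentration c."""
    m_idx = rng.integers(0, 2, n_mc)
    means = np.where(m_idx == 0, m_linear(c), m_freundlich(c))
    y = means + rng.normal(0, SIGMA, n_mc)
    # Posterior model weights given y (Gaussian likelihoods, equal priors).
    like = np.array([np.exp(-0.5 * ((y - m(c)) / SIGMA) ** 2) for m in MODELS])
    post = like / like.sum(axis=0)
    h_post = -(post * np.log2(np.clip(post, 1e-12, 1))).sum(axis=0)
    return 1.0 - h_post.mean()                 # prior entropy is 1 bit

for c in (0.1, 0.5, 1.0, 2.0):
    print(f"c = {c}: expected information gain ~ {expected_info_gain(c):.3f} bits")
```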

  13. Designing Awe in Virtual Reality: An Experimental Study

    Directory of Open Access Journals (Sweden)

    Alice Chirico

    2018-01-01

    Full Text Available Awe is a little-studied emotion with a great transformative potential, and interest in the study of awe's underlying mechanisms has therefore increased. Specifically, researchers have been interested in how to reproduce intense feelings of awe under laboratory conditions. It has been proposed that the use of virtual reality (VR) could be an effective way to induce awe in controlled experimental settings, thanks to its ability to provide participants with a sense of “presence,” that is, the subjective feeling of being displaced in another physical or imaginary place. However, the potential of VR as an awe-inducing medium has not been fully tested yet. In the present study, we provide an evidence-based design and a validation of four immersive virtual environments (VEs) involving 36 participants in a within-subject design. Of these, three VEs were designed to induce awe, whereas the fourth VE was targeted as an emotionally neutral stimulus. Participants self-reported the extent to which they felt awe, general affect and sense of presence related to each environment. As expected, results showed that the awe-VEs induced significantly higher levels of awe and presence compared with the neutral VE. Furthermore, these VEs induced significantly more positive than negative affect. These findings support the potential of immersive VR for inducing awe and provide useful indications for the design of awe-inspiring virtual environments.

  14. Design of Experimental Suspended Footbridge with Deck Made of UHPC

    Directory of Open Access Journals (Sweden)

    Blank Marek

    2016-01-01

    Full Text Available This paper deals with the static and dynamic design of an experimental footbridge for pedestrians and cyclists in the municipality of Lužec nad Vltavou in the Czech Republic. The work aims to familiarize the reader with the calculations carried out and the results obtained, describing the static and dynamic properties of the proposed footbridge. The footbridge is designed as a suspended structure with a prestressed deck consisting of prefabricated UHPC panels and an inverted-“V”-shaped steel pylon approximately 40 meters high. The deck is anchored to the steel pylon by 24 steel hangers in one row: 17 ropes in the main span and 7 cables on the other side. The main span is 99.18 meters and the secondary span is 31.9 meters. The deck width is 4.5 meters, with a 3.0-meter passing space. The bridge is designed to allow the passage of vehicles weighing up to 3.5 tons. The deck panels are made of reinforced UHPC. At the edge of the bridge on the side of the shorter span, the deck is firmly connected with the abutment, while at the other end it rests on a pair of sliding bearings. Exploiting the excellent properties of UHPC allows the design of a very thin and lightweight deck, which could not be achieved with normal concrete.

  15. Experimental Design for Hanford Low-Activity Waste Glasses with High Waste Loading

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crum, Jarrod V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-24

    This report discusses the development of an experimental design for the initial phase of the Hanford low-activity waste (LAW) enhanced glass study. This report is based on a manuscript written for an applied statistics journal. Appendices A, B, and E include additional information relevant to the LAW enhanced glass experimental design that is not included in the journal manuscript. The glass composition experimental region is defined by single-component constraints (SCCs), linear multiple-component constraints (MCCs), and a nonlinear MCC involving 15 LAW glass components. Traditional methods and software for designing constrained mixture experiments with SCCs and linear MCCs are not directly applicable because of the nonlinear MCC. A modification of existing methodology to account for the nonlinear MCC was developed and is described in this report. One of the glass components, SO3, has a solubility limit in glass that depends on the composition of the balance of the glass. A goal was to design the experiment so that SO3 would not exceed its predicted solubility limit for any of the experimental glasses. The SO3 solubility limit had previously been modeled by a partial quadratic mixture model expressed in the relative proportions of the 14 other components. The partial quadratic mixture model was used to construct a nonlinear MCC in terms of all 15 components. In addition, there were SCCs and linear MCCs. This report describes how a layered design was generated to (i) account for the SCCs, linear MCCs, and nonlinear MCC and (ii) meet the goals of the study. A layered design consists of points on an outer layer, an inner layer, and a center point. There were 18 outer-layer glasses chosen using optimal experimental design software to augment 147 existing glass compositions that were within the LAW glass composition experimental region. Then 13 inner-layer glasses were chosen with the software to augment the existing and outer

  16. An Experimental Design of Bypass Magneto-Rheological (MR) damper

    Science.gov (United States)

    Rashid, MM; Aziz, Mohammad Abdul; Raisuddin Khan, Md.

    2017-11-01

    In a magnetorheological (MR) fluid bypass damper, the fluid flows through an external bypass channel, which allows the MR fluid in the channel to be controlled. The bypass MR damper (BMRD) contains a rectangular bypass flow channel, a current-controlled movable piston shaft arrangement, and MR fluid. The static coil case used inside the piston head arrangement is wound with a coil, and the current-controlled coil case provides a magnetic flux through the BMRD cylinder for controllability. High-strength alloy steel is used for the piston shaft, which allows the magnetic flux to propagate throughout the BMRD cylinder. Using these design materials, a bypass MR damper was designed and tested. An excitation current was applied during the experiment to characterize the controllability of the BMRD. It is shown that the BMRD with an external flow channel provides a highly controllable damping force under an excitation current. The experimental damping force-displacement characteristics with and without current excitation are compared in this study. The BMRD model is validated by the experimental results at various frequencies and applied excitation currents.

  17. Experimental Vertical Stability Studies for ITER Performance and Design Guidance

    Energy Technology Data Exchange (ETDEWEB)

    Humphreys, D A; Casper, T A; Eidietis, N; Ferrera, M; Gates, D A; Hutchinson, I H; Jackson, G L; Kolemen, E; Leuer, J A; Lister, J; LoDestro, L L; Meyer, W H; Pearlstein, L D; Sartori, F; Walker, M L; Welander, A S; Wolfe, S M

    2008-10-13

    Operating experimental devices have provided key inputs to the design process for ITER axisymmetric control. In particular, experiments have quantified controllability and robustness requirements in the presence of realistic noise and disturbance environments, which are difficult or impossible to characterize with modeling and simulation alone. This kind of information is particularly critical for ITER vertical control, which poses some of the highest demands on poloidal field system performance, since the consequences of loss of vertical control can be very severe. The present work describes results of multi-machine studies performed under a joint ITPA experiment on fundamental vertical control performance and controllability limits. We present experimental results from Alcator C-Mod, DIII-D, NSTX, TCV, and JET, along with analysis of these data to provide vertical control performance guidance to ITER. Useful metrics to quantify this control performance include the stability margin and maximum controllable vertical displacement. Theoretical analysis of the maximum controllable vertical displacement suggests effective approaches to improving performance in terms of this metric, with implications for ITER design modifications. Typical levels of noise in the vertical position measurement which can challenge the vertical control loop are assessed and analyzed.

  18. Sparse linear models: Variational approximate inference and Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, Matthias W [Saarland University and Max Planck Institute for Informatics, Campus E1.4, 66123 Saarbruecken (Germany)

    2009-12-01

    A wide range of problems, such as signal reconstruction, denoising, source separation, feature selection, and graphical model search, are addressed today by posterior maximization for linear models with sparsity-favouring prior distributions. The Bayesian posterior contains useful information far beyond its mode, which can be used to drive methods for sampling optimization (active learning), feature relevance ranking, or hyperparameter estimation, if only this representation of uncertainty can be approximated in a tractable manner. In this paper, we review recent results for variational sparse inference and show that they share underlying computational primitives. We discuss how sampling optimization can be implemented as sequential Bayesian experimental design. While there has been tremendous recent activity in developing sparse estimation, little attention has been paid to sparse approximate inference. In this paper, we argue that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms. Moreover, since some variational inference methods have recently been given strong convex optimization characterizations, theoretical analysis may become possible, promising new insights into nonlinear experimental design.
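
    A hedged, much-simplified sketch of sequential Bayesian experimental design for a linear model, using a Gaussian rather than a sparsity-favouring prior so that the posterior update is exact: at each step, the candidate measurement that maximally reduces posterior entropy (the log-det of the posterior covariance) is selected. Dimensions and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
D, SIGMA2 = 8, 0.1                       # parameter dimension, noise variance

# Candidate measurement vectors (e.g., rows of a sensing matrix).
cands = rng.normal(size=(200, D))

def entropy_drop(Sigma, x):
    """Reduction in Gaussian posterior entropy from observing y = x.w + noise:
    0.5 * log(1 + x' Sigma x / sigma^2)."""
    return 0.5 * np.log1p(x @ Sigma @ x / SIGMA2)

Sigma = np.eye(D)                        # prior covariance of the weights
for step in range(10):
    gains = np.array([entropy_drop(Sigma, x) for x in cands])
    k = int(np.argmax(gains))
    # Rank-one posterior covariance update (Gaussian conjugacy).
    Sx = Sigma @ cands[k]
    Sigma = Sigma - np.outer(Sx, Sx) / (SIGMA2 + cands[k] @ Sx)
    print(f"step {step}: pick candidate {k}, info gain {gains[k]:.3f} nats")
```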

  19. Common statistical and research design problems in manuscripts submitted to high-impact medical journals.

    Science.gov (United States)

    Fernandes-Taylor, Sara; Hyun, Jenny K; Reeder, Rachelle N; Harris, Alex Hs

    2011-08-19

    To assist educators and researchers in improving the quality of medical research, we surveyed the editors and statistical reviewers of high-impact medical journals to ascertain the most frequent and critical statistical errors in submitted manuscripts. The Editors-in-Chief and statistical reviewers of the 38 medical journals with the highest impact factor in the 2007 Science Journal Citation Report and the 2007 Social Science Journal Citation Report were invited to complete an online survey about the statistical and design problems they most frequently found in manuscripts. Content analysis of the responses identified the major issues. Editors and statistical reviewers (n = 25) from 20 journals responded. Respondents described problems that we classified into two broad themes: A. statistical and sampling issues and B. inadequate reporting clarity or completeness. Problems included in the first theme were (1) inappropriate or incomplete analysis, including violations of model assumptions and analysis errors, (2) uninformed use of propensity scores, (3) failing to account for clustering in data analysis, (4) improperly addressing missing data, and (5) power/sample size concerns. Issues subsumed under the second theme were (1) inadequate description of the methods and analysis and (2) misstatement of results, including undue emphasis on p-values and incorrect inferences and interpretations. The scientific quality of submitted manuscripts would increase if researchers addressed these common design, analytical, and reporting issues. Improving the application and presentation of quantitative methods in scholarly manuscripts is essential to advancing medical research.

  20. A Statistical Framework for Single Subject Design with an Application in Post-stroke Rehabilitation

    OpenAIRE

    Lu, Ying; Scott, Marc; Raghavan, Preeti

    2016-01-01

    This paper proposes a practical yet novel solution to a longstanding statistical testing problem regarding single subject design. In particular, we aim to resolve an important clinical question: does a new patient behave the same as one from a healthy population? This question cannot be answered using the traditional single subject design when only test subject information is used, nor can it be satisfactorily resolved by comparing a single-subject's data with the mean value of a healthy popu...

  1. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-08

    In this study, we optimize the experimental setup computationally by optimal experimental design (OED) in a Bayesian framework. We approximate the posterior probability density functions (pdf) using truncated Gaussian distributions in order to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method: the mode is chosen as the maximum a posteriori (MAP) estimate, and the covariance is chosen as the negative inverse of the Hessian of the misfit function at the MAP estimate. The model-related entities are obtained from a polynomial surrogate. The optimality, quantified by information gain measures, can be estimated efficiently by a rejection sampling algorithm against the underlying Gaussian probability distribution, rather than against the true posterior. This approach offers a significant error reduction when the magnitudes of the invariants of the posterior covariance are comparable to the size of the bounded domain of the prior. We demonstrate the accuracy and superior computational efficiency of our method for shock-tube experiments aiming to measure the model parameters of a key reaction which is part of the complex kinetic network describing hydrocarbon oxidation. In the experiments, the initial temperature and fuel concentration are optimized with respect to the expected information gain in the estimation of the parameters of the target reaction rate. We show that the expected information gain surface can change its shape dramatically according to the level of noise introduced into the synthetic data. The information that can be extracted from the data saturates as a logarithmic function of the number of experiments, and few experiments are needed when they are conducted at the optimal experimental design conditions.

  2. Experimental Design for the INL Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Piepel, Gregory F.; Matzke, Brett D.; Filliben, James J.; Jones, Barbara

    2007-12-13

    This document describes the test events and numbers of samples comprising the experimental design that was developed for the contamination, decontamination, and sampling of a building at the Idaho National Laboratory (INL). This study is referred to as the INL Sample Collection Operational Test. Specific objectives were developed to guide the construction of the experimental design. The main objective is to assess the relative abilities of judgmental and probabilistic sampling strategies to detect contamination in individual rooms or on a whole floor of the INL building. A second objective is to assess the use of probabilistic and Bayesian (judgmental + probabilistic) sampling strategies to make clearance statements of the form “X% confidence that at least Y% of a room (or floor of the building) is not contaminated.” The experimental design described in this report includes five test events. The test events (i) vary the floor of the building on which the contaminant will be released, (ii) provide for varying or adjusting the concentration of contaminant released to obtain the ideal concentration gradient across a floor of the building, and (iii) investigate overt as well as covert release of contaminants. The ideal contaminant gradient would have high concentrations of contaminant in rooms near the release point, with concentrations decreasing to zero in rooms at the opposite end of the building floor. For each of the five test events, the specified floor of the INL building will be contaminated with BG, a stand-in for Bacillus anthracis. The BG contaminant will be disseminated from a point-release device located in the room specified in the experimental design for each test event. Then judgmental and probabilistic samples will be collected according to the pre-specified sampling plan. Judgmental samples will be selected based on professional judgment and prior information. Probabilistic samples will be selected in sufficient numbers to provide desired confidence
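
    The quoted clearance statement corresponds to a standard nonparametric sample-size calculation: if n randomly placed samples are all clean, one can assert with confidence C that at least a fraction y of the area is uncontaminated once 1 - C >= y^n. A hedged sketch of that arithmetic (the study's Bayesian variants are more elaborate):

```python
import math

def clearance_n(confidence: float, clean_fraction: float) -> int:
    """Number of all-clean random samples needed to state, with the given
    confidence, that at least `clean_fraction` of the area is uncontaminated.
    Rationale: if the clean fraction were below y, the chance that n random
    samples all come back clean would be below y**n."""
    return math.ceil(math.log(1 - confidence) / math.log(clean_fraction))

for conf, y in [(0.95, 0.95), (0.95, 0.99), (0.99, 0.99)]:
    print(f"{conf:.0%} confidence, >= {y:.0%} clean: n = {clearance_n(conf, y)}")
```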

  3. Innovations in curriculum design: a multi-disciplinary approach to teaching statistics to undergraduate medical students.

    Science.gov (United States)

    Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J

    2008-05-01

    Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students.

  4. Methods of learning in statistical education: Design and analysis of a randomized trial

    Science.gov (United States)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small-group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. The study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (out of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus

  5. Algebraic statistics: computational commutative algebra in statistics

    CERN Document Server

    Pistone, Giovanni; Wynn, Henry P

    2000-01-01

    Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

  6. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    Energy Technology Data Exchange (ETDEWEB)

    Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-28

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for design, building, and implementation. PSD advances rely on improvements to the implemented algorithm, and these can be achieved by using conventional statistical-classifier or machine-learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework, which can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
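
    A hedged sketch of the recommended reporting: compute the ROC curve of a scalar PSD score on labelled neutron and gamma pulses, then read off the neutron acceptance at a target gamma rejection rate. The score distributions below are synthetic stand-ins, not detector data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic PSD scores: gammas cluster low, neutrons cluster high.
gamma_scores = rng.normal(0.30, 0.05, 5000)
neutron_scores = rng.normal(0.45, 0.06, 1000)

thresholds = np.linspace(0.0, 1.0, 501)
# For each threshold: pulses with score > t are accepted as "neutron".
fpr = np.array([(gamma_scores > t).mean() for t in thresholds])   # gamma leakage
tpr = np.array([(neutron_scores > t).mean() for t in thresholds]) # neutron acceptance

# Operating point at a gamma rejection rate (GRR) of 99.9%,
# i.e. a gamma false-positive rate of 1e-3.
ok = fpr <= 1e-3
print(f"At GRR 99.9%: neutron acceptance = {tpr[ok].max():.2%}")

auc = -np.trapz(tpr, fpr)   # fpr decreases with threshold, hence the sign flip
print(f"ROC AUC = {auc:.3f}")
```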

  7. Two-Stage Experimental Design for Dose-Response Modeling in Toxicology Studies.

    Science.gov (United States)

    Wang, Kai; Yang, Feng; Porter, Dale W; Wu, Nianqiang

    The efficient design of experiments (i.e., the selection of experimental doses and the allocation of animals) is important to establishing dose-response relationships in toxicology studies. The proposed procedure for the design of experiments is distinct from those in the literature because it is able to adequately accommodate the special features of dose-response data, which include non-normality, variance heterogeneity, possible nonlinearity of the dose-response curve, and data scarcity. The design procedure is built in a sequential two-stage paradigm that allows for a learning process. In the first stage, preliminary experiments are performed to gain information regarding the underlying dose-response curve and variance structure. In the second stage, the prior information obtained from the first stage is used to guide the second-stage experiments. An optimization algorithm is developed to search for the design of experiments that will lead to dose-response models of the highest quality. To evaluate model quality (or uncertainty), which is the basis of the design optimization, a bootstrapping method is employed; unlike standard statistical methods, bootstrapping is not subject to restrictive assumptions such as normality or large sample sizes. The design procedure in this paper will help to reduce the experimental cost and time in toxicology studies and alleviate sustainability concerns regarding the tremendous number of new materials and chemicals.
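
    As a hedged illustration of the bootstrap step (the full two-stage optimization is beyond a snippet), the sketch below fits a Hill-type dose-response curve with scipy and uses a case-resampling bootstrap to quantify uncertainty in the estimated ED50. The doses, responses, and curve form are fabricated for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)

def hill(d, top, ed50, slope):
    """Hill dose-response: response rises from 0 toward `top` around ed50."""
    return top * d ** slope / (ed50 ** slope + d ** slope)

doses = np.repeat([0.5, 1, 2, 4, 8, 16], 5)                 # fabricated design
resp = hill(doses, 100, 4.0, 1.5) + rng.normal(0, 8, doses.size)

p0 = (100, 4, 1)
popt, _ = curve_fit(hill, doses, resp, p0=p0, maxfev=10000)

# Case-resampling bootstrap for the ED50.
ed50s = []
idx = np.arange(doses.size)
for _ in range(500):
    b = rng.choice(idx, idx.size, replace=True)
    try:
        pb, _ = curve_fit(hill, doses[b], resp[b], p0=p0, maxfev=10000)
        ed50s.append(pb[1])
    except RuntimeError:        # occasional non-convergence on odd resamples
        continue
lo, hi = np.percentile(ed50s, [2.5, 97.5])
print(f"ED50 = {popt[1]:.2f}, 95% bootstrap CI ({lo:.2f}, {hi:.2f})")
```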

  8. Experimental Design of Electrocoagulation and Magnetic Technology for Enhancing Suspended Solids Removal from Synthetic Wastewater

    Directory of Open Access Journals (Sweden)

    Moh Faiqun Ni'am

    2014-10-01

    Full Text Available Design of experiments (DOE) is a statistical method that is used as a tool to enhance and improve experimental quality. Varying the input variables (factors) of a process or system is intended to yield an optimal and satisfactory result (response). Experimental design can be defined as a test or series of tests in which the input variables of a process are purposely varied so that the causes of changes in the output (response) can be identified. This paper presents the results of an experimental design for wastewater treatment by the electrocoagulation (EC) technique. A combined magnet and electrocoagulation (EC) technology was designed to increase the settling velocity and to enhance the suspended-solid removal efficiency from wastewater samples. In this experiment, synthetic wastewater samples were prepared by mixing 700 mg of milk powder in one litre of water and treated using an acidic buffer solution. Monopolar iron (Fe) plate anodes and cathodes were employed as electrodes. The direct current was varied between 0.5 and 1.1 A, and the flowrate between 1.00 and 3.50 mL/s. One permanent magnet, AlNiCo, with a magnetic strength of 0.16 T, was used in this experiment. The results show that the magnetic field and the flowrate have major influences on suspended-solids removal. The removal efficiencies of suspended solids, turbidity and COD at optimum conditions were found to be more than 85%, 95%, and 75%, respectively.

  9. White Noise Assumptions Revisited: Regression Models and Statistical Designs for Simulation Practice

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

    Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these

  10. Experimental and Statistical Evaluation of the Size Effect on the Bending Strength of Dimension Lumber of Northeast China Larch

    Directory of Open Access Journals (Sweden)

    Yong Zhong

    2016-01-01

    Full Text Available This study investigated the size effect on the bending strength (modulus of rupture, MOR) of dimension lumber of Northeast China larch (Larix gmelinii), providing a basis for its further application in light wood frame construction. Experimental and statistical evaluations were conducted on the bending strength. A total of 2409 full-size dimension lumber samples of three different sizes (2 × 3, 2 × 4, and 2 × 6) were tested in static bending. The results indicate that size has a significant effect on the MOR. Both the chi-square (χ2) and Kolmogorov-Smirnov (K-S) test results show that the lognormal distribution generally fits the MOR better than the normal distribution. Additionally, the effects of the partial safety factor (γR) and the live-to-dead load ratio (ρ) were studied by reliability analysis. The reliability analysis results indicate that the reliability index increases nonlinearly with decreasing γR and increasing ρ. Finally, the design value of the bending strength and its size-effect adjustment factor for 2 × 3, 2 × 4, and 2 × 6 larch dimension lumber were obtained according to the reliability-index requirements of the Chinese National Standards.
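
    A hedged sketch of the distribution-fitting step: fit normal and lognormal models to a bending-strength sample and compare them with the Kolmogorov-Smirnov test using scipy. The sample below is synthetic, not the larch data, and the K-S p-values are only indicative because the parameters are estimated from the same sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic MOR sample (MPa), right-skewed like typical strength data.
mor = stats.lognorm.rvs(s=0.25, scale=40.0, size=300, random_state=rng)

# Fit both candidate distributions.
mu, sd = stats.norm.fit(mor)
s, loc, scale = stats.lognorm.fit(mor, floc=0)

# K-S goodness of fit against each fitted distribution.
ks_norm = stats.kstest(mor, 'norm', args=(mu, sd))
ks_logn = stats.kstest(mor, 'lognorm', args=(s, loc, scale))
print(f"normal:    D = {ks_norm.statistic:.3f}, p = {ks_norm.pvalue:.3f}")
print(f"lognormal: D = {ks_logn.statistic:.3f}, p = {ks_logn.pvalue:.3f}")

# 5th-percentile characteristic value from the better-fitting model.
print(f"5th percentile (lognormal): {stats.lognorm.ppf(0.05, s, loc, scale):.1f} MPa")
```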

  11. Research design and statistical methods in Indian medical journals: a retrospective survey.

    Science.gov (United States)

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. The main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design and statistical analyses, and implementation of the CONSORT checklist in randomized clinical trials (RCTs). From 2003 to 2013, the proportion of erroneous statistical analyses did not decrease (χ2 = 0.592, Φ = 0.027, p = 0.4418): 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013: the proportion of papers using statistical tests increased significantly (χ2 = 26.96, Φ = 0.16, p < 0.001), and the proportion of errors in study design decreased significantly (χ2 = 16.783, Φ = 0.12, p < 0.001). The publication of randomized clinical trial designs has remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature, both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2 = 24.477, Φ = 0.17, p < 0.001) and interpretation, and errors in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of methodological problems.

  12. New synthetic thrombin inhibitors: molecular design and experimental verification.

    Directory of Open Access Journals (Sweden)

    Elena I Sinauridze

    Full Text Available BACKGROUND: The development of new anticoagulants is an important goal for the improvement of thrombosis treatments. OBJECTIVES: The design, synthesis and experimental testing of new safe and effective small-molecule direct thrombin inhibitors for intravenous administration. METHODS: Computer-aided molecular design of new thrombin inhibitors was performed using our original docking program SOL, which is based on a genetic algorithm of global energy minimization in the framework of a Merck Molecular Force Field and takes into account the effects of solvent. The designed molecules with the best scoring functions (calculated binding energies) were synthesized and their thrombin inhibitory activity evaluated experimentally: in vitro, using a chromogenic substrate in a buffer system and a thrombin generation test in isolated plasma, and in vivo, using the newly developed model of hemodilution-induced hypercoagulation in rats. The acute toxicities of the most promising new thrombin inhibitors were evaluated in mice, and their stabilities in aqueous solutions were measured. RESULTS: New compounds were obtained that are both effective direct thrombin inhibitors (the best K(I) was …) and of low toxicity (… 1111.1 mg/kg). A plasma-substituting solution supplemented with one of the new inhibitors prevented hypercoagulation in the rat model of hemodilution-induced hypercoagulation. The activities of the best new inhibitors in physiological saline (1 µM) solutions were stable after sterilization by autoclaving, and the inhibitors remained stable during long-term storage over more than 1.5 years at room temperature and at 4°C. CONCLUSIONS: The high efficacy, stability and low acute toxicity indicate that the developed inhibitors may be promising for potential medical applications.

  13. A statistical approach for optimization of polyhydroxybutyrate production by Bacillus sphaericus NCIM 5149 under submerged fermentation using central composite design.

    Science.gov (United States)

    Ramadas, Nisha V; Soccol, Carlos R; Pandey, Ashok

    2010-10-01

    The aim of this work was to statistically optimize the cultural and nutritional parameters for the production of polyhydroxybutyrate (PHB) under submerged fermentation using jackfruit seed hydrolysate as the sole carbon source. On the basis of the results obtained from "one variable at a time" experiments, inoculum age, jackfruit seed hydrolysate concentration, and pH were selected for response surface methodology studies. A central composite design (CCD) was employed to find the optimum levels of these three factors for maximizing PHB production. The CCD results predicted that jackfruit seed hydrolysate containing 2.5% reducing sugar, an inoculum age of 18 h, and an initial medium pH of 6 could enhance the production of PHB to reach 49% of the biomass (biomass 4.5 g/l and PHB concentration 2.2 g/l). Analysis of variance exhibited high coefficient of determination (R(2)) values of 0.910 and 0.928 for biomass and PHB concentration, respectively, confirming that the quadratic model fitted the experimental data satisfactorily. This is the first report on PHB production by Bacillus sphaericus using statistical experimental design and RSM in submerged fermentation with jackfruit seed hydrolysate as the sole source of carbon.
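
    A hedged sketch of how a three-factor central composite design is laid out in coded units; the rotatable axial distance and the number of center replicates are conventional choices used for illustration, not necessarily those of the study.

```python
import numpy as np
from itertools import product

def central_composite(k=3, alpha=None, n_center=6):
    """Coded-unit CCD: a 2^k factorial cube, 2k axial (star) points at
    +-alpha, and replicated center points. alpha defaults to the
    rotatable choice (2^k) ** 0.25."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    cube = np.array(list(product([-1, 1], repeat=k)), dtype=float)
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([cube, axial, center])

design = central_composite(k=3)
print(design.shape)        # (8 + 6 + 6, 3) = (20, 3) runs
print(np.round(design, 3))
```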

  14. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  15. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  16. Protein design algorithms predict viable resistance to an experimental antifolate.

    Science.gov (United States)

    Reeve, Stephanie M; Gainza, Pablo; Frey, Kathleen M; Georgiev, Ivelin; Donald, Bruce R; Anderson, Amy C

    2015-01-20

    Methods to accurately predict potential drug target mutations in response to early-stage leads could drive the design of more resilient first generation drug candidates. In this study, a structure-based protein design algorithm (K* in the OSPREY suite) was used to prospectively identify single-nucleotide polymorphisms that confer resistance to an experimental inhibitor effective against dihydrofolate reductase (DHFR) from Staphylococcus aureus. Four of the top-ranked mutations in DHFR were found to be catalytically competent and resistant to the inhibitor. Selection of resistant bacteria in vitro reveals that two of the predicted mutations arise in the background of a compensatory mutation. Using enzyme kinetics, microbiology, and crystal structures of the complexes, we determined the fitness of the mutant enzymes and strains, the structural basis of resistance, and the compensatory relationship of the mutations. To our knowledge, this work illustrates the first application of protein design algorithms to prospectively predict viable resistance mutations that arise in bacteria under antibiotic pressure.

  17. Statistical controversies in clinical research: requiem for the 3 + 3 design for phase I trials.

    Science.gov (United States)

    Paoletti, X; Ezzalfani, M; Le Tourneau, C

    2015-09-01

    More than 95% of published phase I trials have used the 3 + 3 design to identify the dose to be recommended for phase II trials. However, the statistical community agrees on the limitations of the 3 + 3 design compared with model-based approaches. Moreover, the mechanisms of action of targeted agents strongly challenge the hypothesis that the maximum tolerated dose constitutes the optimal dose, and more outcomes, including clinical and biological activity, increasingly need to be taken into account to identify the optimal dose. We review key elements from clinical publications and from the statistical literature to show that the 3 + 3 design lacks the necessary flexibility to address the challenges of targeted agents. The design issues raised by expansion cohorts, new definitions of dose-limiting toxicity and trials of combinations are not easily addressed by the 3 + 3 design or its extensions. Alternative statistical proposals have been developed to make better use of the complex data generated by phase I trials. Their application requires a close collaboration between all actors of early phase clinical trials. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
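
    For readers who have not seen it spelled out, the escalation rule under discussion fits in a few lines; the sketch below is a minimal simulation of the classic 3 + 3 logic (our illustration, not code from the paper), with invented dose-toxicity probabilities.

        import random

        def three_plus_three(true_tox, seed=1):
            """Simulate one 3 + 3 trial; true_tox[i] is the DLT probability at dose i.
            Returns the index of the recommended dose, or -1 if no dose is tolerated."""
            random.seed(seed)
            dose = 0
            while True:
                dlts = sum(random.random() < true_tox[dose] for _ in range(3))
                if dlts == 1:  # ambiguous cohort: enroll three more at the same dose
                    dlts += sum(random.random() < true_tox[dose] for _ in range(3))
                if dlts >= 2:  # excess toxicity: recommend the next-lower dose
                    return dose - 1
                if dose == len(true_tox) - 1:
                    return dose  # top dose reached without excess toxicity
                dose += 1

        print(three_plus_three([0.05, 0.10, 0.25, 0.40]))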

  18. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets...... with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved...... to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods...
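
    The rank-product statistic mentioned above is simple to compute; the sketch below is our own illustration (not the authors' improved implementation), taking the geometric mean of within-replicate ranks over whichever replicates are observed, one plausible way to tolerate missing values.

        import numpy as np

        def rank_product(data):
            """data: features x replicates array of fold-changes, NaN = missing.
            Smaller values = more consistently up-regulated features."""
            ranks = np.full(data.shape, np.nan)
            for j in range(data.shape[1]):
                col = data[:, j]
                obs = ~np.isnan(col)
                order = np.argsort(-col[obs])          # rank 1 = largest fold-change
                r = np.empty(obs.sum())
                r[order] = np.arange(1, obs.sum() + 1)
                ranks[obs, j] = r / obs.sum()          # normalize for comparability
            return np.exp(np.nanmean(np.log(ranks), axis=1))

        data = np.array([[2.0, 1.8, np.nan],
                         [0.3, 0.5, 0.4],
                         [1.1, np.nan, 0.9]])
        print(rank_product(data))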

  19. Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments

    Science.gov (United States)

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log–log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
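
    As a concrete illustration of this model class (with invented partition volume, counts, and dilution factors): a partition is positive with probability p = 1 - exp(-lambda*V), so a binomial GLM with a complementary log-log link and offset log(V) puts log(lambda) on the linear-predictor scale. A sketch using statsmodels:

        import numpy as np
        import statsmodels.api as sm

        V = 0.85e-3                                  # partition volume, uL (assumed)
        n = np.array([20000, 20000, 20000])          # partitions per reaction
        k = np.array([15123, 2660, 280])             # positive partitions (fabricated)
        dilution = np.array([1.0, 0.1, 0.01])        # known dilution factors

        X = sm.add_constant(np.log(dilution))        # slope ~1 for a clean series
        model = sm.GLM(np.column_stack([k, n - k]), X,
                       family=sm.families.Binomial(link=sm.families.links.CLogLog()),
                       offset=np.log(np.full(3, V)))
        fit = model.fit()
        print(np.exp(fit.params[0]))                 # copies per uL, undiluted sample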

  20. Bayesian Optimal Experimental Design Using Multilevel Monte Carlo

    KAUST Repository

    Ben Issaid, Chaouki

    2015-05-12

    Experimental design can be vital when experiments are resource-exhaustive and time-consuming. In this work, we carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data about the model parameters. One of the major difficulties in evaluating the expected information gain is that it naturally involves nested integration over a possibly high dimensional domain. We use the Multilevel Monte Carlo (MLMC) method to accelerate the computation of the nested high dimensional integral. The advantages are twofold. First, MLMC can significantly reduce the cost of the nested integral for a given tolerance, by using an optimal sample distribution among different sample averages of the inner integrals. Second, the MLMC method imposes fewer assumptions, such as the asymptotic concentration of posterior measures, required for instance by the Laplace approximation (LA). We test the MLMC method using two numerical examples. The first example is the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Poisson equation. We place the sensors in the locations where the pressure is measured, and we model the conductivity field as a piecewise constant random vector with two parameters. The second is a chemical Enhanced Oil Recovery (EOR) core-flooding experiment assuming homogeneous permeability. We measure the cumulative oil recovery, from a horizontal core flooded by water, surfactant and polymer, for different injection rates. The model parameters consist of the endpoint relative permeabilities, the residual saturations and the relative permeability exponents for the three phases: water, oil and
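
    The nested integral is easiest to see in a toy one-dimensional linear-Gaussian problem; the plain double-loop estimator below (our illustration, all constants made up) is exactly the computation MLMC accelerates, and the closed form provides a sanity check.

        import numpy as np

        rng = np.random.default_rng(0)
        g, sigma = 2.0, 0.5                      # design parameter and noise std

        def loglik(y, theta):
            return -0.5 * ((y - g * theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

        N, M = 2000, 2000                        # outer / inner Monte Carlo sizes
        theta_out = rng.standard_normal(N)       # prior N(0, 1)
        y = g * theta_out + sigma * rng.standard_normal(N)
        theta_in = rng.standard_normal(M)

        # evidence p(y_i) estimated by an inner average over fresh prior draws
        log_evidence = np.array([np.log(np.mean(np.exp(loglik(yi, theta_in))))
                                 for yi in y])
        eig = np.mean(loglik(y, theta_out) - log_evidence)
        print(eig, 0.5 * np.log(1 + g**2 / sigma**2))   # estimate vs closed form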

  1. EXPERIMENTAL RESEARCH REGARDING LEATHER APPLICATIONS IN PRODUCT DESIGN

    Directory of Open Access Journals (Sweden)

    PRALEA Jeni

    2015-05-01

    Full Text Available This paper presents the role and importance of experimental research in design activity. The designer, as a researcher and a project manager, proposes to establish a functional-aesthetic-constructive-technological-economic relationship, based on the aesthetic possibilities of the materials used for the experiments. With the aim of identifying areas for the application of leather waste resulting from the production process, the paper presents experiments conducted with this material in combination with wood, using different techniques that lead to different aesthetic effects. Identifying the areas of use and creating products from leather and/or wood waste is based on the properties of these materials. Leather, the subject of these experiments, has the advantage that it can be used on both sides. The tactile differences between the two sides of this material have both aesthetic and functional advantages, which makes it suitable for applications on products that meet the requirements of "design for all". With differentiated tactile characteristics, in combination with other materials (wood, in these experiments), products that can easily be "read" by touch can be generated to help people with certain disabilities. Thus, the experiments presented in this paper allow the establishment of aesthetic schemes applicable to products that are friendly both to the environment (based on the reuse of wood and leather waste) and to the users (they can be used as applications, accessories and product concepts for people with certain disabilities). The designer’s choices or decisions can be based on the results of this experiment. The experiment enables the designer to develop creative, innovative and environmentally friendly products.

  2. Experimental studies of tuned particle damper: Design and characterization

    Science.gov (United States)

    Zhang, Kai; Xi, Yanhui; Chen, Tianning; Ma, Zhihao

    2018-01-01

    To better suppress structural vibration in micro-vibration and harsh environments, a new type of damper, the tuned particle damper (TPD), was designed by combining the advantages of the classical dynamic vibration absorber (DVA) and the particle damper (PD). An equivalent theoretical model was established to describe the dynamic behavior of a cantilever system treated with TPD. By means of a series of sine sweep tests, the dynamic characteristics of TPD under different excitation intensities were explored, and the damping performance of TPD was investigated by comparison with a classical DVA and a PD with the same mass ratio. Experimental results show that, with increasing excitation intensity, TPD shows two different dynamic characteristics successively, i.e., PD-like and DVA-like. TPD shows a wider suppression frequency band than the classical DVA and better practicability than PD in the micro-vibration environment. Moreover, to characterize the dynamic characteristics of TPD, a simple evaluation of the equivalent dynamic mass and equivalent dynamic damping of the cantilever system treated with TPD was performed by fitting the experimental data to the presented theoretical model. Finally, based on the rheological behavior of damping particles reported in previous research, an approximate phase diagram showing the motion states of damping particles in TPD was employed to analyze the dynamic characteristics of TPD, and several motion states of damping particles in TPD were presented via a high-speed camera.

  3. Computational Design and Experimental Validation of New Thermal Barrier Systems

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Shengmin; Yang, Shizhong; Khosravi, Ebrahim

    2011-12-31

    This project (10/01/2010-9/30/2013), “Computational Design and Experimental Validation of New Thermal Barrier Systems”, originates from the Louisiana State University (LSU) Mechanical Engineering Department and the Southern University (SU) Department of Computer Science. The project directly supports the technical goals specified in DE-FOA-0000248, Topic Area 3: Turbine Materials, by addressing key technologies needed to enable the development of advanced turbines and turbine-based systems that will operate safely and efficiently using coal-derived synthesis gases. We will develop a novel molecular dynamics method to improve the efficiency of simulation of novel TBC materials; we will perform high performance computing (HPC) on complex TBC structures to screen the most promising TBC compositions; we will perform material characterizations and oxidation/corrosion tests; and we will demonstrate our new thermal barrier coating (TBC) systems experimentally under integrated gasification combined cycle (IGCC) environments. The durability of the coating will be examined using the proposed High Temperature/High Pressure Durability Test Rig under real syngas product compositions.

  4. Experimental Verification of Current Shear Design Equations for HSRC Beams

    Directory of Open Access Journals (Sweden)

    Attaullah Shah

    2012-07-01

    Full Text Available Experimental research on the shear capacity of HSRC (High Strength Reinforced Concrete) beams is relatively very limited as compared to NSRC (Normal Strength Reinforced Concrete) beams. Most building codes determine the shear strength of HSRC with the help of empirical equations based on experimental work on NSRC beams, and hence these equations are generally regarded as un-conservative for HSRC beams, particularly at low levels of longitudinal reinforcement. In this paper, 42 beams have been tested in two sets, such that in 21 beams no transverse reinforcement was used, whereas in the remaining 21 beams, minimum transverse reinforcement was used as per ACI-318 (American Concrete Institute) provisions. Two values of compressive strength (52 and 61 MPa), three values of longitudinal steel ratio and seven values of shear span to depth ratio have been used. The beams were tested under concentrated load at the mid span. The results are compared with the equations proposed by different international building codes, such as the ACI, AASHTO LRFD, EC (Euro Code), Canadian and Japanese codes, for the shear strength of HSRC beams. From the comparison, it has been observed that some codes are less conservative for the shear design of HSRC beams, and further research is required to rationalize these equations.

  5. Innovations in curriculum design: A multi-disciplinary approach to teaching statistics to undergraduate medical students

    Science.gov (United States)

    Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J

    2008-01-01

    Background Statistics is relevant to students and practitioners in medicine and the health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods The project brought together a statistician, a clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks, were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results Two cohorts of students were evaluated, one with old-style and one with new-style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval. Conclusion The redesign made teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599

  6. Statistical thinking and knowledge management for quality-driven design and manufacturing in pharmaceuticals.

    Science.gov (United States)

    Korakianiti, Evdokia; Rekkas, Dimitrios

    2011-07-01

    The purpose of this article is to present the evolution of quality principles and how they have been implemented in the pharmaceutical industry. The article discusses the challenges that the FDA PAT Guidance and the ICH Q8, Q9 and Q10 Guidelines present to industry and provides a comprehensive overview of the basic tools that can be used to effectively build quality into products. The principles of the design of experiments, the main tools for statistical process analysis and control, and the requisite culture change necessary to facilitate statistical, knowledge-based management are also addressed.

  7. Statistical Power in Experimental Audit Studies: Cautions and Calculations for Matched Tests With Nominal Outcomes

    Science.gov (United States)

    Vuolo, Mike; Uggen, Christopher; Lageson, Sarah

    2016-01-01

    Given their capacity to identify causal relationships, experimental audit studies have grown increasingly popular in the social sciences. Typically, investigators send fictitious auditors who differ by a key factor (e.g., race) to particular experimental units (e.g., employers) and then compare treatment and control groups on a dichotomous outcome…

  8. Manufacturing cereal bars with high nutritional value through experimental design

    Directory of Open Access Journals (Sweden)

    Roberta Covino

    2015-01-01

    Full Text Available Organizations responsible for public health throughout the world have been increasingly concerned with how to encourage populations to eat a nutritious and balanced diet in order to decrease the occurrence of chronic diseases, which are consistently related to an inadequate diet. At the same time, modern lifestyles lead consumers to increasingly seek convenient products. Cereal bars are thus an option for a convenient, low-calorie food that is also a source of fiber. This study aimed at developing a cereal bar high in dietary fiber, iron, and vitamins A and E, in order to help the adult population easily achieve the daily recommendations for these nutrients. Eight formulations plus the central point were prepared through experimental planning; sensory analysis with 110 tasters per block and texture analysis were carried out. Afterwards, we conducted proximate composition analysis for the three formulations presenting the best sensory results. After statistical analysis and comparison with the means for products available in the market, it was possible to conclude that the product developed showed high acceptance and a fiber content more than twice the mean of commercial products.

  9. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  10. Experimental evaluation and design of unfilled and concrete-filled FRP composite piles : Task 4A : design specifications.

    Science.gov (United States)

    2015-08-01

    The overall goal of this project is the experimental evaluation and design of unfilled and concrete-filled FRP composite piles for load-bearing in bridges. This report covers Task 4A, Design Specifications. Structural design specifications are based...

  11. Statistical efficiency and optimal design for stepped cluster studies under linear mixed effects models.

    Science.gov (United States)

    Girling, Alan J; Hemming, Karla

    2016-06-15

    In stepped cluster designs the intervention is introduced into some (or all) clusters at different times and persists until the end of the study. Instances include traditional parallel cluster designs and the more recent stepped-wedge designs. We consider the precision offered by such designs under mixed-effects models with fixed time and random subject and cluster effects (including interactions with time), and explore the optimal choice of uptake times. The results apply both to cross-sectional studies where new subjects are observed at each time-point, and longitudinal studies with repeat observations on the same subjects. The efficiency of the design is expressed in terms of a 'cluster-mean correlation' which carries information about the dependency-structure of the data, and two design coefficients which reflect the pattern of uptake-times. In cross-sectional studies the cluster-mean correlation combines information about the cluster-size and the intra-cluster correlation coefficient. A formula is given for the 'design effect' in both cross-sectional and longitudinal studies. An algorithm for optimising the choice of uptake times is described and specific results obtained for the best balanced stepped designs. In large studies we show that the best design is a hybrid mixture of parallel and stepped-wedge components, with the proportion of stepped wedge clusters equal to the cluster-mean correlation. The impact of prior uncertainty in the cluster-mean correlation is considered by simulation. Some specific hybrid designs are proposed for consideration when the cluster-mean correlation cannot be reliably estimated, using a minimax principle to ensure acceptable performance across the whole range of unknown values. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
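
    A back-of-envelope helper for the quantities in this abstract, under our reading of the cross-sectional case: the cluster-mean correlation combines the cluster-period size m and the intra-cluster correlation rho as m*rho/(1 + (m - 1)*rho), and the large-study result sets the stepped-wedge share of a hybrid design equal to it.

        def cluster_mean_correlation(m, icc):
            """m = subjects per cluster-period, icc = intra-cluster correlation."""
            return m * icc / (1 + (m - 1) * icc)

        for m, icc in [(10, 0.01), (50, 0.05), (100, 0.10)]:
            r = cluster_mean_correlation(m, icc)
            print(f"m={m}, ICC={icc}: cluster-mean correlation {r:.2f} "
                  f"-> ~{100 * r:.0f}% stepped-wedge clusters in the hybrid design")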

  12. Elicitation by design in ecology: using expert opinion to inform priors for Bayesian statistical models.

    Science.gov (United States)

    Choy, Samantha Low; O'Leary, Rebecca; Mengersen, Kerrie

    2009-01-01

    Bayesian statistical modeling has several benefits within an ecological context. In particular, when observed data are limited in sample size or representativeness, then the Bayesian framework provides a mechanism to combine observed data with other "prior" information. Prior information may be obtained from earlier studies, or in their absence, from expert knowledge. This use of the Bayesian framework reflects the scientific "learning cycle," where prior or initial estimates are updated when new data become available. In this paper we outline a framework for statistical design of expert elicitation processes for quantifying such expert knowledge, in a form suitable for input as prior information into Bayesian models. We identify six key elements: determining the purpose and motivation for using prior information; specifying the relevant expert knowledge available; formulating the statistical model; designing effective and efficient numerical encoding; managing uncertainty; and designing a practical elicitation protocol. We demonstrate this framework applies to a variety of situations, with two examples from the ecological literature and three from our experience. Analysis of these examples reveals several recurring important issues affecting practical design of elicitation in ecological problems.

  13. Quasi-experimental designs in practice-based research settings: design and implementation considerations.

    Science.gov (United States)

    Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen

    2011-01-01

    Although randomized controlled trials are often a gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that can retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations for two quasi-experimental design approaches that have considerable promise in PBR settings--the stepped-wedge design, and a variant of this design, the wait-list cross-over design--are presented along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include: randomization versus stratification; training run-in phases; and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements to improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.

  14. MicroarrayDesigner: an online search tool and repository for near-optimal microarray experimental designs

    Directory of Open Access Journals (Sweden)

    Ferhatosmanoglu Nilgun

    2009-09-01

    Full Text Available Abstract Background Dual-channel microarray experiments are commonly employed for inference of differential gene expressions across varying organisms and experimental conditions. The design of dual-channel microarray experiments that can help minimize the errors in the resulting inferences has recently received increasing attention. However, a general and scalable search tool and a corresponding database of optimal designs were still missing. Description An efficient and scalable search method for finding near-optimal dual-channel microarray designs, based on a greedy hill-climbing optimization strategy, has been developed. It is empirically shown that this method can successfully and efficiently find near-optimal designs. Additionally, an improved interwoven loop design construction algorithm has been developed to provide an easily computable general class of near-optimal designs. Finally, in order to make the best results readily available to biologists, a continuously evolving catalog of near-optimal designs is provided. Conclusion A new search algorithm and database for near-optimal microarray designs have been developed. The search tool and the database are accessible via the World Wide Web at http://db.cse.ohio-state.edu/MicroarrayDesigner. Source code and binary distributions are available for academic use upon request.
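
    The greedy hill-climbing strategy itself is generic; the toy sketch below is a stand-in (the tool's actual scoring of dual-channel designs is far richer), climbing over hypothetical bit-vector designs with a made-up score.

        import random

        def hill_climb(design, neighbors, score, iters=500):
            """Greedy local search: accept a random neighbor only if it scores higher."""
            best, best_s = design, score(design)
            for _ in range(iters):
                cand = random.choice(neighbors(best))
                if score(cand) > best_s:
                    best, best_s = cand, score(cand)
            return best, best_s

        def neighbors(d):                        # flip one bit at a time
            return [d[:i] + (1 - d[i],) + d[i + 1:] for i in range(len(d))]

        start = tuple(random.choice([0, 1]) for _ in range(8))
        print(hill_climb(start, neighbors, lambda d: -abs(sum(d) - 4)))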

  15. Tabletop Games: Platforms, Experimental Games and Design Recommendations

    Science.gov (United States)

    Haller, Michael; Forlines, Clifton; Koeffel, Christina; Leitner, Jakob; Shen, Chia

    While the last decade has seen massive improvements in not only the rendering quality, but also the overall performance of console and desktop video games, these improvements have not necessarily led to a greater population of video game players. In addition to continuing these improvements, the video game industry is also constantly searching for new ways to convert non-players into dedicated gamers. Despite the growing popularity of computer-based video games, people still love to play traditional board games, such as Risk, Monopoly, and Trivial Pursuit. Both video and board games have their strengths and weaknesses, and an intriguing conclusion is to merge both worlds. We believe that a tabletop form-factor provides an ideal interface for digital board games. The design and implementation of tabletop games will be influenced by the hardware platforms, form factors, sensing technologies, as well as input techniques and devices that are available and chosen. This chapter is divided into three major sections. In the first section, we describe the most recent tabletop hardware technologies that have been used by tabletop researchers and practitioners. In the second section, we discuss a set of experimental tabletop games. The third section presents ten evaluation heuristics for tabletop game design.

  16. Experimental Reality: Principles for the Design of Augmented Environments

    Science.gov (United States)

    Lahlou, Saadi

    The Laboratory of Design for Cognition at EDF R&D (LDC) is a living laboratory, which we created to develop Augmented Environment (AE) for collaborative work, more specifically “cognitive work” (white collars, engineers, office workers). It is a corporate laboratory in a large industry, where natural activity of real users is observed in a continuous manner in various spaces (project space, meeting room, lounge, etc.) The RAO room, an augmented meeting room, is used daily for “normal” meetings; it is also the “mother room” of all augmented meeting rooms in the company, where new systems, services, and devices are tested. The LDC has gathered a unique set of data on the use of AE, and developed various observation and design techniques, described in this chapter. LDC uses novel techniques of digital ethnography, some of which were invented there (SubCam, offsat) and some of which were developed elsewhere and adapted (360° video, WebDiver, etc.). At LDC, some new theories have also been developed to explain behavior and guide innovation: cognitive attractors, experimental reality, and the triple-determination framework.

  17. Experimental investigation of design parameters on dry powder inhaler performance.

    Science.gov (United States)

    Ngoc, Nguyen Thi Quynh; Chang, Lusi; Jia, Xinli; Lau, Raymond

    2013-11-30

    The study aims to investigate the impact of various design parameters of a dry powder inhaler on the turbulence intensities generated and the performance of the dry powder inhaler. The flow fields and turbulence intensities in the dry powder inhaler are measured using particle image velocimetry (PIV) techniques. In vitro aerosolization and deposition of a blend of budesonide and lactose are measured using an Andersen Cascade Impactor. Design parameters such as inhaler grid hole diameter, grid voidage and chamber length are considered. The experimental results reveal that the hole diameter on the grid has negligible impact on the turbulence intensity generated in the chamber. On the other hand, hole diameters smaller than a critical size can lead to performance degradation due to excessive particle-grid collisions. An increase in grid voidage can improve the inhaler performance but the effect diminishes at high grid voidage. An increase in the chamber length can enhance the turbulence intensity generated but also increases the powder adhesion on the inhaler wall. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Numerical and experimental design of coaxial shallow geothermal energy systems

    Science.gov (United States)

    Raghavan, Niranjan

    Geothermal energy has emerged as one of the front runners in the energy race because of its performance efficiency, abundance and production competitiveness. Today, geothermal energy is used in many regions of the world as a sustainable solution for decreasing dependence on fossil fuels and reducing health hazards. However, projects related to geothermal energy have not received their deserved recognition due to the lack of computational tools associated with them and economic misconceptions related to their installation and functioning. This research focuses on numerical and experimental system design analysis of vertical shallow geothermal energy systems. The driving force is the temperature difference between a finite depth beneath the earth and its surface, which stimulates a continuous exchange of thermal energy from the sub-surface to the surface (a geothermal gradient is set up). This heat gradient is captured by the circulating refrigerant, thus tapping the geothermal energy from shallow depths. Traditionally, U-bend systems, which consist of two one-inch pipes with a U-bend connector at the bottom, have been widely used in geothermal applications. Alternative systems include coaxial pipes (pipe-in-pipe) that are the main focus of this research. Previous studies have shown that coaxial pipes have significantly higher thermal performance characteristics than U-bend pipes, with comparable production and installation costs. This makes them a viable design upgrade to traditional piping systems. Analytical and numerical heat transfer analysis of the coaxial system is carried out with the help of ABAQUS software. It is tested by varying independent parameters such as materials, soil conditions and the effect of thermal contact conductance on heat transfer characteristics. With the above information, this research aims at formulating a preliminary theoretical design setup for an experimental study to quantify and compare the heat transfer characteristics of U-bend and coaxial

  19. System design overview of JAXA small supersonic experimental airplane (NEXST-1)

    OpenAIRE

    Takami, Hikaru; 高見 光

    2007-01-01

    The system of the JAXA small supersonic experimental airplane (NEXST-1: National EXperimental Supersonic Transport-1) is briefly explained. Some design problems that the designers encountered are also briefly explained.

  20. Estimating Intervention Effects across Different Types of Single-Subject Experimental Designs: Empirical Illustration

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M.; Onghena, Patrick; Heyvaert, Mieke; Beretvas, S. Natasha; Van den Noortgate, Wim

    2015-01-01

    The purpose of this study is to illustrate the multilevel meta-analysis of results from single-subject experimental designs of different types, including AB phase designs, multiple-baseline designs, ABAB reversal designs, and alternating treatment designs. Current methodological work on the meta-analysis of single-subject experimental designs…

  1. White Noise Assumptions Revisited : Regression Models and Statistical Designs for Simulation Practice

    OpenAIRE

    Kleijnen, J.P.C.

    2006-01-01

    Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these classic assumptions in simulation practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions hold? (i...

  2. A statistical design for testing transgenerational genomic imprinting in natural human populations.

    Directory of Open Access Journals (Sweden)

    Yao Li

    Full Text Available Genomic imprinting is a phenomenon in which the same allele is expressed differently, depending on its parental origin. Such a phenomenon, also called the parent-of-origin effect, has been recognized to play a pivotal role in embryological development and pathogenesis in many species. Here we propose a statistical design for detecting imprinted loci that control quantitative traits based on a random set of three-generation families from a natural human population. This design provides a pathway for characterizing the effects of imprinted genes on a complex trait or disease at different generations and testing transgenerational changes of imprinted effects. The design is integrated with population and cytogenetic principles of gene segregation and transmission from one generation to the next. The implementation of the EM algorithm within the design framework leads to the estimation of genetic parameters that define imprinted effects. A simulation study is used to investigate the statistical properties of the model and validate its utilization. This new design, coupled with increasingly used genome-wide association studies, should have immediate implications for studying the genetic architecture of complex traits in humans.

  3. Statistical Issues in the Design and Analysis of nCounter Projects.

    Science.gov (United States)

    Jung, Sin-Ho; Sohn, Insuk

    2014-01-01

    Numerous statistical methods have been published for designing and analyzing microarray projects. Traditional genome-wide microarray platforms (such as Affymetrix, Illumina, and DASL) measure the expression level of tens of thousands of genes. Since the sets of genes included in these array chips are selected by the manufacturers, the number of genes associated with a specific disease outcome is limited and a large portion of the genes are not associated. nCounter is a new technology by NanoString to measure the expression of a selected number (up to 800) of genes. The list of genes for nCounter chips can be selected by customers. Because the number of genes is limited and the price increases with the number of selected genes, the genes for nCounter chips are carefully selected from among those discovered in previous studies, usually using traditional high-throughput platforms, and only a small number of definitely unassociated genes, called control genes, are included to standardize the overall expression level across different chips. Furthermore, nCounter chips measure the expression level of each gene using a counting observation, while the traditional high-throughput platforms produce continuous observations. Due to these differences, some statistical methods developed for the design and analysis of high-throughput projects may need modification or may be inappropriate for nCounter projects. In this paper, we discuss statistical methods that can be used for designing and analyzing nCounter projects.

  4. Statistical Metamodeling and Sequential Design of Computer Experiments to Model Glyco-Altered Gating of Sodium Channels in Cardiac Myocytes.

    Science.gov (United States)

    Du, Dongping; Yang, Hui; Ednie, Andrew R; Bennett, Eric S

    2016-09-01

    Glycan structures account for up to 35% of the mass of cardiac sodium (Nav) channels. To question whether and how reduced sialylation affects Nav activity and cardiac electrical signaling, we conducted a series of in vitro experiments on ventricular apex myocytes under two different glycosylation conditions, reduced protein sialylation (ST3Gal4(-/-)) and full glycosylation (control). Although aberrant electrical signaling is observed under reduced sialylation, realizing a better understanding of the mechanistic details of pathological variations in INa and AP is difficult without performing in silico studies. However, computer models of Nav channels and cardiac myocytes involve greater levels of complexity, e.g., a high-dimensional parameter space and nonlinear, nonconvex equations. Traditional linear and nonlinear optimization methods have encountered many difficulties for model calibration. This paper presents a new statistical metamodeling approach for efficient computer experiments and optimization of Nav models. First, we utilize a fractional factorial design to identify control variables from the large set of model parameters, thereby reducing the dimensionality of the parametric space. Further, we develop a Gaussian process model as a surrogate for the expensive and time-consuming computer models and then identify the next best design point that yields the maximal probability of improvement. This process iterates until convergence, and the performance is evaluated and validated with real-world experimental data. Experimental results show the proposed algorithm achieves superior performance in modeling the kinetics of Nav channels under a variety of glycosylation conditions. As a result, in silico models provide a better understanding of glyco-altered mechanistic details in state transitions and distributions of Nav channels. Notably, ST3Gal4(-/-) myocytes are shown to have higher probabilities accumulated in intermediate inactivation during the repolarization and yield a
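
    As an illustration of the loop described here (fit a Gaussian-process surrogate, then evaluate the expensive model where the probability of improvement is largest), the sketch below uses scikit-learn on a cheap stand-in objective; the Nav-channel model itself is of course far more costly to evaluate.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def objective(x):                        # stand-in for the expensive simulator
            return np.sin(3 * x) + 0.5 * x

        rng = np.random.default_rng(1)
        X = rng.uniform(0, 3, 5).reshape(-1, 1)  # initial design points
        y = objective(X).ravel()

        for _ in range(10):
            gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-6,
                                          normalize_y=True).fit(X, y)
            grid = np.linspace(0, 3, 200).reshape(-1, 1)
            mu, sd = gp.predict(grid, return_std=True)
            pi = norm.cdf((y.min() - mu) / np.maximum(sd, 1e-9))  # P(improvement), minimizing
            x_next = grid[np.argmax(pi)]
            X = np.vstack([X, [x_next]])
            y = np.append(y, objective(x_next))

        print(X[np.argmin(y)], y.min())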

  5. Statistical issues in the design, conduct and analysis of two large safety studies.

    Science.gov (United States)

    Gaffney, Michael

    2016-10-01

    The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.

  6. Experimental design for efficient identification of gene regulatory networks using sparse Bayesian models

    Directory of Open Access Journals (Sweden)

    Tsuda Koji

    2007-11-01

    Full Text Available Abstract Background Identifying large gene regulatory networks is an important task, while the acquisition of data through perturbation experiments (e.g., gene switches, RNAi, heterozygotes) is expensive. It is thus desirable to use an identification method that effectively incorporates available prior knowledge – such as sparse connectivity – and that allows one to design experiments such that maximal information is gained from each one. Results Our main contributions are twofold: a method for consistent inference of network structure is provided, incorporating prior knowledge about sparse connectivity. The algorithm is time efficient and robust to violations of model assumptions. Moreover, we show how to use it for optimal experimental design, reducing the number of required experiments substantially. We employ sparse linear models, and show how to perform full Bayesian inference for these. We not only estimate a single maximum likelihood network, but compute a posterior distribution over networks, using a novel variant of the expectation propagation method. The representation of uncertainty enables us to do effective experimental design in a standard statistical setting: experiments are selected such that they are maximally informative. Conclusion Few methods have addressed the design issue so far. Compared to the most well-known one, our method is more transparent and is shown to perform qualitatively better. In the former, hard and unrealistic constraints have to be placed on the network structure for mere computational tractability, while none are required in our method. We demonstrate reconstruction and optimal experimental design capabilities on tasks generated from realistic non-linear network simulators. The methods described in the paper are available as a Matlab package at http://www.kyb.tuebingen.mpg.de/sparselinearmodel.

  7. Experimental design strategy a part of an innovative construction industry

    NARCIS (Netherlands)

    Maarten Gjaltema; Rogier Laterveer; Dr Ruben Vrijhoef

    2013-01-01

    From the article: Abstract. This exploratory and conceptual article sets out to research what arguments and possibilities for experimentation in construction exist and whether experimentation can contribute towards more innovative construction as a whole. Traditional, -western- construction is very

  8. Experimental Design Strategy As Part of an Innovative Construction Industry

    NARCIS (Netherlands)

    Rogier Laterveer

    2013-01-01

    This exploratory and conceptual article sets out to research what arguments and possibilities for experimentation in construction exist and whether experimentation can contribute towards more innovative construction as a whole. Traditional, -western- construction is very conservative and regional, often

  9. Application of Plackett-Burman experimental design and Doehlert design to evaluate nutritional requirements for xylanase production by Alternaria mali ND-16.

    Science.gov (United States)

    Li, Yin; Liu, Zhiqiang; Cui, Fengjie; Liu, Zhisheng; Zhao, Hui

    2007-11-01

    The objective of this study was to use statistically based experimental designs for the optimization of xylanase production from Alternaria mali ND-16. Ten components in the medium were screened for nutritional requirements. Three nutritional components, including NH4Cl, urea, and MgSO4, were identified to significantly affect the xylanase production by using the Plackett-Burman experimental design. These three major components were subsequently optimized using the Doehlert experimental design. By using response surface methodology and canonical analysis, the optimal concentrations for xylanase production were: NH4Cl 11.34 g/L, urea 1.26 g/L, and MgSO4 0.98 g/L. Under these optimal conditions, the xylanase activity from A. mali ND-16 reached 30.35 U/mL. Verification of the optimization showed that xylanase production of 31.26 U/mL was achieved.
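
    For readers unfamiliar with Plackett-Burman screening, the 12-run matrix can be written down from the classic cyclic generator; the construction below is the textbook one (not the authors' software), with placeholder responses.

        import numpy as np

        gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])   # classic generator row
        design = np.vstack([np.roll(gen, i) for i in range(11)]
                           + [-np.ones(11, dtype=int)])           # 12 runs x 11 columns

        def main_effects(y):
            """Effect = mean(response at +1) - mean(response at -1), per column."""
            return design.T @ y / 6.0          # six +1s and six -1s in every column

        y = np.random.default_rng(0).normal(20, 3, 12)            # placeholder activities
        print(design.shape, main_effects(y).round(2))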

  10. Phosphate removal from water by fly ash: factorial experimental design.

    Science.gov (United States)

    Can, Mevra Yalvac; Yildiz, Ergun

    2006-07-31

    The influence of three variables (phosphate concentration, initial pH of solution (pH0), and fly ash dosage) on the removal efficiency of phosphate (%E) and the equilibrium pH of solution (pHeq) using fly ash was studied by means of a 2^3 full factorial experimental design. The parameters, coded as x1, x2, and x3 respectively, were investigated at two levels (-1 and +1). The effects of these factors on the dependent variables, namely %E and pHeq, were investigated. To determine the significance of effects, analysis of variance with 95% confidence limits was used. The best %E and corresponding pHeq obtained in this study were found to be 99.6% and 11.16, corresponding to operating conditions of 25 mg/L, 2 g/L and 5.5 for the phosphate concentration, fly ash dosage and pH0, respectively.
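
    A 2^3 full factorial is small enough to write out directly; the sketch below rebuilds the eight coded runs and the main-effect contrasts, with fabricated responses standing in for the measured removal efficiencies.

        import itertools
        import numpy as np

        design = np.array(list(itertools.product([-1, 1], repeat=3)))  # x1, x2, x3
        y = np.array([62.1, 80.3, 70.5, 99.6, 55.2, 75.8, 64.0, 95.1]) # % E, invented

        for j, name in enumerate(["phosphate conc.", "pH0", "fly ash dosage"]):
            effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
            print(f"main effect of {name}: {effect:+.1f} % E")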

  11. Tokamak experimental power reactor conceptual design. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    1976-08-01

    A conceptual design has been developed for a tokamak Experimental Power Reactor to operate at net electrical power conditions with a plant capacity factor of 50 percent for 10 years. The EPR operates in a pulsed mode at a frequency of approximately 1/min., with an approximate 75 percent duty cycle, is capable of producing approximately 72 MWe and requires 42 MWe. The annual tritium consumption is 16 kg. The EPR vacuum chamber is 6.25 m in major radius and 2.4 m in minor radius, is constructed of 2-cm thick stainless steel, and has 2-cm thick detachable, beryllium-coated coolant panels mounted on the interior. An 0.28 m stainless steel blanket and a shield ranging from 0.6 to 1.0 m surround the vacuum vessel. The coolant is H2O. Sixteen niobium-titanium superconducting toroidal-field coils provide a field of 10 T at the coil and 4.47 T at the plasma. Superconducting ohmic-heating and equilibrium-field coils provide 135 V-s to drive the plasma current. Plasma heating is accomplished by 12 neutral beam-injectors, which provide 60 MW. The energy transfer and storage system consists of a central superconducting storage ring, a homopolar energy storage unit, and a variety of inductor-converters.

  12. Application of statistical design of experiments for the optimization of floor tile glaze formulation

    Directory of Open Access Journals (Sweden)

    Sousan Rasouli

    2017-06-01

    Full Text Available This paper investigates the influence of various parameters on the quality of industrial Iranian floor tile glaze by using experimental design analysis (the Taguchi method). A commercial grade of engobe and green bodies from one of the national tile companies were used. Three factors, namely particle size of the glaze slurry, sintering time, and sintering temperature, were selected to identify their influence on the quality of the glaze. A Taguchi L8 orthogonal array, a fractional factorial design, was used to optimize the experimental trials. This approach successfully categorized the effect of each variable using only 8 experimental trials and identified the most important variables affecting this glaze-making process with the analysis of variance (ANOVA). The quality control tests of floor tile, such as thermal shock resistance, specking resistance and surface hardness, were carried out according to the existing Iranian standard. The optimized samples were obtained by factorial design analysis, taking into account coarser particle size, higher temperature and shorter time. The optimized sample was cross-checked using the Taguchi method by selecting the effective factors at their high levels. It was demonstrated that the particle size of the slurry is the only significant parameter, and that the sample with high levels of particle size and sintering temperature is the best sample according to the existing floor tile standard.
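
    One standard construction of the L8 orthogonal array (assumed here; the paper does not state its column assignment) takes the seven columns A, B, AB, C, AC, BC, ABC of a 2^3 full factorial; the assertion checks mutual orthogonality.

        import itertools
        import numpy as np

        base = np.array(list(itertools.product([-1, 1], repeat=3)))  # A, B, C
        A, B, C = base.T
        L8 = np.column_stack([A, B, A * B, C, A * C, B * C, A * B * C])
        assert (L8.T @ L8 == 8 * np.eye(7)).all()   # columns mutually orthogonal
        print(L8)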

  13. Application of Statistical Design to the Optimization of Culture Medium for Prodigiosin Production by Serratia marcescens SWML08

    Directory of Open Access Journals (Sweden)

    Venil, C. K.

    2009-01-01

    Full Text Available A combination of the Plackett-Burman design (PBD) and the Box-Behnken design (BBD) was applied for optimization of different factors for prodigiosin production by Serratia marcescens SWML08. Among 11 factors, incubation temperature and supplementation of (NH4)2PO4 and trace salts into the culture medium were selected due to their significant positive effect on prodigiosin yield. The Box-Behnken design, a response surface methodology, was used for further optimization of these selected factors for better prodigiosin output. Data were analyzed stepwise and a second-order polynomial model was established to identify the relationship between the prodigiosin output and the selected factors. The optimized media formulation had incubation temperature 30 °C, (NH4)2PO4 6 g/L and trace salts 0.6 g/L. The maximum experimental response for prodigiosin production was 1397.96 mg/L whereas the predicted value was 1394.26 mg/L. The high correlation between the predicted and observed values indicated the validity of the statistical design.
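
    The response-surface step fits a second-order polynomial; the two-factor sketch below uses placeholder data (not the paper's measurements) and then solves for the stationary point of the fitted quadratic.

        import numpy as np

        x1 = np.array([-1, 1, -1, 1, 0, 0, 0, -1, 1])   # coded temperature (assumed)
        x2 = np.array([-1, -1, 1, 1, 0, -1, 1, 0, 0])   # coded (NH4)2PO4 level
        y = np.array([820, 950, 900, 1100, 1390, 1010, 1120, 880, 1050])  # mg/L, invented

        # model columns: 1, x1, x2, x1*x2, x1^2, x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        H = np.array([[2 * beta[4], beta[3]],
                      [beta[3], 2 * beta[5]]])          # Hessian of the fitted surface
        print(beta.round(1), np.linalg.solve(H, -beta[1:3]))  # coefficients, optimum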

  14. Batch phenol biodegradation study and application of factorial experimental design

    Directory of Open Access Journals (Sweden)

    A. Hellal

    2010-01-01

    Full Text Available A bacterium, Pseudomonas aeruginosa (ATCC 27853), was investigated for its ability to grow and to degrade phenol as sole carbon source in aerobic batch culture. The parameters which affect the substrate biodegradation, such as the adaptation of the bacteria to phenol, the temperature, and the state of the bacteria, were investigated. The results show that, for a temperature range of 30 to 40 °C, the best degradation of phenol at a concentration of 100 mg/L was observed at 30 °C. Regeneration of the bacterium, which allows the reactivation of its enzymatic activity, shows that the degradation of 100 mg/L of substrate at 30 °C required approximately 50 hours with revivified bacteria, while it only started after 72 hours for those not revivified. Adaptation to increasing concentrations allows the bacteria to degrade a substrate concentration of about 400 mg/L in less than 350 hours. A second part consisted of the determination of a substrate degradation model using the factorial experimental design, as a function of temperature (30-40 °C) and of the size of the inoculum (260.88-521.76 mg/L). The results were analyzed statistically using the Student's t-test, analysis of variance, and the F-test. The values of R² (0.99872) and adjusted R² (0.9962), close to 1.0, verify the good correlation between the observed and the predicted values, and indicate an excellent relationship between the independent variables (factors) and the response (the time of phenol degradation). The F-value, found above 200, indicates that the considered model is statistically significant.

  15. Statistical optimization of dithranol-loaded solid lipid nanoparticles using factorial design

    Directory of Open Access Journals (Sweden)

    Makarand Suresh Gambhire

    2011-09-01

    Full Text Available This study describes a 3² full factorial experimental design to optimize the formulation of dithranol (DTH)-loaded solid lipid nanoparticles (SLN) by the pre-emulsion ultrasonication method. The variables drug:lipid ratio and sonication time were studied at three levels and arranged in a 3² factorial design to study their influence on the response variables particle size and entrapment efficiency (%EE). Polynomial equations were generated from the statistical analysis of the data. The particle size and %EE for the 9 batches (R1 to R9) showed a wide variation of 219-348 nm and 51.33-71.80%, respectively. The physical characteristics of DTH-loaded SLN were evaluated using a particle size analyzer, differential scanning calorimetry and X-ray diffraction. The results of the optimized formulation showed an average particle size of 219 nm and an entrapment efficiency of 69.88%. Ex-vivo drug penetration using rat skin showed about a 2-fold increase in localization of DTH in skin as compared to the marketed preparation of DTH.

  16. Approach toward enhancement of halophilic protease production by Halobacterium sp. strain LBU50301 using statistical design response surface methodology

    Directory of Open Access Journals (Sweden)

    Julalak Chuprom

    2016-06-01

    Full Text Available A new potent halophilic protease producer, Halobacterium sp. strain LBU50301, was isolated from salt-fermented fish samples (budu) and identified by phenotypic analysis and 16S rDNA gene sequencing. Thereafter, a sequential statistical strategy was used to optimize halophilic protease production from Halobacterium sp. strain LBU50301 by shake-flask fermentation. The classical one-factor-at-a-time (OFAT) approach determined that gelatin was the best nitrogen source. Based on the Plackett-Burman (PB) experimental design, gelatin, MgSO4·7H2O, NaCl and pH significantly influenced the halophilic protease production. A central composite design (CCD) determined the optimum levels of the medium components. Subsequently, an 8.78-fold increase in the corresponding halophilic protease yield (156.22 U/mL) was obtained, compared with that produced in the original medium (17.80 U/mL). Validation experiments proved the adequacy and accuracy of the model, and the results showed the predicted values agreed well with the experimental values. An overall 13-fold increase in halophilic protease yield was achieved using a 3 L laboratory fermenter and the optimized medium (231.33 U/mL).

  17. Optimization of Conversion Treatment on Austenitic Stainless Steel Using Experimental Designs

    Directory of Open Access Journals (Sweden)

    S. El Hajjaji

    2013-01-01

    Full Text Available Conversion coating is commonly used as a treatment to improve the adherence of ceramic films. The conversion coating properties depend on the structure of the alloy as well as on the treatment parameters. These conversion coatings must be characterized by strong interfacial adhesion, high roughness, and high real surface area, which were measured by an electrochemical method. The influence of all the elaboration factors (temperature, time, bath composition: sulphuric acid, thiosulphate as accelerator, propargyl alcohol as inhibitor, and surface state), as well as the interactions between these factors, was evaluated using statistical experimental design. The specific surface area and an optical factor (α) correspond to the quantitative responses. The designed experimental procedure showed that the most important factor was "surface state": a sanded surface allows the formation of a conversion coating with high real surface area. A further aim was to optimize two parameters, treatment time and temperature, using a Doehlert shell design and the simplex method. The growth of the conversion coating is also influenced by treatment time and temperature. With such optimized conditions, the real surface area of the conversion coating obtained was about 235 m²/m².
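
    As a small illustration of the simplex step, the snippet below runs a Nelder-Mead search over treatment time and temperature against a made-up quadratic response surface; the surface, its optimum, and the starting point are assumptions for demonstration only.

```python
# A minimal sketch of simplex optimization over treatment time and
# temperature, in the spirit of the optimization step above. The response
# surface here is a hypothetical stand-in, not the paper's model.
import numpy as np
from scipy.optimize import minimize

def neg_surface_area(x):
    time_min, temp_c = x
    # Hypothetical quadratic response with an optimum near (20 min, 60 C).
    return -(235 - 0.08 * (time_min - 20) ** 2 - 0.05 * (temp_c - 60) ** 2)

result = minimize(neg_surface_area, x0=[10.0, 40.0], method="Nelder-Mead")
print("optimal time/temperature:", np.round(result.x, 1))
print("predicted real surface area:", round(-result.fun, 1), "m2/m2")
```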

  18. Experimental study of the statistics of slipping events in solid friction

    Directory of Open Access Journals (Sweden)

    S. Ciliberto

    1995-01-01

    Full Text Available The statistics of stick-slip motion are studied in two experiments, in which elasticity is distributed either on a surface or in a volume. The rough surface is realised by embedding steel spheres in an elastic substrate, whereas the volume is constituted by several layers of rubber spheres. Both systems produce a complex dynamics characterized by power laws with non-trivial exponents in the distribution of the amplitudes of the slipping events and in the power spectra of the friction force time evolution. The dependence of the results on the system size is also studied.
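
    A common way to quantify such slipping-event statistics is a maximum-likelihood fit of the power-law exponent; the sketch below applies the continuous MLE (in the style popularized by Clauset and co-workers) to synthetic Pareto-distributed amplitudes, since the experimental data are not reproduced here.

```python
# A minimal sketch of estimating a power-law exponent from slip-event
# amplitudes via the continuous maximum-likelihood estimator
# alpha = 1 + n / sum(ln(x / xmin)); the event data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
xmin, alpha_true = 1.0, 2.5
# Inverse-transform sampling of a Pareto (power-law) distribution.
events = xmin * (1 - rng.random(5000)) ** (-1 / (alpha_true - 1))

alpha_hat = 1 + len(events) / np.sum(np.log(events / xmin))
print(f"estimated exponent: {alpha_hat:.2f} (true {alpha_true})")
```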

  19. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism.

    Science.gov (United States)

    Vesterinen, Hanna M; Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-04-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, a statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, or exclusion and inclusion criteria). Many studies lacked sufficient information regarding methods and results to permit a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or the information needed to assess appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research: Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, improved training in proper study design and analysis, and the adoption by reviewers and editors of a more constructively critical approach in the assessment of manuscripts for publication.

  20. Review of research designs and statistical methods employed in dental postgraduate dissertations.

    Science.gov (United States)

    Shirahatti, Ravi V; Hegde-Shetiya, Sahana

    2015-01-01

    There is a need to evaluate the quality of postgraduate dissertations of dentistry submitted to the university in the light of international reporting standards. We conducted the review with the objective of documenting the use of sampling methods, measurement standardization, blinding, methods to eliminate bias, appropriate use of statistical tests, and appropriate data presentation in postgraduate dental research, and of suggesting and recommending modifications. The public-access database of dissertations from Rajiv Gandhi University of Health Sciences was reviewed. Three hundred and thirty-three eligible dissertations underwent preliminary evaluation, followed by detailed evaluation of 10% of randomly selected dissertations. The dissertations were assessed against international reporting guidelines such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE), Consolidated Standards of Reporting Trials (CONSORT), and other scholarly resources. The data were compiled using MS Excel and SPSS 10.0. Numbers and percentages were used to describe the data. "In vitro" studies were the most common type of research (39%), followed by observational (32%) and experimental studies (29%). The disciplines conservative dentistry (92%) and prosthodontics (75%) reported high numbers of in vitro research. The disciplines oral surgery (80%) and periodontics (67%) had conducted experimental studies as the major share of their research. Lacunae in the studies included observational studies not following random sampling (70%), experimental studies not following random allocation (75%), no mention of blinding, confounding variables or calibration of measurements, misrepresentation of data by inappropriate presentation, errors in reporting probability values, and failure to report confidence intervals. A few studies showed grossly inappropriate choices of statistical tests, and many studies needed additional tests. Overall observations indicated the need to

  1. Statistical inferences for data from studies conducted with an aggregated multivariate outcome-dependent sample design.

    Science.gov (United States)

    Lu, Tsui-Shan; Longnecker, Matthew P; Zhou, Haibo

    2017-03-15

    Outcome-dependent sampling (ODS) is a cost-effective sampling scheme in which one observes the exposure with a probability that depends on the outcome. Well-known such designs are the case-control design for a binary response, the case-cohort design for failure-time data, and the general ODS design for a continuous response. While substantial work has been carried out for the univariate response case, statistical inference and design for ODS with multivariate cases remain under-developed. Motivated by the need in biological studies to take advantage of the available responses for subjects in a cluster, we propose a multivariate outcome-dependent sampling (multivariate-ODS) design that is based on a general selection of the continuous responses within a cluster. The proposed inference procedure for the multivariate-ODS design is semiparametric, with all the underlying distributions of covariates modeled nonparametrically using empirical likelihood methods. We show that the proposed estimator is consistent and develop its asymptotic normality properties. Simulation studies show that the proposed estimator is more efficient than the estimator obtained using only the simple-random-sample portion of the multivariate-ODS or the estimator from a simple random sample with the same sample size. The multivariate-ODS design, together with the proposed estimator, provides an approach to further improve study efficiency for a given fixed study budget. We illustrate the proposed design and estimator with an analysis of the association of polychlorinated biphenyl exposure with hearing loss in children born into the Collaborative Perinatal Study. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Experimental design issues for the early detection of disease: novel designs.

    Science.gov (United States)

    Hu, Ping; Zelen, Marvin

    2002-09-01

    This paper investigates two experimental designs which have been used to evaluate the benefit of the early detection of breast cancer. They have an advantage over a classical design (screening program versus usual medical care) in that subjects in the control group may benefit by participating in the study. We refer to the two experimental designs as the up-front design (UFD) and the close-out design (COD). The UFD consists of offering an initial exam to all participants, who are then randomized to a usual care group or a screening group receiving one or more special examinations. If the outcome of the initial examination is included in the analysis, the study can answer the question of the benefit of an additional screening program after an initial examination. If the analysis excludes all cases diagnosed at the initial examination, then it evaluates the benefit of a screening program after elimination of the prevalent cases. These prevalent cases are most likely to be affected by length-biased sampling and consequently will tend to have less aggressive disease and live longer. As a result, the UFD can answer two scientific questions. The COD consists of randomizing subjects to a usual care group and a screened group; however, the usual care group receives an examination which coincides in time with the last exam in the study group. In this paper the power of these two designs has been evaluated. In both cases the power is severely reduced compared with a usual control group receiving no special exams. The power is a function of the sensitivity of the exam, the number and spacing of the exams given to the screened group, as well as the sample size, the disease incidence of the population and the survival distribution. The theoretical results on power are applied to the Canadian National Breast Cancer Study (ages 40-49), which used a UFD, and the Stockholm Mammography Breast Cancer Screening Trial, which utilized a COD.

  3. Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology

    Directory of Open Access Journals (Sweden)

    Donald Laming

    2010-04-01

    Full Text Available This paper presents, first, a formal exploration of the relationships between information (statistically defined), statistical hypothesis testing, the use of hypothesis testing in reverse as an investigative tool, channel capacity in a communication system, uncertainty, the concept of entropy in thermodynamics, and Bayes' theorem. This exercise brings out the close mathematical interrelationships between different applications of these ideas in diverse areas of psychology. Subsequent illustrative examples are grouped under (a) the human operator as an ideal communications channel, (b) the human operator as a purely physical system, and (c) Bayes' theorem as an algorithm for combining information from different sources. Some tentative conclusions are drawn about the usefulness of information theory within these different categories. (a) The idea of the human operator as an ideal communications channel has long been abandoned, though it provides some lessons that still need to be absorbed today. (b) Treating the human operator as a purely physical system provides a platform for the quantitative exploration of many aspects of human performance by analogy with the analysis of other physical systems. (c) The use of Bayes' theorem to calculate the effects of prior probabilities and stimulus frequencies on human performance is probably misconceived, but it is difficult to obtain results precise enough to resolve this question.
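
    As a minimal worked example of category (c), the snippet below applies Bayes' theorem to combine a prior signal probability with an observer's hit and false-alarm rates; all three numbers are illustrative assumptions.

```python
# A minimal worked example of Bayes' theorem as an algorithm for combining
# a prior probability with stimulus information. The numbers are
# illustrative, not taken from the paper.
prior_signal = 0.10           # prior probability a signal is present
p_report_given_signal = 0.80  # hit rate of the observer/channel
p_report_given_noise = 0.15   # false-alarm rate

evidence = (p_report_given_signal * prior_signal
            + p_report_given_noise * (1 - prior_signal))
posterior = p_report_given_signal * prior_signal / evidence
print(f"P(signal | report) = {posterior:.3f}")  # ~0.372
```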

  4. Experimental Design on Laminated Veneer Lumber Fiber Composite: Surface Enhancement

    Science.gov (United States)

    Meekum, U.; Mingmongkol, Y.

    2010-06-01

    Thick laminated veneer lumber (LVL) fibre-reinforced composites were constructed from alternating, perpendicularly arrayed layers of peeled rubber wood. Woven glass fabric was laid between the layers, and native golden teak veneers were used as faces. An in-house epoxy formulation was employed as the wood adhesive. The hand lay-up laminate was cured at 150 °C for 45 min, and the cut specimens were post-cured at 80 °C for at least 5 hours. A 2^k factorial design of experiments (DOE) was used to verify the parameters. Three parameters were analysed: silane content in the epoxy formulation (A), smoke treatment of the rubber wood surface (B), and anti-termite application to the wood surface (C). Both the low and high levels were further subcategorised into 2 sub-levels. Flexural properties were the main response obtained, and ANOVA with a Pareto chart was employed; main effect plots were also examined. The results showed that the interaction between silane quantity and termite treatment had a negative effect at the high level (AC+); conversely, the interaction between silane and smoke treatment had a significant positive effect at the high level (AB+). According to this research work, the optimal settings to improve surface adhesion, and hence enhance flexural properties, were a high level of silane quantity (15% by weight), a high level of smoked wood layers (8 out of 14 layers), and wood with low anti-termite application. Further tests also revealed that the LVL composite had properties superior to the solid woods, though slightly inferior flexibility. The screw withdrawal strength of the LVL was higher than that of solid wood, and the composite also resisted moisture and termite attack better than the rubber wood.
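
    To make the effect estimates concrete, the sketch below computes main and two-factor interaction effects from a 2³ design with factors A, B and C as above; the eight flexural-strength responses are hypothetical, so the signs of the effects will not match the paper's findings exactly.

```python
# A minimal sketch of estimating main and interaction effects from a 2^3
# factorial design such as the silane (A) / smoke (B) / anti-termite (C)
# experiment above. The responses are hypothetical.
import itertools
import numpy as np

design = np.array(list(itertools.product([-1, 1], repeat=3)))  # A, B, C
y = np.array([55, 60, 58, 70, 54, 50, 57, 66], dtype=float)    # responses

A, B, C = design.T
for name, contrast in [("A", A), ("B", B), ("C", C),
                       ("AB", A * B), ("AC", A * C), ("BC", B * C)]:
    effect = np.mean(y[contrast == 1]) - np.mean(y[contrast == -1])
    print(f"effect {name}: {effect:+.2f}")
```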

  5. Experimental analysis of the CHP performance of a PEMFC stack by a 2⁴ factorial design

    Science.gov (United States)

    Torchio, M. F.; Santarelli, M. G.; Nicali, A.

    The aim of the paper is the experimental analysis, through a statistical methodology, of the effects of the main stack operation independent variables on the cogenerative performance of a PEMFC stack. The tests have been performed on an 800 W ZSW 0184 stack using the GreenLight GEN VI FC Test Station. The statistical methodology used for the experimental data analysis is the factorial design: we analysed the significance of the main operation independent variables (anode and cathode flow inlet temperatures, cathode flow dew point temperature, and cathode flow stoichiometry), considering their single and combined effects on the electric and thermal power which could be recovered from the stack. The results show that the anode flow inlet temperature and the cathode flow dew point temperature have no significant effect at any level of current density, for either electric or thermal power. At the same time, the cathode flow inlet temperature has a significant positive effect on the electric power at every level of current density, and the cathode flow stoichiometry in particular shows significant positive effects on the electric power and negative effects on the thermal power recovered from the stack.

  6. Design and performance of statistical process control charts applied to estrous detection efficiency.

    Science.gov (United States)

    de Vries, A; Conlin, B J

    2003-06-01

    Statistical process control (SPC) charts to monitor production processes have not been widely used in dairy management. Shewhart and cumulative sum (cusum) control charts were designed to determine true changes in estrous detection efficiency (EDE) amidst normal variation in dairy cattle. A stochastic simulation model was used to track performance over time of individual cows in herds of 100 and 1000 cows. Estrous detection ratios (EDR), calculated as observed estruses divided by estimated estrous days (in periods of 1 to 60 d), were used to monitor EDE. Control charts for EDR, using normal and binomial distributions, were designed at 0.65 EDE for both herd sizes; then EDE was set to 0.65 (no change), 0.55, 0.45, or 0.35 and average days to the first detection signal (ATS) in 400 runs was determined. Observed ATS at 0.65 EDE could differ from the target ATS, depending on the SPC chart design and estimated proportions of estrous days for inseminated cows. Observed ATS were shorter for larger changes in EDE and for the 1000-cow herd. Observed ATS for a change to 0.55 EDE were approximately 300 d (100 cows) or 60 d (1000 cows) with the cusum charts. For a change to 0.35 EDE, observed ATS were approximately 50 d (100 cows) and approximately 11 d (1000 cows). Shewhart charts performed similarly or took longer to signal changes depending on period length. Observed ATS on cusum charts were much longer than minimum when non-optimal reference values were used in the design. Observed ATS were also longer when SPC charts were designed with a longer target ATS and change in EDE was small. Control charts using normal and binomial distributions generally performed similarly. Statistical process control charts detected changes in estrous detection efficiency soon enough to be potentially useful in dairy management.
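
    As a compact illustration of the cusum idea used in these charts, the sketch below accumulates downward deviations of a simulated daily estrous detection ratio from a 0.65 target and signals when the cusum crosses a decision limit; the reference value k, limit h, and simulated data are illustrative assumptions rather than the paper's optimized settings.

```python
# A minimal one-sided cusum sketch for monitoring a drop in estrous
# detection ratio (EDR) below a 0.65 target. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
target, k, h = 0.65, 0.05, 0.25
# Simulated daily EDR: 30 in-control days, then a shift down to 0.45.
edr = np.concatenate([rng.normal(0.65, 0.08, 30), rng.normal(0.45, 0.08, 30)])

cusum = 0.0
for day, x in enumerate(edr, start=1):
    cusum = max(0.0, cusum + (target - k) - x)  # accumulates downward shifts
    if cusum > h:
        print(f"signal: EDR decrease detected on day {day}")
        break
```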

  7. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    Science.gov (United States)

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
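
    The sigma-metric described here is commonly computed as the allowable total error minus the absolute bias, divided by the coefficient of variation, all in percent at the medical decision concentration; the tiny sketch below implements that formula with illustrative numbers.

```python
# A minimal sigma-metric calculation:
# sigma = (allowable total error - |bias|) / CV, all in percent.
# The example values are illustrative, not from the paper.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    return (tea_pct - abs(bias_pct)) / cv_pct

print(sigma_metric(tea_pct=10.0, bias_pct=2.0, cv_pct=1.5))  # ~5.33 sigma
```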

  8. Experimental Salix shoot and root growth statistics on the alluvial sediment of a restored river corridor

    Science.gov (United States)

    Pasquale, N.; Perona, P.; Verones, F.; Francis, R.; Burlando, P.

    2009-12-01

    River restoration projects encompass not only the amelioration of flood protection but also the rehabilitation of the riverine ecosystem. However, the interactions and feedbacks between river hydrology, riparian vegetation and aquifer dynamics are still poorly understood. Vegetation interacts with river hydrology on multiple time scales. Hence, there is considerable interest in understanding the morphodynamics of restored river reaches in relation to the characteristics of the vegetation that may colonize the bare sediment and locally stabilize it by root anchoring. In this paper we document results from a number of ongoing experiments within the project RECORD (Restored CORridor Dynamics, sponsored by CCES - www.cces.ch - and the Cantons of Zurich and Thurgau, CH). In particular, we discuss both the above- and below-ground biomass growth dynamics of 1188 Salix cuttings (individual and group survival rate, growth of the longest shoots, number of branches, and morphological root analysis) in relation to local river hydrodynamics. Cuttings were organized in square plots of different sizes and planted in spring 2009 on a gravel island of the restored section of the River Thur in Switzerland. By periodically monitoring the plots we obtained a detailed and quite unique set of data, including root statistics of uprooted samples derived from image analysis with a high-resolution scanner. Beyond describing the survival rate dynamics in relation to river hydrology, we show the nature and strength of correlations between island topography and cutting growth statistics. In particular, by root analysis and by comparing empirical histograms of the vertical root distribution against the saturated water surface in the sediment, we show that the main tropic responses in such an environment are oxytropism, hydrotropism and thigmotropism. The main factor influencing the survival rate is naturally found to be erosion by floods, of which we also give an interesting example that helps demonstrate the role of river

  9. Experimental and numerical analysis for optimal design parameters ...

    Indian Academy of Sciences (India)

    Rajneesh Kaushal

    Finally, the relations established are confirmed experimentally to validate the models. Additionally, the present study is among the first attempts to reveal the effect of humidity on the performance of a falling film evaporator.

  10. 14 CFR 437.85 - Allowable design changes; modification of an experimental permit.

    Science.gov (United States)

    2010-01-01

    14 CFR 437.85 (Aeronautics and Space; Conditions of an Experimental Permit): Allowable design changes; modification of an experimental permit. (a) The FAA will identify in the experimental permit the type of changes that the permittee may...

  11. A case study on the design and development of minigames for research methods and statistics

    Directory of Open Access Journals (Sweden)

    P. Van Rosmalen

    2014-08-01

    Full Text Available Research methodology involves logical reasoning and critical thinking skills, which are core competences in developing a more sophisticated understanding of the world. Acquiring expertise in research methods and statistics is not easy and poses a significant challenge for many students. The subject material is demanding because it is highly abstract and complex, and it requires the coordination of different but inter-related knowledge and skills that are all necessary to develop a coherent and usable skills base in this area. Additionally, while many students embrace research methods enthusiastically, others find the area dry, abstract and boring. In this paper we discuss the design and the first evaluation of a set of mini-games for practising research methods. Games are considered engaging and allow students to test out scenarios that provide concrete examples of a kind they typically encounter only once they are out in the field. The design of a game is a complex task. First, we describe how we used cognitive task analysis to identify the knowledge and competences required to develop a comprehensive and usable understanding of research methods. Next, we describe the games designed and how 4C-ID, an instructional design model, was used to give the games a sound instructional design basis. Finally, the evaluation approach is discussed, along with how the findings of the first evaluation phase were used to improve the games.

  12. Antileishmanial chalcones: statistical design, synthesis, and three-dimensional quantitative structure-activity relationship analysis

    DEFF Research Database (Denmark)

    Nielsen, S F; Christensen, S B; Cruciani, G

    1998-01-01

    A large number of substituted chalcones have been synthesized and tested for antileishmanial and lymphocyte-suppressing activities. A subset of the chalcones was designed by using statistical methods. 3D-QSAR analyses using 67 (antileishmanial activity) and 63 (lymphocyte-suppressing activity... of high quality (lymphocyte-suppressing model, R2 = 0.90, Q2 = 0.80; antileishmanial model, R2 = 0.73, Q2 = 0.63). The coefficient plots indicate that steric interactions between the chalcones and the target are of major importance for the potencies of the compounds. A comparison of the coefficient plots... for the antileishmanial effect and the lymphocyte-suppressing activity discloses significant differences which should make it possible to design chalcones having a high antileishmanial activity without suppressing the proliferation of lymphocytes....

  13. On the impact of non-Gaussian wind statistics on wind turbines - an experimental approach

    Science.gov (United States)

    Schottler, Jannik; Reinke, Nico; Hoelling, Agnieszka; Whale, Jonathan; Peinke, Joachim; Hoelling, Michael

    2016-11-01

    The effect of intermittent and Gaussian inflow conditions on wind energy converters is studied experimentally. Two different flow situations were created in a wind tunnel using an active grid. Both flows exhibit nearly equal mean velocity values and turbulence intensities, but differ strongly in the two-point statistics of their velocity increments u_τ = u(t + τ) − u(t) across a variety of time scales τ, one being Gaussian distributed, the other strongly intermittent. A horizontal-axis model wind turbine is exposed to both flows, isolating the effect on the turbine of the differences not captured by mean values and turbulence intensities. Thrust, torque and power data were recorded and analyzed, showing that the model turbine does not smooth out intermittency. Intermittent inflow is converted to similarly intermittent turbine data on all scales considered, reaching down to sub-rotor scales in space, indicating that it is not correct to assume a smoothing of wind speed fluctuations below the size of the rotor.
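
    Computing such increment statistics is straightforward; the sketch below forms u_τ = u(t + τ) − u(t) for several lags on a synthetic velocity signal and reports the excess kurtosis, which stays near zero for Gaussian increments and grows at small scales for intermittent flows. The toy random-walk signal is an assumption standing in for measured wind data.

```python
# A minimal sketch of two-point increment statistics u_tau = u(t+tau) - u(t).
# For Gaussian increments the excess kurtosis stays near 0, while
# intermittent flows show heavy tails at small scales. Data are synthetic.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(2)
u = np.cumsum(rng.normal(size=100_000))  # toy velocity signal

for tau in (1, 10, 100, 1000):
    incr = u[tau:] - u[:-tau]
    print(f"tau={tau:5d}  excess kurtosis of u_tau: {kurtosis(incr):+.3f}")
```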

  14. Prediction of free air space in initial composting mixtures by a statistical design approach.

    Science.gov (United States)

    Soares, Micaela A R; Quina, Margarida J; Quinta-Ferreira, Rosa

    2013-10-15

    Free air space (FAS) is a physical parameter that can play an important role in composting processes in maintaining favourable aerobic conditions. Aiming to predict the FAS of initial composting mixtures, specific material proportions ranging from 0 to 1 were tested for a case study comprising industrial potato peel, which is characterized by low air void volume and thus requires additional components for its composting. The characterization and prediction of FAS for initial mixtures involving potato peel, grass clippings and rice husks (set A) or sawdust (set B) was accomplished by means of an augmented simplex-centroid mixture design approach. The experimental data were fitted to second-order Scheffé polynomials. Synergistic and antagonistic effects of mixture proportions on the FAS response were identified from the surface and response trace plots. A good agreement was achieved between the model predictions and supplementary experimental data. Moreover, theoretical and empirical approaches for estimating FAS available in the literature were compared with the predictions generated by the mixture design approach. This study demonstrated that the mixture design methodology can be a valuable tool to predict the initial FAS of composting mixtures, specifically in making adjustments to improve composting processes containing primarily potato peel. Copyright © 2013 Elsevier Ltd. All rights reserved.
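
    The sketch below fits a second-order Scheffé polynomial, y = Σ bᵢxᵢ + Σ bᵢⱼxᵢxⱼ, to FAS values over a three-component simplex-centroid design; the component names and response values are illustrative assumptions.

```python
# A minimal sketch of fitting a second-order Scheffe mixture polynomial to
# hypothetical FAS data from a three-component simplex-centroid design
# (potato peel, grass clippings, rice husks or sawdust).
import numpy as np

# Simplex-centroid points: 3 vertices, 3 binary midpoints, 1 centroid.
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [.5, .5, 0], [.5, 0, .5], [0, .5, .5],
              [1/3, 1/3, 1/3]])
y = np.array([0.25, 0.55, 0.60, 0.45, 0.50, 0.62, 0.52])  # hypothetical FAS

x1, x2, x3 = X.T
M = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
b, *_ = np.linalg.lstsq(M, y, rcond=None)
print("Scheffe coefficients:", np.round(b, 3))
```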

  15. Antibacterial activity of Limonium brasiliense (Baicuru) against multidrug-resistant bacteria using a statistical mixture design.

    Science.gov (United States)

    Blainski, Andressa; Gionco, Barbara; Oliveira, Admilton G; Andrade, Galdino; Scarminio, Ieda S; Silva, Denise B; Lopes, Norberto P; Mello, João C P

    2017-02-23

    Limonium brasiliense (Boiss.) Kuntze (Plumbaginaceae) is commonly known as "baicuru" or "guaicuru", and preparations of its dried rhizomes have been used popularly in the treatment of premenstrual syndrome and menstrual disorders, and as an antiseptic in genito-urinary infections. This study evaluated the potential antibacterial activity of rhizome extracts against multidrug-resistant bacterial strains using a statistical mixture design. The statistical design of four components (water, methanol, acetone, and ethanol) produced 15 different extracts, plus a confirmatory experiment performed using water:acetone (3:7, v/v). The crude extracts and their ethyl-acetate fractions were tested against vancomycin-resistant Enterococcus faecium (VREfm), methicillin-resistant Staphylococcus aureus (MRSA) and Klebsiella pneumoniae carbapenemase (KPC)-producing K. pneumoniae, all of which have been implicated in hospital- and community-acquired infections. The dry residue, total polyphenol, gallocatechin and epigallocatechin contents of the extracts were also determined, and statistical analysis was applied in order to define the fitted models predicting each parameter for any mixture of the components. Principal component and hierarchical clustering analyses (PCA and HCA) of chromatographic data, as well as mass spectrometry (MS) analysis, were performed to determine the main compounds present in the extracts. The Gram-positive bacteria were susceptible to inhibition of bacterial growth, especially by the ethyl-acetate fractions of the ternary extracts from water:acetone:ethanol and methanol:acetone:ethanol against, respectively, VREfm (MIC = 19 µg/mL) and MRSA (MIC = 39 µg/mL). On the other hand, moderate activity of the ethyl-acetate fractions from the primary (except water), secondary and ternary extracts (MIC = 625 µg/mL) was noted against KPC. The quadratic and special cubic models were significant for polyphenol and gallocatechin contents, respectively. Fit models to dry

  16. Ensemble engineering and statistical modeling for parameter calibration towards optimal design of microbial fuel cells

    Science.gov (United States)

    Sun, Hongyue; Luo, Shuai; Jin, Ran; He, Zhen

    2017-07-01

    Mathematical modeling is an important tool for investigating the performance of microbial fuel cells (MFCs) towards their optimized design. To overcome the shortcomings of traditional MFC models, an ensemble model is developed in this study by integrating an engineering model with statistical analytics for extrapolation scenarios. Such an ensemble model can reduce the labor of parameter calibration and requires fewer measurement data to achieve accuracy comparable to a traditional statistical model in both the normal and extreme operation regions. Based on different weights given to current generation and organic removal efficiency, the ensemble model can recommend input factor settings that achieve the best current generation and organic removal efficiency. The model predicts a set of optimal design factors for the present tubular MFCs, including an anode flow rate of 3.47 mL min-1, an organic concentration of 0.71 g L-1, and a catholyte pumping flow rate of 14.74 mL min-1, to achieve a peak current of 39.2 mA. To maintain 100% organic removal efficiency, the anode flow rate and organic concentration should be kept below 1.04 mL min-1 and 0.22 g L-1, respectively. The developed ensemble model can potentially be modified to model other types of MFCs or bioelectrochemical systems.

  17. Experimental design, modeling and optimization of polyplex formation between DNA oligonucleotides and branched polyethylenimine.

    Science.gov (United States)

    Clima, Lilia; Ursu, Elena L; Cojocaru, Corneliu; Rotaru, Alexandru; Barboiu, Mihail; Pinteala, Mariana

    2015-09-28

    The complexes formed by DNA and polycations have received great attention owing to their potential application in gene therapy. In this study, the binding efficiency between double-stranded oligonucleotides (dsDNA) and branched polyethylenimine (B-PEI) has been quantified by processing images captured from gel electrophoresis assays. A central composite experimental design has been employed to investigate the effects of the controllable factors on the binding efficiency. On the basis of the experimental data and response surface methodology, a multivariate regression model has been constructed and statistically validated. The model has enabled us to predict the binding efficiency as a function of the experimental factors: the concentrations of dsDNA and B-PEI and the initial pH of the solution. The optimization of the binding process has been performed using simplex and gradient methods, and the optimal conditions determined for polyplex formation have yielded a maximal binding efficiency close to 100%. In order to reveal the mechanism of complex formation at the atomic scale, a molecular dynamics simulation has been carried out. According to the computational results, B-PEI amine hydrogen atoms interact with oxygen atoms from dsDNA phosphate groups. These interactions lead to the formation of hydrogen bonds between the macromolecules, stabilizing the polyplex structure.

  18. Artificial Warming of Arctic Meadow under Pollution Stress: Experimental design

    Science.gov (United States)

    Moni, Christophe; Silvennoinen, Hanna; Fjelldal, Erling; Brenden, Marius; Kimball, Bruce; Rasse, Daniel

    2014-05-01

    Boreal and arctic terrestrial ecosystems are central to the climate change debate, notably because future warming is expected to be disproportionate compared with world averages. Likewise, greenhouse gas (GHG) release from terrestrial ecosystems exposed to climate warming is expected to be largest in the arctic. Arctic agriculture, in the form of cultivated grasslands, is a unique and economically relevant feature of Northern Norway (e.g., Finnmark Province). In Eastern Finnmark, these agro-ecosystems are under the additional stressors of heavy metal and sulfur pollution generated by the metal smelters of NW Russia. Warming and its interaction with heavy metal dynamics will influence meadow productivity, species composition and GHG emissions, as mediated by the responses of soil microbial communities. Adaptation and mitigation measures will be needed. Biochar application, which immobilizes heavy metals, is a promising adaptation method for promoting a positive growth response in arctic meadows exposed to a warming climate. In the MeadoWarm project we conduct an ecosystem warming experiment combined with biochar adaptation treatments in the heavy-metal-polluted meadows of Eastern Finnmark. In summary, the general objective of this study is twofold: 1) to determine the response of arctic agricultural ecosystems under environmental stress to increased temperatures, in terms of plant growth, soil organisms and GHG emissions, and 2) to determine whether biochar application can serve as a positive adaptation (plant growth) and mitigation (GHG emission) strategy for these ecosystems under warming conditions. Here, we present the experimental site and the designed open-field warming facility. The selected site is an arctic meadow located at the Svanhovd Research Station, less than 10 km west of the Russian mining town of Nikel. A split-plot design with 5 replicates for each treatment is used to test the effect of biochar amendment and 3 °C warming on the arctic meadow. Ten circular

  19. Effect of non-normality on test statistics for one-way independent groups designs.

    Science.gov (United States)

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied to the usual least squares estimators of central tendency and variability, and the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it with the Welch test based on robust estimators. We therefore investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
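
    A quick sketch of the Welch test on trimmed means (Yuen's test), one of the robust procedures recommended above, is given below; recent SciPy (1.7 and later) exposes it through the trim argument of ttest_ind, and the two skewed samples are synthetic.

```python
# A minimal sketch of the Yuen-Welch test for two non-normal,
# heteroscedastic groups, using synthetic skewed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
g1 = rng.exponential(scale=1.0, size=40)  # skewed group
g2 = rng.exponential(scale=1.6, size=25)  # skewed, different spread

# equal_var=False gives Welch; trim=0.2 uses 20% trimmed means/variances.
res = stats.ttest_ind(g1, g2, equal_var=False, trim=0.2)
print(f"Yuen-Welch t = {res.statistic:.3f}, p = {res.pvalue:.4f}")
```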

  20. Statistics of Scientific Procedures on Living Animals 2011: another increase in experimentation, but is there a shift in emphasis?

    Science.gov (United States)

    Hudson-Shore, Michelle

    2012-09-01

    The 2011 Statistics of Scientific Procedures on Living Animals reveal that the level of animal experimentation in Great Britain continues to rise, with almost 3.8 million procedures being conducted. Unlike in previous years, this increase is not exclusively due to the breeding and use of genetically altered animals, although they are still involved in the greatest proportion of procedures. The possibility that a shift toward fundamental research has become the primary cause of the increase in animal experiments is discussed. The general trends in the species used, and the numbers and types of procedures, are reviewed. In addition, some areas of concern and optimism are outlined. 2012 FRAME.

  1. Fast Synthesis of Gibbsite Nanoplates and Process Optimization using Box-Behnken Experimental Design

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xin; Zhang, Xianwen; Graham, Trenton R.; Pearce, Carolyn I.; Mehdi, Beata L.; N' Diaye, Alpha T.; Kerisit, Sebastien N.; Browning, Nigel D.; Clark, Sue B.; Rosso, Kevin M.

    2017-11-13

    Developing the ability to synthesize compositionally and morphologically well-defined gibbsite particles at the nanoscale with high yield is an ongoing need that has not yet reached the level of rational design. Here we report optimization of a clean inorganic synthesis route based on statistical experimental design, examining the influence of Al(OH)3 gel precursor concentration, pH, and aging time at temperature. At 80 °C, the optimum synthesis conditions of a gel concentration of 0.5 M, pH 9.2, and an aging time of 72 h maximized the reaction yield at ~87%. The resulting gibbsite product is composed of highly uniform euhedral hexagonal nanoplates with basal-plane diameters in the range of 200-400 nm. The independent roles of the key system variables in the growth mechanism are considered. On the basis of these optimized experimental conditions, the synthesis procedure, which is both cost-effective and environmentally friendly, has the potential for mass-production scale-up of high-quality gibbsite material for various fundamental research and industrial applications.
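
    Box-Behnken designs like the one used here place runs at the midpoints of the factor-space edges plus replicated center points; the sketch below generates the 15-run, three-factor layout in coded units, with the mapping to gel concentration, pH and aging time left as an assumption.

```python
# A minimal sketch of a three-factor Box-Behnken design: 12 edge midpoints
# from +/-1 pairs with the third factor at 0, plus center points.
import itertools
import numpy as np

def box_behnken(n_center: int = 3) -> np.ndarray:
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            point = [0, 0, 0]
            point[i], point[j] = a, b
            runs.append(point)
    runs.extend([[0, 0, 0]] * n_center)
    return np.array(runs)

design = box_behnken()
print(design.shape)  # (15, 3): 12 factorial-edge runs + 3 center runs
```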

  2. Development of a fast, lean and agile direct pelletization process using experimental design techniques.

    Science.gov (United States)

    Politis, Stavros N; Rekkas, Dimitrios M

    2017-04-01

    A novel hot-melt direct pelletization method was developed, characterized and optimized using statistical thinking and experimental design tools. Mixtures of carnauba wax (CW) and HPMC K100M were spheronized using melted Gelucire 50/13 as the binding material (BM). Experimentation was performed sequentially: a fractional factorial design was set up initially to screen the factors affecting the process, namely spray rate, quantity of BM, rotor speed, type of rotor disk, lubricant-glidant presence, additional spheronization time, and powder feeding rate and quantity. Of the eight factors assessed, three were studied further during process optimization (spray rate, quantity of BM and powder feeding rate), at different ratios of the solid mixture of CW and HPMC K100M. The study demonstrated that the novel hot-melt process is fast, efficient, reproducible and predictable. It can therefore be adopted in a lean and agile manufacturing setting for the production of flexible pellet dosage forms with release rates easily customized between immediate and modified delivery.

  3. Fungal mediated silver nanoparticle synthesis using robust experimental design and its application in cotton fabric

    Science.gov (United States)

    Velhal, Sulbha Girish; Kulkarni, S. D.; Latpate, R. V.

    2016-09-01

    Among the different methods employed for the synthesis of nanoparticles, the biological method is the most favorable and is quite well established. Among microorganisms, the use of fungi in the biosynthesis of silver nanoparticles has a greater advantage over other microbial mediators. In this study, intracellular synthesis of silver nanoparticles from Aspergillus terreus (Thom) MTCC632 was carried out. We observed that the synthesis of silver nanoparticles depended on factors such as temperature, amount of biomass and concentration of silver ions in the reaction mixture. Hence, optimization of the biosynthesis over these parameters was carried out using the statistical tool of robust experimental design. The size and morphology of the synthesized nanoparticles were determined using X-ray diffraction, field emission scanning electron microscopy, energy dispersive spectroscopy, and transmission electron microscopy. Nanoparticle-embedded cotton fabric was then prepared and studied for its antibacterial properties.

  4. Plant growth modeling at the JSC variable pressure growth chamber - An application of experimental design

    Science.gov (United States)

    Miller, Adam M.; Edeen, Marybeth; Sirko, Robert J.

    1992-01-01

    This paper describes the approach and results of an effort to characterize plant growth under various environmental conditions at the Johnson Space Center variable pressure growth chamber. Using a field of applied mathematics and statistics known as design of experiments (DOE), we developed a test plan for varying environmental parameters during a lettuce growth experiment. The test plan was developed using a Box-Behnken approach to DOE. As a result of the experimental runs, we have developed empirical models of both the transpiration process and carbon dioxide assimilation for Waldman's Green lettuce over specified ranges of environmental parameters including carbon dioxide concentration, light intensity, dew-point temperature, and air velocity. This model also predicts transpiration and carbon dioxide assimilation for different ages of the plant canopy.

  5. Experimental design and process analysis for acidic leaching of metal-rich glass wastes.

    Science.gov (United States)

    Tuncuk, A; Ciftci, H; Akcil, A; Ognyanova, A; Vegliò, F

    2010-05-01

    The removal of iron, titanium and aluminium from colourless and green waste glasses has been studied under various experimental conditions in order to optimize the process parameters and decrease the metal content of the waste glass by acidic leaching. Statistical design of experiments and ANOVA (analysis of variance) were performed in order to determine the main effects of, and interactions between, the investigated factors (sample ratio, acid concentration, temperature and leaching time). A full factorial experiment was performed on sulphuric acid leaching of the glass for metal removal. After treatment, the iron content was 530 ppm, compared with an initial Fe2O3 concentration of 1880 ppm in the original colourless sample. This result was achieved using 1 M H2SO4 and a 30% sample ratio at a leaching temperature of 90 °C for 2 hours. The iron content of the green waste glass sample was reduced from an initial concentration of 3350 ppm to 2470 ppm after treatment.

  6. Experimental Verification of Statistically Optimized Parameters for Low-Pressure Cold Spray Coating of Titanium

    Directory of Open Access Journals (Sweden)

    Damilola Isaac Adebiyi

    2016-06-01

    Full Text Available The cold spray coating process involves many process parameters, which make the process very complex and highly dependent on, and sensitive to, small changes in these parameters. This results in a small operational window for the parameters. Consequently, mathematical optimization of the process parameters is key, not only to achieving deposition but also to improving the coating quality. This study focuses on the mathematical identification and experimental justification of the optimum process parameters for cold spray coating of titanium alloy with silicon carbide (SiC). The continuity, momentum and energy equations governing the flow through the low-pressure cold spray nozzle were solved by introducing a constitutive equation to close the system. This was used to calculate the critical velocity for the deposition of SiC. In order to determine the input temperature that yields the calculated velocity, the distributions of velocity, temperature, and pressure in the cold spray nozzle were analyzed, and the exit values were predicted using the meshing tool of SolidWorks. Coatings fabricated using the optimized parameters are compared with those from non-optimized parameters. The coating produced with the CFD-optimized parameters exhibited lower porosity and higher hardness.

  7. Multivariate Statistical Analysis of Cigarette Design Feature Influence on ISO TNCO Yields.

    Science.gov (United States)

    Agnew-Heard, Kimberly A; Lancaster, Vicki A; Bravo, Roberto; Watson, Clifford; Walters, Matthew J; Holman, Matthew R

    2016-06-20

    The aim of this study is to explore how differences in cigarette physical design parameters influence tar, nicotine, and carbon monoxide (TNCO) yields in mainstream smoke (MSS) under the International Organization for Standardization (ISO) smoking regimen. Standardized smoking methods were used to evaluate 50 U.S. domestic brand cigarettes and a reference cigarette representing a range of TNCO yields in MSS collected from linear smoking machines using a non-intense smoking regimen. Multivariate statistical methods were used to form clusters of cigarettes based on their ISO TNCO yields and then to explore the relationship between the ISO-generated TNCO yields and nine cigarette physical design parameters, between and within each cluster simultaneously. The ISO-generated TNCO yields in MSS are 1.1-17.0 mg tar/cigarette, 0.1-2.2 mg nicotine/cigarette, and 1.6-17.3 mg CO/cigarette. Cluster analysis divided the 51 cigarettes into five discrete clusters based on their ISO TNCO yields. No one physical parameter dominated across all clusters. Predicting ISO machine-generated TNCO yields from these nine physical design parameters is complex owing to the correlations among and between the nine physical design parameters and the TNCO yields. From these analyses, it is estimated that approximately 20% of the variability in the ISO-generated TNCO yields comes from other parameters (e.g., filter material, filter type, inclusion of expanded or reconstituted tobacco, and tobacco blend composition, along with differences in tobacco leaf origin and stalk positions and added ingredients). A future article will examine the influence of these physical design parameters on TNCO yields under a Canadian Intense (CI) smoking regimen. Together, these papers will provide a more robust picture of the design features that contribute to TNCO exposure across the range of real-world smoking patterns.

  8. A Short Guide to Experimental Design and Analysis for Engineers

    Science.gov (United States)

    2014-04-01

    Instruments for the evaluation of possibly complex behaviours and attitudes include the checklist and the rating scale, a checklist being a list of behaviours to be observed. One way of presenting results through descriptive statistics is visualisation; the guide surveys notable approaches, including the radar (spider) chart.

  9. Efficient Experimental Design Strategies in Toxicology and Bioassay

    Directory of Open Access Journals (Sweden)

    Timothy E. O'Brien

    2016-06-01

    Full Text Available Modelling in bioassay often uses linear or nonlinear logistic regression models, and relative potency is often the focus when two or more compounds are to be compared. Estimation in these settings is typically based on likelihood methods. Here, we focus on the 3-parameter model representation given in Finney (1978), in which the relative potency is a model parameter. Using key matrix results and the general equivalence theorem of Kiefer & Wolfowitz (1960), this paper establishes key properties of the optimal design for relative potency using this model. We also highlight aspects of subset designs for the relative potency parameter and extend geometric designs to efficient design settings of bioassay. These latter designs are thus useful both for parameter estimation and for checking goodness-of-fit. A typical yet insightful example from the field of toxicology illustrates our findings.

  10. Planejamento experimental estatístico para a otimização das condições em batelada de dessorção de níquel da alga marinha Sargassum filipendula = Statistical design of experiments for optimizing the batch conditions to nickel desorption on Sargassum filipendula seaweed

    Directory of Open Access Journals (Sweden)

    Araceli Aparecida Seolatto

    2009-04-01

    Full Text Available In order to study the effects of important parameters in the desorption of nickel from the seaweed Sargassum filipendula, four agents were tested in sets of full 4×2³ factorial designs (eluent type (E), nickel load (q), initial eluent concentration (C) and solid:liquid ratio (R)). Two responses were analyzed: the amount desorbed and the weight loss during the desorption process. The results showed that all factors were significant for both responses, and several interactions among the factors were also significant. Analyzing the contribution of each factor, the best operational conditions were obtained with the eluents H2SO4/MgSO4 at the highest concentration tested, the highest nickel load and the lowest solid:liquid ratio. Under these conditions, the elution efficiency of the biomass was above 95%, which makes the biosorption process even more attractive for the treatment of nickel-containing solutions. Once regenerated, the Sargassum filipendula biomass can be used in subsequent biosorption cycles, reducing the cost of the biosorbent and simplifying a treatment process that dispenses with successive replacement of the biomass.

  11. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    Science.gov (United States)

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are
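
    A minimal version of such a distribution comparison is sketched below: a negative binomial is fitted to synthetic overdispersed flock counts by maximum likelihood and compared with a Poisson fit via AIC; the simulated counts and starting values are assumptions, and the zero-inflated and heavy-tailed variants examined in the paper are omitted for brevity.

```python
# A minimal sketch of fitting a negative binomial to count data by maximum
# likelihood and comparing it to a Poisson fit via AIC. Counts are synthetic.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)
counts = rng.negative_binomial(n=1.5, p=0.3, size=500)  # overdispersed

def nb_negloglik(params):
    n, p = params
    return -np.sum(stats.nbinom.logpmf(counts, n, p))

fit = optimize.minimize(nb_negloglik, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
aic_nb = 2 * 2 + 2 * fit.fun
lam = counts.mean()
aic_pois = 2 * 1 - 2 * np.sum(stats.poisson.logpmf(counts, lam))
print(f"AIC negative binomial: {aic_nb:.1f}, AIC Poisson: {aic_pois:.1f}")
```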

  12. FFTF reload core nuclear design for increased experimental capability

    Energy Technology Data Exchange (ETDEWEB)

    Rothrock, R.B.; Nelson, J.V.; Dobbin, K.D.; Bennett, R.A.

    1976-01-01

    In anticipation of continued growth in the FTR experimental irradiations program, the enrichments for the next batches of reload driver fuel to be manufactured have been increased to provide a substantially enlarged experimental reactivity allowance. The enrichments for these fuel assemblies, termed "Cores 3 and 4," were selected to meet the following objectives and constraints: (1) maintain a reactor power capability of 400 MW (based on an evaluation of driver fuel centerline melting probability at 15 percent overpower); (2) provide a peak neutron flux of nominally 7 × 10¹⁵ n/cm²·s, with a minimum acceptable value of 95 percent of this (i.e., 6.65 × 10¹⁵ n/cm²·s); and (3) provide the maximum experimental reactivity allowance consistent with the above constraints.

  13. Critical Zone Experimental Design to Assess Soil Processes and Function

    Science.gov (United States)

    Banwart, Steve

    2010-05-01

    experimental design studies soil processes across the temporal evolution of the soil profile, from its formation on bare bedrock, through managed use as productive land to its degradation under longstanding pressures from intensive land use. To understand this conceptual life cycle of soil, we have selected 4 European field sites as Critical Zone Observatories. These are to provide data sets of soil parameters, processes and functions which will be incorporated into the mathematical models. The field sites are 1) the BigLink field station which is located in the chronosequence of the Damma Glacier forefield in alpine Switzerland and is established to study the initial stages of soil development on bedrock; 2) the Lysina Catchment in the Czech Republic which is representative of productive soils managed for intensive forestry, 3) the Fuchsenbigl Field Station in Austria which is an agricultural research site that is representative of productive soils managed as arable land and 4) the Koiliaris Catchment in Crete, Greece which represents degraded Mediterranean region soils, heavily impacted by centuries of intensive grazing and farming, under severe risk of desertification.

  14. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    , sample extraction, and analytical methods to be used in the INL-2 study. For each of the five test events, the specified floor of the INL building will be contaminated with BG using a point-release device located in the room specified in the experimental design. Then quality control (QC), reference material coupon (RMC), judgmental, and probabilistic samples will be collected according to the sampling plan for each test event. Judgmental samples will be selected based on professional judgment and prior information. Probabilistic samples were selected with a random aspect and in sufficient numbers to provide desired confidence for detecting contamination or clearing uncontaminated (or decontaminated) areas. Following sample collection for a given test event, the INL building will be decontaminated. For possibly contaminated areas, the numbers of probabilistic samples were chosen to provide 95% confidence of detecting contaminated areas of specified sizes. For rooms that may be uncontaminated following a contamination event, or for whole floors after decontamination, the numbers of judgmental and probabilistic samples were chosen using the CJR approach. The numbers of samples were chosen to support making X%/Y% clearance statements with X = 95% or 99% and Y = 96% or 97%. The experimental and sampling design also provides for making X%/Y% clearance statements using only probabilistic samples. For each test event, the numbers of characterization and clearance samples were selected within limits based on operational considerations while still maintaining high confidence for detection and clearance aspects. The sampling design for all five test events contains 2085 samples, with 1142 after contamination and 943 after decontamination. These numbers include QC, RMC, judgmental, and probabilistic samples. The experimental and sampling design specified in this report provides a good statistical foundation for achieving the objectives of the INL-2 study.
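
    For intuition about the probabilistic sampling numbers, the sketch below implements the generic rule that n random samples miss a contaminated fraction p with probability (1 − p)^n, so n ≥ ln(1 − C)/ln(1 − p) gives confidence C of at least one detection; this is a textbook formula offered for illustration, not the CJR method used in the study.

```python
# A minimal sketch of the standard detection-probability calculation behind
# choices like "95% confidence of detecting contamination". This generic
# formula is illustrative only and is not the study's CJR method.
import math

def n_for_confidence(p_contaminated: float, confidence: float) -> int:
    """Samples needed so P(at least one hit) >= confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_contaminated))

print(n_for_confidence(0.05, 0.95))  # 59 samples for a 5% contaminated area
```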

  15. Findings in Experimental Psychology as Functioning Principles of Theatrical Design.

    Science.gov (United States)

    Caldwell, George

    A gestalt approach to theatrical design seems to provide some ready and stable explanations for a number of issues in the scenic arts. Gestalt serves as the theoretical base for a number of experiments in psychology whose findings appear to delineate the principles of art to be used in scene design. The fundamental notion of gestalt theory…

  16. The experimental design of the Missouri Ozark Forest Ecosystem Project

    Science.gov (United States)

    Steven L. Sheriff; Shuoqiong. He

    1997-01-01

    The Missouri Ozark Forest Ecosystem Project (MOFEP) is an experiment that examines the effects of three forest management practices on the forest community. MOFEP is designed as a randomized complete block design using nine sites divided into three blocks. Treatments of uneven-aged, even-aged, and no-harvest management were randomly assigned to sites within each block...
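
    The layout described, three treatments randomly assigned to sites within each of three blocks, can be reproduced mechanically. A sketch follows (the site labels are placeholders; only the block structure mirrors the MOFEP description):

```python
import random

# Hypothetical site labels; the structure (3 blocks x 3 sites, one
# treatment per site within each block) follows the MOFEP description.
BLOCKS = {1: ["site1", "site2", "site3"],
          2: ["site4", "site5", "site6"],
          3: ["site7", "site8", "site9"]}
TREATMENTS = ["uneven-aged", "even-aged", "no-harvest"]

def randomize_rcb(blocks, treatments, seed=None):
    """Randomized complete block design: every treatment occurs exactly
    once in each block, shuffled independently per block."""
    rng = random.Random(seed)
    layout = {}
    for block, sites in blocks.items():
        order = treatments[:]
        rng.shuffle(order)
        layout.update(zip(sites, order))
    return layout

print(randomize_rcb(BLOCKS, TREATMENTS, seed=1))
```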

  17. Statistics of Scientific Procedures on Living Animals 2012: another increase in experimentation - genetically-altered animals dominate again.

    Science.gov (United States)

    Hudson-Shore, Michelle

    2013-09-01

    The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2012 reveal that the level of animal experimentation in Great Britain continues to rise, with just over 4.1 million procedures being started in that year. Despite the previous year's indication that the dominance of the production and use of genetically-altered (GA, i.e. genetically-modified animals plus animals with harmful genetic defects) animals might be abating, it returned with a vengeance in 2012. Breeding increased from 43% to 48% of all procedures, and GA animals were involved in 59% of all the procedures. Indeed, if the breeding of these animals were removed from the statistics, the total number of procedures would actually decline by 2%. In order to honour their pledge to reduce animal use in science, the Coalition Government will have to address this issue. The general trends in the species used, and the numbers and types of procedures, are also reviewed. Finally, forthcoming changes to the statistics are discussed. 2013 FRAME.

  18. Leveraging the Experimental Method to Inform Solar Cell Design

    Science.gov (United States)

    Rose, Mary Annette; Ribblett, Jason W.; Hershberger, Heather Nicole

    2010-01-01

    In this article, the underlying logic of experimentation is exemplified within the context of a photoelectrical experiment for students taking a high school engineering, technology, or chemistry class. Students assume the role of photochemists as they plan, fabricate, and experiment with a solar cell made of copper and an aqueous solution of…

  19. Global benefit-risk assessment in designing clinical trials and some statistical considerations of the method.

    Science.gov (United States)

    Pritchett, Yili Lu; Tamura, Roy

    2008-01-01

    When characterizing a therapy, the efficacy and the safety are two major aspects under consideration. In prescribing a therapy to a patient, a clinician puts the two aspects together and makes a decision based on a consolidated thought process. The global benefit-risk (GBR) measures proposed by Chuang-Stein et al. (Stat. Med. 1991; 10:1349-1359) are useful in facilitating this thinking and in creating the framework for making statistical comparisons from a benefit-risk point of view. This article describes how a GBR linear score was defined and used as the primary outcome measure in a clinical trial design. The robustness of the definitions of 'benefit' and 'risk' is evaluated using different criteria. The sensitivity of the pre-specified weights is also analyzed using alternative weights; one of those was determined by the 'relative to an identified distribution' (ridit) integral transformation approach (Biometrics 1958; 14:18-38). Statistical considerations are illustrated using pooled data from clinical trials studying antidepressants. The pros and cons of using GBR assessments in the setting of clinical trials are discussed.

  20. Optimization and evaluation of clarithromycin floating tablets using experimental mixture design.

    Science.gov (United States)

    Uğurlu, Timucin; Karaçiçek, Uğur; Rayaman, Erkan

    2014-01-01

    The purpose of the study was to prepare and evaluate clarithromycin (CLA) floating tablets using an experimental mixture design for the treatment of Helicobacter pylori, enabled by prolonged gastric residence time and a controlled plasma level. Ten different formulations were generated based on different molecular weights of hypromellose (HPMC K100, K4M, K15M) by using a simplex-lattice design (a sub-class of mixture design) with Minitab 16 software. Sodium bicarbonate and anhydrous citric acid were used as gas-generating agents. Tablets were prepared by the wet granulation technique. All of the process variables were fixed. Results of cumulative drug release at the 8th hour (CDR 8th) were statistically analyzed to obtain the optimized formulation (OF). The optimized formulation, which gave a floating lag time lower than 15 s and a total floating time of more than 10 h, was analyzed and compared with the target for CDR 8th (80%). Good agreement was shown between predicted and actual values of CDR 8th, with a variation lower than 1%. The activity of the clarithromycin contained in the optimized formula against H. pylori was quantified using a well diffusion agar assay. Diameters of inhibition zones vs. log10 clarithromycin concentrations were plotted in order to obtain a standard curve and clarithromycin activity.
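
    The ten formulations here are consistent with a {3, 3} simplex-lattice in the three HPMC grades, which has exactly ten points. As an illustration (not the authors' Minitab output), a sketch that enumerates the points of a {q, m} simplex-lattice:

```python
from itertools import product
from fractions import Fraction

def simplex_lattice(q: int, m: int):
    """All points of a {q, m} simplex-lattice: every proportion is i/m
    for an integer i, and the q proportions sum to 1."""
    return [tuple(Fraction(i, m) for i in combo)
            for combo in product(range(m + 1), repeat=q)
            if sum(combo) == m]

# Three mixture components (e.g. the HPMC K100/K4M/K15M fractions):
points = simplex_lattice(3, 3)
print(len(points), "runs")   # 10 runs, one per candidate formulation
for p in points:
    print(tuple(str(x) for x in p))
```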

  1. Statistically designed optimisation of enzyme catalysed starch removal from potato pulp

    DEFF Research Database (Denmark)

    Thomassen, Lise Vestergaard; Meyer, Anne S.

    2010-01-01

    to obtain dietary fibers is usually accomplished via a three-step, sequential enzymatic treatment procedure using a heat-stable alpha-amylase, protease, and amyloglucosidase. Statistically designed experiments were performed to investigate the influence of enzyme dose, amount of dry matter, incubation time...... of this study was to release the residual starch, making up 21-22% by weight of the dry matter, from the potato pulp in a rational way employing as few steps, as few enzyme activities, as low enzyme dosages, as low energy input (temperature and time), and as high pulp dry matter as possible. Starch removal...... and temperature on the amount of starch released from the potato pulp. The data demonstrated that all the starch could be released from potato pulp in one step when 8% (w/w) dry potato pulp was treated with 0.2% (v/w) (enzyme/substrate (E/S)) of a thermostable Bacillus licheniformis alpha-amylase (Termamyl® SC

  2. Methotrexatum intercalated layered double hydroxides: Statistical design, mechanism explore and bioassay study

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiao-Feng [Department of Gastroenterology, Weihai municipal hospital, Weihai 264200 (China); Liu, Su-Qing [Jiangsu Key Laboratory of Biofunctional Material, College of Chemistry and Material Science, Nanjing Normal University, Nanjing 210023 (China); Li, Shu-Ping, E-mail: lishuping@njnu.edu.cn [Jiangsu Key Laboratory of Biofunctional Material, College of Chemistry and Material Science, Nanjing Normal University, Nanjing 210023 (China)

    2015-04-01

    A series of methotrexatum intercalated layered double hydroxide (MTX/LDH for short) hybrids have been synthesized by a mechanochemical–hydrothermal method, and statistical experiments were planned and conducted to find the critical factor influencing the physicochemical properties. Four variables, i.e., addition of NaOH solution, grinding duration, hydrothermal temperature and hydrothermal time, are chosen as the examined factors in the orthogonal design, and each factor is examined at three levels (high, medium and low). The resulting hybrids are then characterized by X-ray diffraction (XRD) patterns, transmission electron microscope (TEM) graphs and Zeta potentials. The XRD patterns indicate that MTX anions have been successfully intercalated into the LDH interlayers and that the amount of NaOH solution can change the gallery height greatly. The information from the TEM graphs and Zeta potentials states that the increase of alkali solution gives rise to regular morphology and an increase of the Zeta potentials. As a result of the statistical analysis, the addition of alkali solution is the major factor affecting the morphology and drug-loading capacity. At last, the mechanism of particle growth is explored emphatically, and the anticancer efficacy of some MTX/LDH hybrids is estimated by MTT assay on A549 cells as well. - Graphical abstract: Schematic illustration of synthesis and properties of MTX intercalated LDH hybrids. - Highlights: • Increasing NaOH solution gives rise to high drug-loading capacity. • Increasing the alkali solution leads to high layer charge and regular morphology. • The monodispersity has critical effect on the tumor suppression efficiency.
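
    A four-factor, three-level orthogonal design like this one is commonly laid out on the standard L9(3^4) array, which covers the four factors in nine runs instead of the 3^4 = 81 of a full factorial. A sketch follows (the concrete level values are hypothetical placeholders, not the paper's settings):

```python
# Standard L9(3^4) orthogonal array (levels coded 0/1/2); every pair of
# columns contains each of the nine level combinations exactly once.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

FACTORS = {  # hypothetical level values for the four examined factors
    "NaOH (mL)":       (10, 20, 30),
    "grinding (min)":  (15, 30, 60),
    "hydro T (deg C)": (100, 120, 140),
    "hydro time (h)":  (6, 12, 24),
}

for run in L9:
    print({name: FACTORS[name][lvl] for name, lvl in zip(FACTORS, run)})
```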

  3. Formulation and optimization of chronomodulated press-coated tablet of carvedilol by Box–Behnken statistical design

    Directory of Open Access Journals (Sweden)

    Satwara RS

    2012-08-01

    Full Text Available Rohan S Satwara, Parul K Patel; Department of Pharmaceutics, Babaria Institute of Pharmacy, Vadodara, Gujarat, India. Objective: The primary objective of the present investigation was to formulate and optimize chronomodulated press-coated tablets to deliver the antihypertensive carvedilol in an effective quantity predawn, when a blood pressure spike is typically observed in most hypertensive patients. Experimental work: Preformulation studies and drug–excipient compatibility studies were carried out for carvedilol and the excipients. Core tablets (6 mm) containing carvedilol and 10-mm press-coated tablets were prepared by direct compression. The Box–Behnken experimental design was applied to these press-coated tablets (F1–F15 formulas) with differing concentrations of rate-controlling polymers. Hydroxypropyl methyl cellulose K4M, ethyl cellulose, and K-carrageenan were used as rate-controlling polymers in the outer layer. These tablets were subjected to various precompression and postcompression tests. The optimized batch was derived both statistically (using a desirability function) and graphically (using Design Expert® 8; Stat-Ease Inc.). Tablets formulated using the optimized formulas were then evaluated for lag time and in vitro dissolution. Results and discussion: Results of the preformulation studies were satisfactory. No interaction was observed between carvedilol and excipients by ultraviolet, Fourier transform infrared spectroscopy, and dynamic light scattering analysis. The results of precompression studies and postcompression studies were within limits. The varying lag time and percent cumulative carvedilol release after 8 h were optimized to obtain a formulation that offered a release profile with 6 h lag time, followed by complete carvedilol release after 8 h. The results showed no significant bias between predicted response and actual response for the optimized formula. Conclusion: Bedtime dosing of chronomodulated press-coated tablets may offer a
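
    For three factors (here, the three rate-controlling polymer concentrations), a Box–Behnken design with three centre points has exactly 15 runs, matching the F1–F15 batches. A sketch of how the coded design matrix is constructed (the three-centre-point count is an assumption):

```python
from itertools import combinations

def box_behnken(k: int, n_center: int = 3):
    """Box-Behnken design in coded units: all four (+/-1, +/-1) settings
    for every pair of factors with the remaining factors held at 0,
    plus n_center centre-point runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * k
                run[i], run[j] = a, b
                runs.append(tuple(run))
    runs += [(0,) * k] * n_center
    return runs

design = box_behnken(3)   # 12 edge points + 3 centre points = 15 runs
for run in design:
    print(run)
```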

  4. Optimization of Pb(II) biosorption by Robinia tree leaves using statistical design of experiments.

    Science.gov (United States)

    Zolgharnein, Javad; Shahmoradi, Ali; Sangi, Mohammad Reza

    2008-07-30

    The present study introduces Robinia tree leaves as a novel and efficient biosorbent for removing Pb(II) from aqueous solutions. In order to reduce the large number of experiments and find the highest removal efficiency of Pb(II), a full 2^3 factorial design with two blocks was performed in duplicate (16 experiments). In all experiments, the contact time was fixed at 25 min. The main and interaction effects of the three factors, including sorbent mass, pH and initial concentration of metal ion, were considered. By using Student's t-test and analysis of variance (ANOVA), the main factors, which had the highest effect on the removal process, were identified. Twenty-six experiments were designed according to a Doehlert response surface design to obtain a mathematical model describing the functional relationship between response and the main independent variables. The most suitable regression model, which fitted the experimental data extremely well, was chosen according to the lack-of-fit test and adjusted R^2 value. Finally, after checking for possible outliers, the optimum conditions for maximum removal of Pb(II) from aqueous solution were obtained. The best conditions were calculated to be: initial concentration of Pb(II) = 40 mg L^-1, pH 4.6 and concentration of sorbent equal to 27.3 g L^-1.
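
    For a two-level factorial like this one, the coded design matrix and the factor effects can be generated in a few lines. A sketch follows (the response values are hypothetical, for illustration only):

```python
from itertools import product

def full_factorial_2k(k):
    """Coded design matrix of a full 2^k factorial (levels -1/+1)."""
    return list(product((-1, 1), repeat=k))

def main_effect(design, y, factor):
    """Effect = mean response at the high level minus mean at the low level."""
    hi = [yi for run, yi in zip(design, y) if run[factor] == 1]
    lo = [yi for run, yi in zip(design, y) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

X = full_factorial_2k(3)                 # sorbent mass, pH, initial Pb(II)
y = [52, 48, 70, 75, 60, 55, 88, 93]     # hypothetical removal efficiencies (%)
for f, name in enumerate(("sorbent mass", "pH", "Pb(II) conc.")):
    print(name, round(main_effect(X, y, f), 2))
```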

  5. Experimental design with applications in management, engineering and the sciences

    CERN Document Server

    Berger, Paul D; Celli, Giovana B

    2018-01-01

    This text introduces and provides instruction on the design and analysis of experiments for a broad audience. Formed by decades of teaching, consulting, and industrial experience in the Design of Experiments field, this new edition contains updated examples, exercises, and situations covering the science and engineering practice. This text minimizes the amount of mathematical detail, while still doing full justice to the mathematical rigor of the presentation and the precision of statements, making the text accessible for those who have little experience with design of experiments and who need some practical advice on using such designs to solve day-to-day problems. Additionally, an intuitive understanding of the principles is always emphasized, with helpful hints throughout.

  6. Interdisciplinary for Social Engagement: Art and Design Experimental Study

    OpenAIRE

    Hope Angelique Wells; Lyubava Fartushenko; Andrew Chamberlain

    2015-01-01

    In this ever-changing world with a population of over seven billion, social equality and environmental sustainability need new, innovative and inclusive solutions. Today, the interdisciplinary practice of art and design has the ability to raise social awareness and engagement in health-related issues. Art and design experiments with combining the traditional art methods with technology and interactive systems to communicate scientific research with problem solving goals. Creative interdiscip...

  7. Statistical design of personalized medicine interventions: The Clarification of Optimal Anticoagulation through Genetics (COAG) trial

    Directory of Open Access Journals (Sweden)

    Gage Brian F

    2010-11-01

    Full Text Available Abstract Background There is currently much interest in pharmacogenetics: determining variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates the application of personalized drug therapies that utilize a patient's genetic makeup to determine a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of personalized medicine intervention requires special considerations: the distribution of relevant allelic variants in the study population; and whether the pharmacogenetic intervention is equally effective across subpopulations defined by allelic variants. Methods The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double blind, randomized clinical trial that will compare two approaches to initiation of warfarin therapy: genotype-guided dosing, the initiation of warfarin therapy based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, the initiation of warfarin therapy based on algorithms using only clinical information. Results We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. Thus we calculate a sample size of 1238 to achieve a power level of 80% for the primary outcome. We show that reasonable departures from these
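
    The sample-size reasoning sketched above follows the usual normal-approximation power calculation. A generic sketch of that calculation (alpha, power, and especially the assumed SD below are illustrative placeholders, not the COAG inputs):

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for comparing two means:
    n = 2 * ((z_{1-alpha/2} + z_power) * sigma / delta) ** 2."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) * sigma / delta) ** 2)

# Illustration only: a 5.49-point minimum detectable difference with an
# assumed common SD of 30 points (the SD is a made-up placeholder).
print(n_per_arm(delta=5.49, sigma=30.0))
```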

  8. Experimental design for optimization of microwave-assisted extraction of benzodiazepines in human plasma.

    Science.gov (United States)

    Fernández, P; Vázquez, C; Lorenzo, R A; Carro, A M; Alvarez, I; Cabarcos, P

    2010-05-01

    A simple and fast microwave-assisted extraction (MAE) method has been evaluated as an alternative to solid-phase extraction (SPE) for the determination of six benzodiazepines widely prescribed in European countries (alprazolam, bromazepam, diazepam, lorazepam, lormetazepam and tetrazepam) in human plasma. For MAE optimization a Doehlert experimental design was used, with extraction time, temperature and solvent volume as influential parameters. A desirability function was employed for the simultaneous optimization of the MAE conditions. The analysis of variance showed that the solvent volume had a positive influence on the extraction of all the analytes tested, achieving a statistically significant effect. Also, the extraction time had a statistically significant effect on the extraction of four benzodiazepines. The selected MAE conditions (89 °C, 13 min and 8 mL of chloroform/2-propanol, 4:1, v/v) led to recoveries between 89.8 ± 0.3% and 102.1 ± 5.2% for the benzodiazepines, using a high performance liquid chromatography method coupled with diode-array detection. The comparison of MAE and SPE shows better results for MAE, with a lower number of steps in handling the sample and greater efficiency. The applicability of MAE was successfully tested in 27 plasma samples from benzodiazepine users.
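
    Simultaneous optimization with a desirability function typically follows the Derringer approach: each response is mapped onto [0, 1] and the geometric mean of the individual desirabilities is maximized. A minimal sketch (the recovery values and the 80-100% mapping are illustrative assumptions, not the authors' settings):

```python
def desirability_max(y, low, target, s=1.0):
    """Derringer-type one-sided desirability for a response to maximize:
    0 below `low`, 1 above `target`, power-law ramp in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** s

def overall_D(ds):
    """Overall desirability = geometric mean of the individual d_i."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical recoveries (%) of three analytes at one candidate MAE
# setting, each mapped onto the band [80, 100]:
recoveries = [89.8, 95.2, 102.1]
print(round(overall_D([desirability_max(r, 80, 100) for r in recoveries]), 3))
```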

  9. Experimental Evaluation of Three Designs of Electrodynamic Flexural Transducers

    Directory of Open Access Journals (Sweden)

    Tobias J. R. Eriksson

    2016-08-01

    Full Text Available Three designs for electrodynamic flexural transducers (EDFT for air-coupled ultrasonics are presented and compared. An all-metal housing was used for robustness, which makes the designs more suitable for industrial applications. The housing is designed such that there is a thin metal plate at the front, with a fundamental flexural vibration mode at ∼50 kHz. By using a flexural resonance mode, good coupling to the load medium was achieved without the use of matching layers. The front radiating plate is actuated electrodynamically by a spiral coil inside the transducer, which produces an induced magnetic field when an AC current is applied to it. The transducers operate without the use of piezoelectric materials, which can simplify manufacturing and prolong the lifetime of the transducers, as well as open up possibilities for high-temperature applications. The results show that different designs perform best for the generation and reception of ultrasound. All three designs produced large acoustic pressure outputs, with a recorded sound pressure level (SPL above 120 dB at a 40 cm distance from the highest output transducer. The sensitivity of the transducers was low, however, with single shot signal-to-noise ratio ( SNR ≃ 15 dB in transmit–receive mode, with transmitter and receiver 40 cm apart.

  10. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    Directory of Open Access Journals (Sweden)

    Marco Aldinucci

    2014-01-01

    Full Text Available This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analysed by statistical and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
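
    The core of such an online analysis stage is a single-pass statistic that can be updated as each trajectory sample streams in, without storing the full data set. A language-neutral sketch of one standard choice, Welford's running mean and variance (the sample values are made up; this is not FastFlow code):

```python
class OnlineStats:
    """Welford's online algorithm: single-pass, numerically stable running
    mean and variance, suited to streaming simulation trajectories."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x: float):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = OnlineStats()
for x in (12.0, 9.5, 11.2, 10.1, 10.8):   # e.g. a species count across trajectories
    stats.push(x)
print(stats.mean, stats.variance)
```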

  11. Robust transceiver design for reciprocal M × N interference channel based on statistical linearization approximation

    Science.gov (United States)

    Mayvan, Ali D.; Aghaeinia, Hassan; Kazemi, Mohammad

    2017-12-01

    This paper focuses on robust transceiver design for throughput enhancement on the interference channel (IC), under imperfect channel state information (CSI). In this paper, two algorithms are proposed to improve the throughput of the multi-input multi-output (MIMO) IC. Each transmitter and receiver has, respectively, M and N antennas and IC operates in a time division duplex mode. In the first proposed algorithm, each transceiver adjusts its filter to maximize the expected value of signal-to-interference-plus-noise ratio (SINR). On the other hand, the second algorithm tries to minimize the variances of the SINRs to hedge against the variability due to CSI error. Taylor expansion is exploited to approximate the effect of CSI imperfection on mean and variance. The proposed robust algorithms utilize the reciprocity of wireless networks to optimize the estimated statistical properties in two different working modes. Monte Carlo simulations are employed to investigate sum rate performance of the proposed algorithms and the advantage of incorporating variation minimization into the transceiver design.
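
    The paper approximates the effect of CSI error on the SINR's mean and variance with a Taylor expansion. A generic sketch of that statistical linearization step, with a scalar toy function standing in for the SINR map (the function and the input moments below are invented):

```python
def taylor_mean_var(f, mu, sigma2, eps=1e-4):
    """Statistical linearization of y = f(x) for x ~ (mu, sigma2):
    E[y] ~ f(mu) + 0.5 * f''(mu) * sigma2,  Var[y] ~ f'(mu)**2 * sigma2.
    Derivatives are taken numerically by central differences."""
    d1 = (f(mu + eps) - f(mu - eps)) / (2 * eps)
    d2 = (f(mu + eps) - 2 * f(mu) + f(mu - eps)) / eps ** 2
    return f(mu) + 0.5 * d2 * sigma2, d1 ** 2 * sigma2

# Toy SINR-like map of a channel gain, with a made-up estimation error:
sinr = lambda h: h ** 2 / (0.1 * h ** 2 + 1.0)
mean, var = taylor_mean_var(sinr, mu=1.0, sigma2=0.01)
print(round(mean, 4), round(var, 6))
```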

  12. An in silico approach helped to identify the best experimental design, population, and outcome for future randomized clinical trials.

    Science.gov (United States)

    Bajard, Agathe; Chabaud, Sylvie; Cornu, Catherine; Castellan, Anne-Charlotte; Malik, Salma; Kurbatova, Polina; Volpert, Vitaly; Eymard, Nathalie; Kassai, Behrouz; Nony, Patrice

    2016-01-01

    The main objective of our work was to compare different randomized clinical trial (RCT) experimental designs in terms of power, accuracy of the estimation of treatment effect, and number of patients receiving active treatment, using in silico simulations. A virtual population of patients was simulated and randomized in potential clinical trials. Treatment effect was modeled using a dose-effect relation for quantitative or qualitative outcomes. Different experimental designs were considered, and performances between designs were compared. One thousand clinical trials were simulated for each design based on an example of a modeled disease. According to the simulation results, the number of patients needed to reach 80% power was 50 for crossover, 60 for parallel or randomized withdrawal, 65 for drop the loser (DL), and 70 for early escape or play the winner (PW). For a given sample size, each design had its own advantage: low duration (parallel, early escape), high statistical power and precision (crossover), and a higher number of patients receiving the active treatment (PW and DL). Our approach can help to identify the best experimental design, population, and outcome for future RCTs. This may be particularly useful for drug development in rare diseases, theragnostic approaches, or personalized medicine. Copyright © 2016 Elsevier Inc. All rights reserved.
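
    Estimating a design's power by simulation amounts to generating many virtual trials and counting how often the analysis reaches significance. A stripped-down sketch for a two-arm parallel design (the effect size, SD, and sample size are illustrative, not the paper's disease model):

```python
import random
from statistics import mean, stdev

def simulate_parallel_trial(n_per_arm, effect, sd=1.0, rng=random):
    """One virtual two-arm parallel trial; True if the two-sample z-test
    on the group means is significant at (approximately) alpha = 0.05."""
    ctrl = [rng.gauss(0.0, sd) for _ in range(n_per_arm)]
    trt = [rng.gauss(effect, sd) for _ in range(n_per_arm)]
    se = ((stdev(ctrl) ** 2 + stdev(trt) ** 2) / n_per_arm) ** 0.5
    return abs((mean(trt) - mean(ctrl)) / se) > 1.96

def power(n_per_arm, effect, n_trials=1000):
    """Empirical power = fraction of simulated trials that are significant."""
    return mean(simulate_parallel_trial(n_per_arm, effect) for _ in range(n_trials))

random.seed(2016)
print(power(n_per_arm=30, effect=0.75))   # hypothetical effect size (in SD units)
```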

  13. Box-Behnken statistical design to optimize thermal performance of energy storage systems

    Science.gov (United States)

    Jalalian, Iman Joz; Mohammadiun, Mohammad; Moqadam, Hamid Hashemi; Mohammadiun, Hamid

    2017-11-01

    Latent heat thermal storage (LHTS) is a technology that can help to reduce energy consumption for cooling applications, where the cold is stored in phase change materials (PCMs). In the present study a comprehensive theoretical and experimental investigation is performed on an LHTS system containing RT25 as the phase change material. Process optimization of the experimental conditions (inlet air temperature and velocity and number of slabs) was carried out by means of the Box-Behnken design (BBD) of response surface methodology (RSM). Two parameters (cooling time and COP value) were chosen to be the responses. Both of the responses were significantly influenced by the combined effect of inlet air temperature with velocity and number of slabs. Simultaneous optimization was performed on the basis of the desirability function to determine the optimal conditions for the cooling time and COP value. Maximum cooling time (186 min) and COP value (6.04) were found at the optimum process conditions, i.e., an inlet temperature of 32.5, air velocity of 1.98 and slab number of 7.
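
    Analyses like this fit a full quadratic (second-order) response surface to the BBD runs by least squares before any desirability optimization. A sketch of that fitting step (the design matrix follows the standard three-factor BBD; the cooling-time values are invented placeholders):

```python
import numpy as np

def quadratic_model_matrix(X):
    """Second-order RSM model matrix: intercept, linear, two-way
    interaction, and pure quadratic terms for k coded factors."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Hypothetical coded BBD settings (temperature, velocity, slabs) and
# made-up cooling times (min):
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([150, 120, 170, 140, 160, 130, 180, 155,
              145, 165, 175, 185, 172, 174, 171], dtype=float)

beta, *_ = np.linalg.lstsq(quadratic_model_matrix(X), y, rcond=None)
print(np.round(beta, 2))   # 10 coefficients for k = 3
```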

  14. Modeling and optimization of Electrical Discharge Machining (EDM) using statistical design

    Directory of Open Access Journals (Sweden)

    Hegab Husein A.

    2015-01-01

    Full Text Available Modeling and optimization of nontraditional machining is still an ongoing area of research. The objective of this work is to optimize the Electrical Discharge Machining process parameters for an aluminum-multiwall carbon nanotube composite (Al-CNT) model. Material Removal Rate (MRR), Electrode Wear Ratio (EWR), and Average Surface Roughness (Ra) are the primary objectives. The machining parameters are machining-on time (sec), discharge current (A), voltage (V), total depth of cut (mm), and %wt. CNT added. Mathematical models for all responses as functions of the significant process parameters are developed using Response Surface Methodology (RSM). Experimental results show that the optimum levels for material removal rate are %wt. CNT (0%), a high level of discharge current (6 A) and a low level of voltage (50 V); the optimum levels for electrode wear ratio are %wt. CNT (5%) and a high level of discharge current (6 A); and the optimum levels for average surface roughness are %wt. CNT (0%), a low level of discharge current (2 A) and a high level of depth of cut (1 mm). Single-objective optimization is formulated and solved via a Genetic Algorithm. A multi-objective optimization model is then formulated for the three responses of interest. This methodology gathers experimental results, builds mathematical models in the domain of interest, and optimizes the process models. As such, process analysis, modeling, design and optimization are achieved.

  15. Experimental design and statistical rigor in phylogenomics of horizontal and endosymbiotic gene transfer

    OpenAIRE

    Stiller John W

    2011-01-01

    Abstract A growing number of phylogenomic investigations from diverse eukaryotes are examining conflicts among gene trees as evidence of horizontal gene transfer. If multiple foreign genes from the same eukaryotic lineage are found in a given genome, it is increasingly interpreted as concerted gene transfers during a cryptic endosymbiosis in the organism's evolutionary past, also known as "endosymbiotic gene transfer" or EGT. A number of provocative hypotheses of lost or serially replaced end...

  16. An Experimental Verification of morphology of ibuprofen crystals from CAMD designed solvent

    DEFF Research Database (Denmark)

    Karunanithi, Arunprakash T.; Acquah, Charles; Achenie, Luke E.K.

    2007-01-01

    of crystals formed from solvents, necessitates additional experimental verification steps. In this work we report the experimental verification of crystal morphology for the case study, solvent design for ibuprofen crystallization, presented in Karunanithi et al. [2006. A computer-aided molecular design...

  17. Web-Based Learning Support for Experimental Design in Molecular Biology: A Top-Down Approach

    Science.gov (United States)

    Aegerter-Wilmsen, Tinri; Hartog, Rob; Bisseling, Ton

    2003-01-01

    An important learning goal of a molecular biology curriculum is the attainment of a certain competence level in experimental design. Currently, undergraduate students are confronted with experimental approaches in textbooks, lectures and laboratory courses. However, most students do not reach a satisfactory level of competence in the designing of…

  18. Interdisciplinary for Social Engagement: Art and Design Experimental Study

    Directory of Open Access Journals (Sweden)

    Hope Angelique Wells

    2015-12-01

    Full Text Available In this ever-changing world with a population of over seven billion, social equality and environmental sustainability need new, innovative and inclusive solutions. Today, the interdisciplinary practice of art and design has the ability to raise social awareness and engagement in health-related issues. Art and design experiments with combining the traditional art methods with technology and interactive systems to communicate scientific research with problem solving goals. Creative interdisciplinarity opens many doors to a variety of collaborative efforts in seeking proactive solutions to environmental and social issues.

  19. From cultural to environmental heritage. Design experimentations in ancient settlement

    Directory of Open Access Journals (Sweden)

    Antonello Monsù Scolaro

    2016-11-01

    Full Text Available The rising awareness of human impacts on the natural environment compels researchers to steadily review the sustainability criteria for the built environment. This evolution clearly influences the research and design experiences reported in this paper, which focus on the reclamation and reuse of the historic urban fabric. These experiments help to imagine a possible future where innovation happens in the core of the city, usually considered a place devoted to tradition, rather than in its outskirts. Unexpected environmental issues consequently come to the fore that could help in reconsidering prevailing design and building practices as well as planning policies.

  20. A unifying experimental design for dissecting tree genomes.

    Science.gov (United States)

    Sun, Lidan; Zhu, Xuli; Zhang, Qixiang; Wu, Rongling

    2015-08-01

    Linkage mapping and association mapping are adopted as an approach of choice for dissecting complex traits, but each shows a limitation when used alone. We propose an open-pollinated (OP) family design to integrate these two approaches into an organizing framework. The design unifies the strengths of population and quantitative genetic studies for evolutionary inference and high-resolution gene mapping. It particularly suits genome dissection of forest trees given their extant populations that are mostly undomesticated. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Experimental design applied to the optimization and partial ...

    African Journals Online (AJOL)

    The objective of this work was to optimize the medium composition for maximum pectin-methylesterase (PME) production from a newly isolated strain of Penicillium brasilianum by submerged fermentation. A Plackett-Burman design was first used for the screening of the most important factors, followed by a 2^3 full ...

  2. Introduction to Experimental Design: Can You Smell Fear?

    Science.gov (United States)

    Willmott, Chris J. R.

    2011-01-01

    The ability to design appropriate experiments in order to interrogate a research question is an important skill for any scientist. The present article describes an interactive lecture-based activity centred around a comparison of two contrasting approaches to investigation of the question "Can you smell fear?" A poorly designed…

  3. Design and implementation of a low cost experimental testbed for ...

    African Journals Online (AJOL)

    This study therefore presents an inexpensive WSN test bed designed and constructed from an ESP8266EX Wi-Fi module. An experiment was conducted, and results revealed that ESP8266EX-based sensor nodes have wider network coverage compared to the Arduino-sensor-based test bed. The proposed test bed ...

  4. Tokamak experimental power reactor conceptual design. Volume II

    Energy Technology Data Exchange (ETDEWEB)

    1976-08-01

    Volume II contains the following appendices: (1) summary of EPR design parameters, (2) impurity control, (3) plasma computational models, (4) structural support system, (5) materials considerations for the primary energy conversion system, (6) magnetics, (7) neutronics penetration analysis, (8) first wall stress analysis, (9) enrichment of isotopes of hydrogen by cryogenic distillation, and (10) noncircular plasma considerations. (MOW)

  5. Model-robust experimental designs for the fractional polynomial ...

    African Journals Online (AJOL)

    Fractional polynomial response surface models are polynomial models whose powers are restricted to a small predefined set of rational numbers. Very often these models can give a good fit to the data and much more plausible behavior between design points than the polynomial models. In this paper, we propose a ...

  6. Maximum information at minimum cost; A North Sea field development study with an experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Damsleth, E.; Hage, A (Norsk Hydro, Oslo (Norway)); Volden, R. (Norwegian Computing Center, Oslo (Norway))

    1992-12-01

    This paper reports on statistical design of experiments, a technique used to maximize the information obtained from a minimum number of experiments. This technique, however, has not been used extensively in the oil industry. With a well-designed setup, the same information obtained when one parameter is varied at a time can be obtained with significantly fewer simulation runs. Interactions among the various input parameters can be identified and estimated with a more elaborate design. The technique can be applied without profound statistical insight with commercially available packages for statistical analysis.

  7. On the construction of experimental designs for a given task by jointly optimizing several quality criteria: Pareto-optimal experimental designs.

    Science.gov (United States)

    Sánchez, M S; Sarabia, L A; Ortiz, M C

    2012-11-19

    Experimental designs for a given task should be selected on the basis of the problem being solved and of some criteria that measure their quality. There are several such criteria because there are several aspects to be taken into account when making a choice. The most used criteria are probably the so-called alphabetical optimality criteria (for example, the A-, E-, and D-criteria related to the joint estimation of the coefficients, or the I- and G-criteria related to the prediction variance). Selecting a proper design to solve a problem implies finding a balance among these several criteria that measure the performance of the design in different aspects. Technically this is a problem of multi-criteria optimization, which can be tackled from different views. The approach presented here addresses the problem in its real vector nature, so that ad hoc experimental designs are generated with an algorithm based on evolutionary algorithms to find the Pareto-optimal front. There is no theoretical limit to the number of criteria that can be studied and, contrary to other approaches, not just one experimental design is computed but a set of experimental designs, all of them with the property of being Pareto-optimal in the criteria needed by the user. Besides, the use of an evolutionary algorithm makes it possible to search in both continuous and discrete domains and avoids the need for a set of candidate points, usual in exchange algorithms. Copyright © 2012 Elsevier B.V. All rights reserved.
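
    At the heart of any such multi-criteria search is the non-domination test that defines the Pareto front. A minimal sketch (the criterion vectors are invented, and both criteria are treated as minimized):

```python
def dominates(a, b):
    """a dominates b when it is no worse in every criterion and strictly
    better in at least one (all criteria minimized here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated criterion vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (D-criterion, G-criterion) scores for candidate designs:
candidates = [(1.0, 3.0), (2.0, 1.0), (1.5, 1.5), (2.5, 2.5), (3.0, 3.0)]
print(pareto_front(candidates))   # the last two points are dominated
```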

  8. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    Science.gov (United States)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can acquire satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. Damage detection methods that directly extract damage features from periodically sampled dynamic time history response measurements are desirable, but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index proposed in the first part, which uses the forward innovation model from stochastic subspace identification of a vibrating structure, have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage for real-scale structures experiencing ambient excitations and varying environmental conditions.

  9. Development and Validation of a Rubric for Diagnosing Students’ Experimental Design Knowledge and Difficulties

    Science.gov (United States)

    Dasgupta, Annwesa P.; Anderson, Trevor R.

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658

  10. How to design in situ studies: an evaluation of experimental protocols

    Directory of Open Access Journals (Sweden)

    Young-Hye Sung

    2014-08-01

    Full Text Available Objectives Designing in situ models for caries research is a demanding procedure, as both clinical and laboratory parameters need to be incorporated in a single study. This study aimed to construct an informative guideline for planning in situ models relevant to preexisting caries studies. Materials and Methods An electronic literature search of the PubMed database was performed. A total of 191 full articles written in English were included and data were extracted from the materials and methods. Multiple variables were analyzed in relation to the publication types, participant characteristics, specimen and appliance factors, and other conditions. Frequencies and percentages were displayed to summarize the data and Pearson's chi-square test was used to assess statistical significance (p < 0.05). Results There were many parameters commonly included in the majority of in situ models, such as inclusion criteria, sample sizes, sample allocation methods, tooth types, intraoral appliance types, sterilization methods, study periods, outcome measures, experimental interventions, etc. Interrelationships existed between the main research topics and some parameters (outcome measures and sample allocation methods) among the evaluated articles. Conclusions It will be possible to establish standardized in situ protocols according to the research topics. Furthermore, data collaboration from comparable studies would be enhanced by homogeneous study designs.
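
    Testing whether a design parameter is associated with the research topic, as done here with Pearson's chi-square, reduces to a contingency-table test. A sketch using SciPy (the cross-tabulation counts are invented for illustration):

```python
from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation: research topic (rows) versus sample
# allocation method (columns) across a set of reviewed in situ studies.
table = [[18, 7, 5],
         [6, 14, 9],
         [8, 5, 12]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```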

  11. Gunnar Aagaard Andersen: Commercial Design and Experimental Art

    DEFF Research Database (Denmark)

    Gether, Vibeke Petersen

    2016-01-01

    Gunnar Aagaard Andersen’s relaxed approach both to art and to commercialised creativity positioned him centrally in the field where the two overlap. Other important players were the brothers Aage and Mads Eg Damgaard, who owned textile factories in Herning, the artistic textile firm Unika Væv...... in Copenhagen and the magazine Mobilia. These enterprises were pioneers within both the commercial–industrial creative field and experimental art from the 1950s. The industrial process, from idea and experiment to production, as well as the expanded field of visual art, was demonstrated in concrete painting...... and total decoration. These developments within the visual arts of the 1950s and 1960s were followed up by Paul Gadegaard, Dieter Roth, Arthur Köpcke and Paul Gernes, as well as by Aagaard Andersen himself....

  12. STATISTICAL MODELLING OF FDC AND RETURN PERIODS TO CHARACTERISE QDF AND DESIGN THRESHOLD OF HYDROLOGICAL EXTREMES

    Directory of Open Access Journals (Sweden)

    Charles Onyutha

    2012-01-01

    Full Text Available In this paper, firstly, flow duration curves (FDCs) for hydrological extremes were calibrated for a range of aggregation levels and seasons to provide compressed statistical information for water resources management at selected temporal scales and seasons. Secondly, instead of the common approach of using return periods, T (years), for deriving discharge duration frequency (QDF) relationships, the method of using exceedance frequencies, E (%), was introduced so as to provide answers to important questions like: what is the streamflow at a given aggregation level and selected E (%)? Thirdly, the concept of the estimated design threshold (EDT) was introduced and proposed for consideration in the risk analysis for the design of water resources structures. This study was based on the long daily discharge record for the period 1950 - 2008 at station 1EF01 in Kenya, on the Nzoia river, with a watershed area of 12,676 km², located in the North Eastern quadrant of the Lake Victoria Nile Sub Basin. In the statistical modelling of FDCs and T (years), suitable extreme value distributions (EVD) were selected and calibrated to fit nearly independent high flows and low flows. The FDCs and T-curves were used to determine the EDT. The FDCs were used to model the QDF relationships. To derive QDF relationships of hydrological extremes for a given range of aggregation levels, extreme value analysis (EVA) was carried out and a suitable EVD selected. Next was the calibration of the parameters of the EVD and analysis of the relationship between the model parameters and aggregation levels. Finally, smooth mathematical relationships were derived using small but acceptable modifications to the model parameters. Such constructed QDF relationships can be used in various applications to estimate cumulative volumes of water available during droughts or floods at various aggregation levels or E (%) of hydrological extremes. The EDT, when obtained for a range of aggregation levels, can also be used to understand
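
    An empirical FDC is simply the sorted flow record plotted against exceedance frequency, from which the discharge at any chosen E (%) can be read off. A sketch follows (the synthetic lognormal record is a stand-in for the daily discharges at station 1EF01):

```python
import numpy as np

def flow_duration_curve(q):
    """Empirical flow duration curve: flows sorted in descending order
    against exceedance frequency E(%), using the Weibull plotting
    position i / (n + 1)."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    e_pct = 100.0 * np.arange(1, q.size + 1) / (q.size + 1)
    return e_pct, q

def flow_at_exceedance(q, e_target):
    """Interpolate the discharge exceeded e_target % of the time."""
    e, qs = flow_duration_curve(q)
    return np.interp(e_target, e, qs)

rng = np.random.default_rng(1)
daily_q = rng.lognormal(mean=4.0, sigma=0.8, size=365)   # hypothetical record
print(flow_at_exceedance(daily_q, 95))   # low-flow index, Q95
```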

  13. Experimental design and optimization of raloxifene hydrochloride loaded nanotransfersomes for transdermal application.

    Science.gov (United States)

    Mahmood, Syed; Taher, Muhammad; Mandal, Uttam Kumar

    2014-01-01

    Raloxifene hydrochloride, a highly effective drug for the treatment of invasive breast cancer and osteoporosis in post-menopausal women, shows a poor oral bioavailability of 2%. The aim of this study was to develop, statistically optimize, and characterize raloxifene hydrochloride-loaded transfersomes for transdermal delivery, in order to overcome the poor bioavailability issue with the drug. A response surface methodology approach was applied for the optimization of the transfersomes, using a Box-Behnken experimental design. Phospholipon® 90G, sodium deoxycholate, and sonication time, each at three levels, were selected as independent variables, while entrapment efficiency, vesicle size, and transdermal flux were identified as dependent variables. The formulation was characterized by surface morphology and shape, particle size, and zeta potential. Ex vivo transdermal flux was determined using a Hanson diffusion cell assembly, with rat skin as the barrier medium. Transfersomes from the optimized formulation were found to have spherical, unilamellar structures, with a homogeneous distribution and low polydispersity index (0.08). They had a particle size of 134±9 nm, with an entrapment efficiency of 91.00%±4.90%, and a transdermal flux of 6.5±1.1 μg/cm^2/hour. Raloxifene hydrochloride-loaded transfersomes proved significantly superior in terms of the amount of drug permeated and deposited in the skin, with enhancement ratios of 6.25±1.50 and 9.25±2.40, respectively, when compared with drug-loaded conventional liposomes and an ethanolic phosphate buffer saline. Differential scanning calorimetry revealed a greater change in skin structure, compared with a control sample, during the ex vivo drug diffusion study. Further, confocal laser scanning microscopy showed enhanced permeation of coumarin-6-loaded transfersomes, to a depth of approximately 160 μm, as compared with rigid liposomes. These ex vivo findings proved that a raloxifene hydrochloride

  14. End-point controller design for an experimental two-link flexible manipulator using convex optimization

    Science.gov (United States)

    Oakley, Celia M.; Barratt, Craig H.

    1990-01-01

    Recent results in linear controller design are used to design an end-point controller for an experimental two-link flexible manipulator. A nominal 14-state linear-quadratic-Gaussian (LQG) controller was augmented with a 528-tap finite-impulse-response (FIR) filter designed using convex optimization techniques. The resulting 278-state controller produced improved end-point trajectory tracking and disturbance rejection in simulation and experimentally in real time.

  15. Experimental design and quality assurance: in situ fluorescence instrumentation

    Science.gov (United States)

    Conmy, Robyn N.; Del Castillo, Carlos E.; Downing, Bryan D.; Chen, Robert F.

    2014-01-01

    Both instrument design and the capabilities of fluorescence spectroscopy have greatly advanced over the last several decades. Advancements include solid-state excitation sources, integration of fiber optic technology, highly sensitive multichannel detectors, rapid-scan monochromators, sensitive spectral correction techniques, and improved data manipulation software (Christian et al., 1981; Lochmuller and Saavedra, 1986; Cabniss and Shuman, 1987; Lakowicz, 2006; Hudson et al., 2007). The cumulative effect of these improvements has pushed the limits and expanded the application of fluorescence techniques to numerous scientific research fields. One of the more powerful advancements is the ability to obtain in situ fluorescence measurements of natural waters (Moore, 1994). The development of submersible fluorescence instruments has been made possible by component miniaturization and power reduction, including advances in light source technologies (light-emitting diodes, xenon lamps, ultraviolet [UV] lasers) and the compatible integration of new optical instruments with various sampling platforms (Twardowski et al., 2005 and references therein). The development of robust field sensors skirts the need for cumbersome and/or time-consuming filtration techniques, the potential artifacts associated with sample storage, and coarse sampling designs, by increasing spatiotemporal resolution (Chen, 1999; Robinson and Glenn, 1999). The ability to obtain rapid, high-quality, highly sensitive measurements over steep gradients has revolutionized investigations of dissolved organic matter (DOM) optical properties, thereby enabling researchers to address novel biogeochemical questions regarding colored or chromophoric DOM (CDOM). This chapter is dedicated to the origin, design, calibration, and use of in situ field fluorometers. It will serve as a review of considerations to be accounted for during the operation of fluorescence field sensors and call attention to areas of concern when making

  16. Experimental burn plot trial in the Kruger National Park: history, experimental design and suggestions for data analysis

    Directory of Open Access Journals (Sweden)

    R. Biggs

    2003-12-01

    Full Text Available The experimental burn plot (EBP) trial initiated in 1954 is one of few ongoing long-term fire ecology research projects in Africa. The trial aims to assess the impacts of different fire regimes in the Kruger National Park. Recent studies on the EBPs have raised questions as to the experimental design of the trial, and the appropriate model specification when analysing data. Archival documentation reveals that the original design was modified on several occasions, related to changes in the park's fire policy. These modifications include the addition of extra plots, subdivision of plots and changes in treatments over time, and have resulted in a design which is only partially randomised. The representativity of the trial plots has been questioned on account of their relatively small size, the concentration of herbivores on especially the frequently burnt plots, and soil variation between plots. It is suggested that these factors be included as covariates in explanatory models or that certain plots be excluded from data analysis based on results of independent studies of these factors. Suggestions are provided for the specification of the experimental design when analysing data using Analysis of Variance. It is concluded that there is no practical alternative to treating the trial as a fully randomised complete block design.

  17. Experimental Design for Testing Local Lorentz Invariance Violations in Gravity

    Science.gov (United States)

    Chen, Ya-Fen; Tan, Yu-Jie; Shao, Cheng-Gang

    2017-09-01

    Local Lorentz invariance is an important component of General Relativity. Testing for Local Lorentz invariance can not only probe the foundation stone of General Relativity but also help to explore the unified theory for General Relativity and quantum mechanics. In this paper, we search for Local Lorentz invariance violation associated with operators of mass dimension d=6 in the pure-gravity sector with short-range gravitational experiments. To enlarge the Local Lorentz invariance violation signal effectively, we design a new experiment in which the constraints on all fourteen violation coefficients may be improved by about one order of magnitude.

  18. Optimization of hydrogels for transdermal delivery of lisinopril by Box-Behnken statistical design.

    Science.gov (United States)

    Gannu, Ramesh; Yamsani, Vamshi Vishnu; Yamsani, Shravan Kumar; Palem, Chinna Reddy; Yamsani, Madhusudan Rao

    2009-01-01

    The aim of this study was to investigate the combined influence of three independent variables on the permeation kinetics of lisinopril from hydrogels for transdermal delivery. A three-factor, three-level Box-Behnken design was used to optimize the independent variables, Carbopol 971P (X1), menthol (X2), and propylene glycol (X3). Fifteen batches were prepared and evaluated for responses as dependent variables. The dependent variables selected were the cumulative amount permeated across rat abdominal skin in 24 h (Q24; Y1), flux (Y2), and lag time (Y3). Aloe juice was investigated for the first time as a vehicle for hydrogel preparation. The ex vivo permeation study was conducted using Franz diffusion cells. Mathematical equations and response surface plots were used to relate the dependent and independent variables. The regression equation generated for the cumulative permeation of LSP in 24 h (Q24) was Y1 = 1443.3 - 602.59 X1 + 93.24 X2 + 91.75 X3 - 18.95 X1X2 - 140.93 X1X3 - 4.43 X2X3 - 152.63 X1^2 - 150.03 X2^2 - 213.9 X3^2. The statistical validity of the polynomials was established, and optimized formulation factors were selected by feasibility and grid search. Validation of the optimization study with 15 confirmatory runs indicated a high degree of prognostic ability of the response surface methodology. The use of the Box-Behnken design approach helped in identifying the critical formulation parameters in the transdermal delivery of lisinopril from hydrogels.
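
    Since the fitted coefficients are reported and the optimum was "selected by feasibility and grid search", the Q24 surface can be reproduced and searched directly. A sketch (the coded range [-1, 1] for each factor and the 0.1 grid step are assumptions, not the authors' stated settings):

```python
from itertools import product

def q24(x1, x2, x3):
    """Fitted Box-Behnken model for Q24 as reported, with factors in
    coded units: X1 = Carbopol 971P, X2 = menthol, X3 = propylene glycol."""
    return (1443.3 - 602.59 * x1 + 93.24 * x2 + 91.75 * x3
            - 18.95 * x1 * x2 - 140.93 * x1 * x3 - 4.43 * x2 * x3
            - 152.63 * x1**2 - 150.03 * x2**2 - 213.9 * x3**2)

# Coarse grid search over the coded design space [-1, 1]^3:
grid = [i / 10 for i in range(-10, 11)]
best = max(product(grid, repeat=3), key=lambda x: q24(*x))
print(best, round(q24(*best), 1))
```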

  19. Experimental Design of Formulations Utilizing High Dimensional Model Representation.

    Science.gov (United States)

    Li, Genyuan; Bastian, Caleb; Welsh, William; Rabitz, Herschel

    2015-07-23

    Many applications involve formulations or mixtures where large numbers of components are possible to choose from, but a final composition with only a few components is sought. Finding suitable binary or ternary mixtures from all the permissible components often relies on simplex-lattice sampling in traditional design of experiments (DoE), which requires performing a large number of experiments even for just tens of permissible components. The effect rises very rapidly with increasing numbers of components and can readily become impractical. This paper proposes constructing a single model for a mixture containing all permissible components from just a modest number of experiments. Yet the model is capable of satisfactorily predicting the performance for full as well as all possible binary and ternary component mixtures. To achieve this goal, we utilize biased random sampling combined with high dimensional model representation (HDMR) to replace DoE simplex-lattice design. Compared with DoE, the required number of experiments is significantly reduced, especially when the number of permissible components is large. This study is illustrated with a solubility model for solvent mixture screening.

  20. Human in vitro 3D co-culture model to engineer vascularized bone-mimicking tissues combining computational tools and statistical experimental approach.

    Science.gov (United States)

    Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo

    2016-01-01

    The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model bridging the gap between microfluidic and macro-scale studies, and to high-throughput screen the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100% fibrin, 60% fibrin + 40% collagen), and tissue geometry (2 × 2 × 2, 2 × 2 × 5 mm^3). We optimized the geometry and oxygen gradient inside the hydrogels through computational simulations and we analyzed microvascular network features including total network length/area and vascular branch number/length. Particularly, we employed the "Design of Experiment" statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed us to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Optimization of Ficus deltoidea Using Ultrasound-Assisted Extraction by Box-Behnken Statistical Design

    Directory of Open Access Journals (Sweden)

    L. J. Ong

    2016-09-01

    In this study, the effect of extraction parameters (ethanol concentration, sonication time, and solvent-to-sample ratio) on Ficus deltoidea leaves was investigated using ultrasound-assisted extraction with response surface methodology (RSM). The total phenolic content (TPC) of F. deltoidea extracts was determined using the Folin-Ciocalteu method and expressed in gallic acid equivalents (GAE) per g. A Box-Behnken statistical design (BBD) was used to find the optimal conditions for maximum TPC. In addition, the extraction yield was measured and expressed as a percentage. The optimized TPC attained was 455.78 mg GAE/g at 64% ethanol concentration, 10 minutes sonication time, and 20 mL/g solvent-to-sample ratio, whereas the greatest extraction yield was 33% with an ethanol concentration of 70%, a sonication time of 40 minutes, and a solvent-to-material ratio of 40 mL/g. The determination coefficient, R2, for TPC indicates that 99.5% of the variability in the response could be explained by the model, and the predicted R2 of 0.9681 is in reasonable agreement with the adjusted R2 of 0.9890. The present study shows that aqueous ethanol as solvent, a short time of 10 minutes, and an adequate solvent-to-sample ratio (20 mL/g) are the best conditions for extraction.
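
    For reference, a three-factor Box-Behnken design like the one used here can be written down directly in coded units. A minimal sketch follows; the factor ranges used in the decoding step are assumptions for illustration, not the levels from the study.

```python
# Three-factor Box-Behnken design in coded units: 12 edge-midpoint runs
# (corners of each factor pair, third factor at centre) plus centre points.
from itertools import combinations, product
import numpy as np

def box_behnken_3(n_center=3):
    runs = []
    for i, j in combinations(range(3), 2):       # each pair of factors
        for a, b in product((-1, 1), repeat=2):  # +/-1 corners of that pair
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * n_center               # replicated centre points
    return np.array(runs, dtype=float)

design = box_behnken_3()
print(design.shape)  # (15, 3): 12 edge-midpoint runs + 3 centre points

# Decode to natural units (assumed ranges: ethanol 50-90 %, time 10-40 min,
# solvent-to-sample ratio 20-40 mL/g).
lows = np.array([50.0, 10.0, 20.0])
highs = np.array([90.0, 40.0, 40.0])
natural = lows + (design + 1) / 2 * (highs - lows)
```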

  2. Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests

    Science.gov (United States)

    Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.

    2015-01-01

    The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE surface response model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
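
    The surface response models referred to here are typically full second-order polynomials fitted by least squares, with accuracy reported as an average percent error against measured data. A hedged sketch of that workflow on synthetic data follows; the factors and coefficients are invented, not the rotor test matrices.

```python
# Fit a quadratic response-surface model and report its average percent
# error. x1, x2 stand in for two test-condition factors; data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, (2, 40))
y = 3.0 + 1.5 * x1 - 0.8 * x2 + 0.6 * x1 * x2 + 0.9 * x1**2 \
    + rng.normal(0, 0.05, 40)

# Full second-order model: intercept, linear, interaction, pure quadratics
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
avg_pct_error = np.mean(np.abs((y - y_hat) / y)) * 100
print(f"average model error: {avg_pct_error:.2f} %")
```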

  3. Statistical media design for efficient polyhydroxyalkanoate production in Pseudomonas sp. MNNG-S.

    Science.gov (United States)

    Saranya, V; Rajeswari, V; Abirami, P; Poornimakkani, K; Suguna, P; Shenbagarathai, R

    2016-07-03

    Polyhydroxyalkanoate (PHA) is a promising polymer for various biomedical applications, and there is a pressing need to improve its production rate to enable end use. When cost-effective production is carried out with cheaper agricultural residues such as molasses, traces of toxins are incorporated into the polymer, making it unfit for biomedical applications. On the other hand, chemically defined media are increasingly popular for the production of compounds with biomedical applications. However, such media do not exhibit favorable characteristics, such as efficient utilization at large scale, compared to complex media. This article aims to determine the specific nutritional requirements of Pseudomonas sp. MNNG-S for efficient production of polyhydroxyalkanoate. Response surface methodology (RSM) was used in this study to statistically design the medium for PHA production based on the interactive effects of five significant variables (sucrose, potassium dihydrogen phosphate, ammonium sulfate, magnesium sulfate, and trace elements). The interactive effects of sucrose with ammonium sulfate, ammonium sulfate with combined potassium phosphate, and trace elements with magnesium sulfate were found to be significant (p < .001). The optimization approach adopted in this study increased PHA production more than fourfold (from 0.85 g L(-1) to 4.56 g L(-1)).

  4. Patient reactions to personalized medicine vignettes: an experimental design.

    Science.gov (United States)

    Butrick, Morgan; Roter, Debra; Kaphingst, Kimberly; Erby, Lori H; Haywood, Carlton; Beach, Mary Catherine; Levy, Howard P

    2011-05-01

    Translational investigation on personalized medicine is in its infancy. Exploratory studies reveal attitudinal barriers to "race-based medicine" and cautious optimism regarding genetically personalized medicine. This study describes patient responses to hypothetical conventional, race-based, or genetically personalized medicine prescriptions. Three hundred eighty-seven participants (mean age = 47 years; 46% white) recruited from a Baltimore outpatient center were randomized to this vignette-based experimental study. They were asked to imagine a doctor diagnosing a condition and prescribing them one of three medications. The outcomes are emotional response to the vignette, belief in vignette medication efficacy, experience of respect, trust in the vignette physician, and adherence intention. Race-based medicine vignettes were appraised more negatively than conventional vignettes across the board (Cohen's d = -0.51 to -0.64, P < 0.001), whereas appraisals of genetically personalized medicine vignettes were largely equivalent to those of conventional vignettes (-0.14 to -0.17, P = 0.47), with the exception of a reduced adherence intention to genetically personalized medicine (Cohen's d = -0.38 to -0.44, P = 0.009). This relative reluctance to take genetically personalized medicine was pronounced for racial minorities (Cohen's d = -0.25 to -0.38, P = 0.02) and was related to trust in the vignette physician (change in R2 = 0.23, P < 0.001). The results suggest some patient reluctance toward genetically personalized medicine technology, especially among racial minorities, and highlight the potential to enhance adherence through improved doctor-patient relationships.
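
    The effect sizes quoted above are Cohen's d values; a minimal pooled-standard-deviation implementation is sketched below. The rating arrays are invented placeholders, not the study data.

```python
# Cohen's d for two independent groups, using the pooled standard deviation.
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1)
                  + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

race_based = np.array([3.1, 2.8, 3.4, 2.9, 3.0])    # e.g. trust ratings
conventional = np.array([3.6, 3.3, 3.8, 3.5, 3.4])  # placeholder values
print(f"d = {cohens_d(race_based, conventional):.2f}")  # negative: rated lower
```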

  5. A projection method for underdetermined optimal experimental designs

    KAUST Repository

    Long, Quan

    2014-01-09

    A new implementation, based on the Laplace approximation, was developed in (Long, Scavino, Tempone, & Wang 2013) to accelerate the estimation of the post-experimental expected information gains in the model parameters and predictive quantities of interest. A closed-form approximation of the inner integral and the order of the corresponding dominant error term were obtained in the cases where the parameters are determined by the experiment. In this work, we extend that method to the general case where the model parameters cannot be completely determined by the data from the proposed experiments. We carry out the Laplace approximations in the directions orthogonal to the null space of the corresponding Jacobian matrix, so that the information gain (Kullback–Leibler divergence) can be reduced to an integration against the marginal density of the transformed parameters which are not determined by the experiments. Furthermore, the expected information gain can be approximated by an integration over the prior, where the integrand is a function of the projected posterior covariance matrix. To deal with the issue of dimensionality in a complex problem, we use Monte Carlo sampling or sparse quadratures for the integration over the prior probability density function, depending on the regularity of the integrand function. We demonstrate the accuracy, efficiency and robustness of the proposed method via several nonlinear underdetermined numerical examples.
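
    The quantity being approximated has a closed form in the linear Gaussian special case, which makes the underdetermined situation easy to see. The sketch below is that textbook case, not the paper's Laplace machinery; the observation matrix and noise level are assumptions.

```python
# Expected information gain (EIG) for a linear Gaussian model y = G@theta + e.
# Here EIG = 0.5 * log det(I + G @ prior_cov @ G.T / noise_var); a
# rank-deficient G (non-trivial null space) is the underdetermined case.
import numpy as np

def linear_gaussian_eig(G, prior_cov, noise_var):
    m = G.shape[0]
    S = np.eye(m) + G @ prior_cov @ G.T / noise_var
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * logdet

G = np.array([[1.0, 1.0, 0.0]])   # one observation of three parameters:
prior_cov = np.eye(3)             # the experiment cannot determine theta fully
print(linear_gaussian_eig(G, prior_cov, noise_var=0.1**2))
```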

  6. Experimental Design and Data Analysis Issues Contribute to Inconsistent Results of C-Bouton Changes in Amyotrophic Lateral Sclerosis.

    Science.gov (United States)

    Dukkipati, S Shekar; Chihi, Aouatef; Wang, Yiwen; Elbasiouny, Sherif M

    2017-01-01

    The possible presence of pathological changes in cholinergic synaptic inputs [cholinergic boutons (C-boutons)] is a contentious topic within the ALS field. Conflicting data reported on this issue makes it difficult to assess the roles of these synaptic inputs in ALS. Our objective was to determine whether the reported changes are truly statistically and biologically significant and why replication is problematic. This is an urgent question, as C-boutons are an important regulator of spinal motoneuron excitability, and pathological changes in motoneuron excitability are present throughout disease progression. Using male mice of the SOD1-G93A high-expresser transgenic (G93A) mouse model of ALS, we examined C-boutons on spinal motoneurons. We performed histological analysis at high statistical power, which showed no difference in C-bouton size in G93A versus wild-type motoneurons throughout disease progression. In an attempt to examine the underlying reasons for our failure to replicate reported changes, we performed further histological analyses using several variations on experimental design and data analysis that were reported in the ALS literature. This analysis showed that factors related to experimental design, such as grouping unit, sampling strategy, and blinding status, potentially contribute to the discrepancy in published data on C-bouton size changes. Next, we systematically analyzed the impact of study design variability and potential bias on reported results from experimental and preclinical studies of ALS. Strikingly, we found that practices such as blinding and power analysis are not systematically reported in the ALS field. Protocols to standardize experimental design and minimize bias are thus critical to advancing the ALS field.
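
    One of the practices the authors flag as missing, a priori power analysis, is straightforward to run. A sketch for a two-group comparison follows, using statsmodels; the effect size and error rates are illustrative assumptions, not values from the study.

```python
# A priori sample-size calculation for a two-sample t-test comparison
# (e.g. C-bouton size in G93A vs. wild-type groups; numbers are assumed).
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.8, alternative='two-sided')
print(f"animals per group for 80% power: {n_per_group:.1f}")  # ~63.8
```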

  7. Statistical and dynamical analysis of RNA structures and complexes with applications to nanodevice design

    Science.gov (United States)

    Hastings, Whitney Allen

    This dissertation combines rigid body motion kinematics and statistical analysis techniques to extract information from detailed dynamic simulations and large databases of biomolecular structures. This information is then used to quantify and elucidate structural patterns that could be used to design functional nano-structures or provide new targets for ligand-based drug design. In this regard, three particular classes of problems are examined. First, we propose new methods for estimating the stiffness of continuum filament models of helical nucleic acid structures. In this work, molecular dynamics is used to sample RNA helices consisting of several base-pairs fluctuating about an equilibrium position. At equilibrium, each base-pair has a tightly clustered probability distribution and so we can describe the rigid body motion of the helix as the convolution of highly concentrated probability densities on SE(3). Second, the structure and dynamics of a common RNA non-helical motif is classified. We examine several RNA bulges with varying sequences and helix curvature, and establish degrees of similarity (and dissimilarity) in the bulge motif according to the nucleic acid type of the bulge and surrounding base-pairs. Both the "static" X-ray-crystal and NMR structures and the dynamics generated from molecular dynamics simulations are used to quantify the flexibility and conservative aspects of the motif. The resulting classification scheme provides bulge motifs that could be included in a toolbox of "nanostructures" where one could pick the pieces to design a structure that has the needed shape and desired behavior. Finally, we analyze a large collection of adenosine binding sites, focusing on the functional region of the binding site. We provide a new analysis tool that finds spatial patterns in adenosine binding pockets by examining the relative pose (position and orientation) between the adenosine ligand and the amino acids at each binding site. The similarities of

  8. A modified experimental hut design for studying responses of disease-transmitting mosquitoes to indoor interventions: the Ifakara experimental huts.

    Directory of Open Access Journals (Sweden)

    Fredros O Okumu

    Differences between individual human houses can confound the results of studies aimed at evaluating indoor vector control interventions such as insecticide treated nets (ITNs) and indoor residual insecticide spraying (IRS). Specially designed and standardised experimental huts have historically provided a solution to this challenge, with the added advantage that they can be fitted with special interception traps to sample entering or exiting mosquitoes. However, many of these experimental hut designs have a number of limitations, for example: (1) inability to sample mosquitoes on all sides of huts; (2) increased likelihood of live mosquitoes flying out of the huts, leaving mainly dead ones; (3) difficulties in cleaning the huts when a new insecticide is to be tested; and (4) the generally small size of the experimental huts, which can misrepresent actual local house sizes or airflow dynamics in the local houses. Here, we describe a modified experimental hut design - the Ifakara Experimental Huts - and explain how these huts can be used to more realistically monitor behavioural and physiological responses of wild, free-flying disease-transmitting mosquitoes, including the African malaria vectors of the species complexes Anopheles gambiae and Anopheles funestus, to indoor vector-control technologies including ITNs and IRS. Important characteristics of the Ifakara experimental huts include: (1) interception traps fitted onto eave spaces and windows; (2) use of eave baffles (panels that direct mosquito movement) to control the exit of live mosquitoes through the eave spaces; (3) use of replaceable wall panels and ceilings, which allow safe insecticide disposal and reuse of the huts to test different insecticides in successive periods; (4) the kit format of the huts, allowing portability; and (5) an improved suite of entomological procedures to maximise data quality.

  9. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    Science.gov (United States)

    Smith, Justin D.

    2012-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…

  10. Adequacy of different experimental designs for eucalyptus spacing trials in Portuguese environmental conditions

    Science.gov (United States)

    Paula Soares; Margarida Tome

    2000-01-01

    In Portugal, several eucalyptus spacing trials cover a relatively broad range of experimental designs: trials with a non-randomized block design with plots of different size and number of trees per plot; trials based on a non-systematic design in which spacings were randomized resulting in a factorial arrangement with plots of different size and shape and equal number...

  11. Phase equilibrium of liquid mixtures: experimental and modeled data using statistical associating fluid theory for potential of variable range approach.

    Science.gov (United States)

    Giner, Beatriz; Bandrés, Isabel; López, M Carmen; Lafuente, Carlos; Galindo, Amparo

    2007-10-14

    A study of the phase equilibrium (experimental and modeled) of mixtures formed by a cyclic ether and haloalkanes has been carried out. Experimental data for the isothermal vapor-liquid equilibrium of mixtures formed by tetrahydrofuran and tetrahydropyran with isomeric chlorobutanes at temperatures of 298.15, 313.15, and 328.15 K are presented. Experimental results have been discussed in terms of both the molecular characteristics of the pure compounds and the potential intermolecular interactions between them, using thermodynamic information on the mixtures obtained earlier. The statistical associating fluid theory for potential of variable range (SAFT-VR) approach, together with standard combining rules without adjustable parameters, has been used to model the phase equilibrium. Good agreement between experiment and prediction is found with such a model. Mean absolute deviations for pressures are of the order of 1 kPa, while those for vapor-phase compositions are less than 0.013 mole fraction. In order to improve these results, a new modeling has been carried out by introducing a single transferable parameter k(ij), which modifies the strength of the dispersion interaction between unlike components in the mixtures and is valid for all the studied mixtures, being neither temperature nor pressure dependent. This parameter, together with the SAFT-VR approach, provides a description of the vapor-liquid equilibrium of the mixtures that is in excellent agreement with the experimental data in most cases. The absolute deviations are of the order of 0.005 mole fraction for vapor-phase compositions and less than 0.3 kPa for pressure, except for mixtures containing 2-chloro-2-methylpropane, for which the pressure deviations are larger. The results obtained in this work in modeling the phase equilibrium with the SAFT-VR equation of state have been compared with those obtained in a previous study when the approach was used to model similar mixtures with clear differences in thermodynamic behavior.

  12. A retrospective survey of research design and statistical analyses in selected Chinese medical journals in 1998 and 2008.

    Science.gov (United States)

    Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia

    2010-05-25

    High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected, and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist covering study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, the error/defect proportion in design and statistical analyses, and the implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ2 = 12.03, p < 0.001), as did the error/defect proportion in study design (χ2 = 21.22, p < 0.001). The proportion of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature: 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were also observed in results presentation (χ2 = 93.26, p < 0.001) across study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
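
    The year-to-year comparisons reported here are chi-square tests on proportions; a minimal reproduction of that kind of computation is sketched below. The counts in the table are invented for illustration, not the survey's actual defect counts.

```python
# Chi-square test comparing a defect proportion between two publication years
# via a 2x2 contingency table (illustrative counts).
from scipy.stats import chi2_contingency

#            defect   no defect
table = [[480, 855],   # 1998 articles
         [420, 1158]]  # 2008 articles
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```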

  13. Design and experimental tests of free electron laser wire scanners

    Directory of Open Access Journals (Sweden)

    G. L. Orlandi

    2016-09-01

    SwissFEL is an X-ray free electron laser (FEL) driven by a 5.8 GeV linac under construction at Paul Scherrer Institut. In SwissFEL, wire scanners (WSCs) will be complementary to view-screens for emittance measurements and routinely used to monitor the transverse profile of the electron beam during FEL operations. The SwissFEL WSC is composed of an in-vacuum beam-probe—motorized by a stepper motor—and an out-vacuum pick-up of the wire signal. The mechanical stability of the WSC in-vacuum hardware has been characterized on a test bench. In particular, the motor-induced vibrations of the wire have been measured and mapped for different motor speeds. Electron-beam tests of the entire WSC setup together with different wire materials have been carried out at the 250 MeV SwissFEL Injector Test Facility (SITF, Paul Scherrer Institut, CH) and at FERMI (Elettra-Sincrotrone Trieste, Italy). In particular, a comparative study of the relative measurement accuracy and the radiation-dose release of Al(99)∶Si(1) and tungsten (W) wires has been carried out. On the basis of the outcome of the bench and electron-beam tests, the SwissFEL WSC can be qualified as a high-resolution and machine-saving diagnostic tool in consideration of the mechanical stability of the scanning wire at the micrometer level and the choice of the wire material ensuring a drastic reduction of the radiation-dose release with respect to conventional metallic wires. The main aspects of the design, laboratory characterization and electron-beam tests of the SwissFEL WSCs are presented.

  14. Design and experimental tests of free electron laser wire scanners

    Science.gov (United States)

    Orlandi, G. L.; Heimgartner, P.; Ischebeck, R.; Loch, C. Ozkan; Trovati, S.; Valitutti, P.; Schlott, V.; Ferianis, M.; Penco, G.

    2016-09-01

    SwissFEL is an X-ray free electron laser (FEL) driven by a 5.8 GeV linac under construction at Paul Scherrer Institut. In SwissFEL, wire scanners (WSCs) will be complementary to view-screens for emittance measurements and routinely used to monitor the transverse profile of the electron beam during FEL operations. The SwissFEL WSC is composed of an in-vacuum beam-probe—motorized by a stepper motor—and an out-vacuum pick-up of the wire signal. The mechanical stability of the WSC in-vacuum hardware has been characterized on a test bench. In particular, the motor-induced vibrations of the wire have been measured and mapped for different motor speeds. Electron-beam tests of the entire WSC setup together with different wire materials have been carried out at the 250 MeV SwissFEL Injector Test Facility (SITF, Paul Scherrer Institut, CH) and at FERMI (Elettra-Sincrotrone Trieste, Italy). In particular, a comparative study of the relative measurement accuracy and the radiation-dose release of Al(99)∶Si(1) and tungsten (W) wires has been carried out. On the basis of the outcome of the bench and electron-beam tests, the SwissFEL WSC can be qualified as a high-resolution and machine-saving diagnostic tool in consideration of the mechanical stability of the scanning wire at the micrometer level and the choice of the wire material ensuring a drastic reduction of the radiation-dose release with respect to conventional metallic wires. The main aspects of the design, laboratory characterization and electron-beam tests of the SwissFEL WSCs are presented.

  15. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    Science.gov (United States)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical
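
    The comparison described here, linear covariance analysis versus Monte Carlo dispersion analysis, can be illustrated in a few lines. The sketch below propagates a dispersion covariance through toy linearized dynamics and checks it against an ensemble; the matrices are assumptions, not the launch-vehicle GN&C models.

```python
# Linear covariance propagation vs. Monte Carlo for a toy linear system.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])  # linearised state transition (assumed)
Q = np.diag([1e-4, 1e-3])               # process noise covariance (assumed)
P = np.diag([0.5, 0.1])                 # initial dispersion covariance

# Linear covariance analysis: one matrix recursion per time step
P_lin = P.copy()
for _ in range(100):
    P_lin = A @ P_lin @ A.T + Q

# Monte Carlo: propagate an ensemble and estimate the same covariance
rng = np.random.default_rng(2)
x = rng.multivariate_normal([0, 0], P, size=20000)
for _ in range(100):
    x = x @ A.T + rng.multivariate_normal([0, 0], Q, size=x.shape[0])
P_mc = np.cov(x.T)
print(np.abs(P_lin - P_mc) / np.abs(P_lin))  # relative differences, a few %
```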

  16. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  17. Long-term strategy for the statistical design of a forest health monitoring system

    Science.gov (United States)

    Hans T. Schreuder; Raymond L. Czaplewski

    1993-01-01

    A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...

  18. Experimental determination of new statistical correlations for the calculation of the heat transfer coefficient by convection for flat plates, cylinders and tube banks

    Directory of Open Access Journals (Sweden)

    Ismael Fernando Meza Castro

    2017-07-01

    Introduction: This project involved experimental research encompassing the design, assembly, and commissioning of a convection heat-transfer test bench. Objective: To determine new statistical correlations that allow the convective heat-transfer coefficients of air to be known with greater accuracy in applications with different heating-geometry configurations. Methodology: Three geometric configurations (flat plate, cylinder, and tube bank) were studied in terms of their physical properties through the Reynolds and Prandtl numbers. A data-acquisition interface based on Arduino® controllers measured the air temperature along the duct in real time, relating the heat transferred from the heating element to the fluid, and mathematical modeling was performed in specialized statistical software. The study covered the three geometries, one power level per heating element, and two air velocities, with 10 repetitions. Results: Three mathematical correlations were obtained with regression coefficients greater than 0.972, one for each heating element, with prediction errors in the convective heat-transfer coefficients of 7.50% for the flat plate, 2.85% for the cylinder, and 1.57% for the tube bank. Conclusions: For geometries composed of several individual elements, a much more accurate statistical fit was obtained for predicting the convective heat-transfer coefficients, since each unit reaches a stable surface-temperature profile more quickly; this gives the geometry as a whole a more precise measurement of the parameters that govern heat transfer, as in the case of the tube bank.
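
    Correlations of this kind are commonly expressed in the power-law form Nu = C·Re^m·Pr^n and recovered by a log-linear least-squares fit. A sketch on synthetic data follows; the data generator and coefficient values are assumptions, not the bench measurements.

```python
# Fit Nu = C * Re^m * Pr^n by ordinary least squares on logarithms.
import numpy as np

rng = np.random.default_rng(3)
Re = rng.uniform(5e3, 5e4, 60)
Pr = rng.uniform(0.70, 0.72, 60)
Nu = 0.023 * Re**0.8 * Pr**0.4 * rng.lognormal(0, 0.02, 60)  # synthetic data

# ln Nu = ln C + m ln Re + n ln Pr
X = np.column_stack([np.ones_like(Re), np.log(Re), np.log(Pr)])
coef, *_ = np.linalg.lstsq(X, np.log(Nu), rcond=None)
C, m, n = np.exp(coef[0]), coef[1], coef[2]
print(f"Nu = {C:.4f} * Re^{m:.3f} * Pr^{n:.3f}")
```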

  19. Experimental design and sample size for hydroponic lettuce crop

    Directory of Open Access Journals (Sweden)

    Valéria Schimitz Marodim

    2000-10-01

    This study was carried out to establish the experimental design and sample size for hydroponic lettuce (Lactuca sativa) grown under the nutrient film technique (NFT). The experiment was conducted in the Laboratory of Soilless/Hydroponic Crops of the Plant Science Department of the Federal University of Santa Maria and was based on plant weight data. The results showed that, for lettuce grown hydroponically on fibre-cement benches with six channels, the appropriate experimental design is randomized blocks if the experimental unit consists of strips transverse to the bench channels, and completely randomized if the bench is the experimental unit. For the plant weight variable, the sample size is 40 plants for a confidence-interval half-width equal to 5% of the mean (d) and 7 plants for d equal to 20%.

  20. Design, construction and testing of a radon experimental chamber; Diseno, construccion y pruebas de una camara experimental de radon

    Energy Technology Data Exchange (ETDEWEB)

    Chavez B, A.; Balcazar G, M

    1991-10-15

    To carry out studies of radon behavior under controlled and stable conditions, a system was designed and constructed consisting of two parts: a container of uranium-rich mineral and a radon experimentation chamber, joined to each other by a shut-off valve. The container holds approximately 800 g of uranium mineral with a grade of 0.28%; the radon gas emanating from the mineral is held tightly within the container. When the valve is opened, the radon gas diffuses into the experimental chamber, which has three access ports that allow different types of detectors to be installed. The versatility of the system is exemplified with two experiments: (1) with the radon experimental chamber and an associated spectroscopic system, radon and two of its decay products are identified; (2) the design of the system allows the mineral container to be coupled to other experimental geometries, and to demonstrate this, a new automatic exchanger system for passive radon detectors was coupled and tested. Results are shown for the new automatic exchanger system when radon is allowed to flow freely between the container and the exchanger through a 15 µm plastic membrane. (Author)

  1. Experimental Design for Evaluating the Safety Benefits of Railroad Advance Warning Signs

    Science.gov (United States)

    1979-04-01

    The report presents the findings and conclusions of a study to develop an experimental design and analysis plan for field testing and evaluation of the accident reduction potential of a proposed new railroad grade crossing advance warning sign. Sever...

  2. Design and Development of a Testing Device for Experimental Measurements of Foundation Slabs on the Subsoil

    National Research Council Canada - National Science Library

    Čajka, Radim; Křivý, Vít; Sekanina, David

    2011-01-01

    The paper deals with technical solutions and construction of a testing stand designed for experimental measurements of deformations and state of stress of foundation structures placed on the subsoil...

  3. Evaluating clinical and public health interventions: a practical guide to study design and statistics

    National Research Council Canada - National Science Library

    Katz, Mitchell H

    2010-01-01

    .... Because the choice of research design depends on the nature of the intervention, the book covers randomized and nonrandomized designs, prospective and retrospective studies, planned clinical trials...

  4. Design and experimental tests of a novel neutron spin analyzer for wide angle spin echo spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Fouquet, Peter; Farago, Bela; Andersen, Ken H.; Bentley, Phillip M.; Pastrello, Gilles; Sutton, Iain; Thaveron, Eric; Thomas, Frederic [Institut Laue-Langevin, BP 156, F-38042 Grenoble Cedex 9 (France); Moskvin, Evgeny [Helmholtzzentrum Berlin, Glienicker Strasse 100, D-14109 Berlin (Germany); Pappas, Catherine [Helmholtzzentrum Berlin, Glienicker Strasse 100, D-14109 Berlin (Germany); Faculty of Applied Sciences, Delft University of Technology, Mekelweg 15, 2629 JB Delft (Netherlands)

    2009-09-15

    This paper describes the design and experimental tests of a novel neutron spin analyzer optimized for wide angle spin echo spectrometers. The new design is based on nonremanent magnetic supermirrors, which are magnetized by vertical magnetic fields created by NdFeB high field permanent magnets. The solution presented here gives stable performance at moderate costs in contrast to designs invoking remanent supermirrors. In the experimental part of this paper we demonstrate that the new design performs well in terms of polarization, transmission, and that high quality neutron spin echo spectra can be measured.

  5. Statistics of Scientific Procedures on Living Animals 2014: A new format, and hopefully a new era of diminishing animal experimentation?

    Science.gov (United States)

    Hudson-Shore, Michelle

    2016-03-01

    The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2014 reports a welcome decline in animal experimentation in the UK. However, caution has to be exercised when interpreting these most recent figures, due to the significant changes made to satisfy the requirements of Directive 2010/63/EU as to what information is reported and how it is reported. Comparison with the figures and trends reported in previous years is difficult, so this paper focuses on the specifics of the current report, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, fish and primates. There is a detailed discussion of the extent of the changes, commenting on the benefits and disadvantages of the new format in areas such as severity of procedures, legislation and techniques of special interest. It also considers the consequences of the changes for the effective monitoring of laboratory animal use, openness and transparency regarding the impacts of animal use, and the implementation of Three Rs initiatives. In addition, suggestions for further improvements to the new format are made to the Home Office. 2016 FRAME.

  6. Bayesian experimental design of a multichannel interferometer for Wendelstein 7-X.

    Science.gov (United States)

    Dreier, H; Dinklage, A; Fischer, R; Hirsch, M; Kornejew, P

    2008-10-01

    Bayesian experimental design (BED) is a framework for the optimization of diagnostics based on probability theory. In this work it is applied to the design of a multichannel interferometer at the Wendelstein 7-X stellarator experiment. BED offers the possibility of comparing diverse designs quantitatively, which is shown for beam-line designs resulting from different plasma configurations. The applicability of this method is discussed with respect to its computational effort.

  7. Web based learning support for experimental design in molecular biology: a top-down approach

    NARCIS (Netherlands)

    Aegerter-Wilmsen, T.; Hartog, R.; Bisseling, T.

    2003-01-01

    An important learning goal of a molecular biology curriculum is the attainment of a certain competence level in experimental design. Currently, undergraduate students are confronted with experimental approaches in textbooks, lectures and laboratory courses. However, most students do not reach a

  8. Experimental Device for Learning of Logical Circuit Design using Integrated Circuits

    OpenAIRE

    石橋, 孝昭

    2012-01-01

    This paper presents an experimental device for learning logical circuit design using integrated circuits and breadboards. The device can be made at low cost and can be used for many subjects, such as logical circuits, computer engineering, basic electricity, electrical circuits and electronic circuits. The proposed device is more effective for learning logical circuits than a conventional lecture.

  9. Establishing the experimenting society: The historical origin of social experimentation according to the randomized controlled design

    NARCIS (Netherlands)

    Dehue, T

    2001-01-01

    This article traces the historical origin of social experimentation. It highlights the central role of psychology in establishing the randomized controlled design and its quasi-experimental derivatives. The author investigates the differences in the 19th- and 20th-century meaning of the expression

  10. Ti film deposition process of a plasma focus: Study by an experimental design

    Directory of Open Access Journals (Sweden)

    M. J. Inestrosa-Izurieta

    2017-10-01

    The plasma generated by plasma focus (PF) devices has substantially different physical characteristics (energetic ions and electrons) from the plasma of conventional devices used for nanofabrication, offering new and unique opportunities in the processing and synthesis of nanomaterials. This article presents the use of a plasma focus of tens of joules, PF-50J, for the deposition of materials sputtered from the anode by the plasma dynamics in the axial direction. This work focuses on determining the most significant effects of the technological parameters of the system on the obtained depositions through the use of a statistical experimental design. The results allow a qualitative understanding of the Ti film deposition process in our PF device in terms of four different events provoked by the plasma dynamics: (i) electric erosion of the outer material of the anode; (ii) substrate ablation generating an interlayer; (iii) electron-beam deposition of material from the center of the anode; and (iv) heat load provoking clustering or even melting of the deposition surface.

  11. Optimization of primaquine diphosphate tablet formulation for controlled drug release using the mixture experimental design.

    Science.gov (United States)

    Duque, Marcelo Dutra; Kreidel, Rogério Nepomuceno; Taqueda, Maria Elena Santos; Baby, André Rolim; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Consiglieri, Vladi Olga

    2013-01-01

    A tablet formulation based on a hydrophilic matrix with controlled drug release was developed, and the effect of polymer concentrations on the release of primaquine diphosphate was evaluated. To achieve this purpose, a 20-run, four-factor mixture design with multiple constraints on the proportions of the components was employed to obtain the tablet compositions. Drug release was determined by an in vitro dissolution study in phosphate buffer solution at pH 6.8. The fitted polynomial functions described the behavior of the mixture on simplex coordinate systems, allowing study of the effect of each factor (polymer) on tablet characteristics. Based on response surface methodology, a tablet composition was optimized with the purpose of obtaining a primaquine diphosphate release profile closer to zero-order kinetics. This formulation released 85.22% of the drug over 8 h, and its kinetics were studied with respect to the Korsmeyer-Peppas model (adjusted R(2) = 0.99295), which confirmed that both diffusion and erosion contributed to the mechanism of drug release. The data from the optimized formulation were very close to the predictions from the statistical analysis, demonstrating that mixture experimental design can be used to optimize primaquine diphosphate dissolution from hydroxypropylmethyl cellulose and polyethylene glycol matrix tablets.
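
    The Korsmeyer-Peppas analysis mentioned above fits the fraction released to Mt/M∞ = k·t^n, which is a straight-line fit on a log-log scale. A sketch with invented release data (not the study's dissolution profile) follows; for cylindrical tablets, 0.45 < n < 0.89 is conventionally read as anomalous (diffusion plus erosion) transport, consistent with the conclusion above.

```python
# Korsmeyer-Peppas fit: ln(Mt/Minf) = ln k + n ln t.
import numpy as np

t = np.array([0.5, 1, 2, 3, 4, 6, 8])                        # hours
frac = np.array([0.16, 0.25, 0.38, 0.48, 0.57, 0.72, 0.85])  # Mt/Minf (toy)

slope, intercept = np.polyfit(np.log(t), np.log(frac), 1)
n, k = slope, np.exp(intercept)
print(f"n = {n:.2f}, k = {k:.3f} h^-n")  # n ~ 0.6 here: anomalous transport
```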

  12. I.4 Screening Experimental Designs for Quantitative Trait Loci, Association Mapping, Genotype-by-Environment Interaction, and Other Investigations

    Science.gov (United States)

    Federer, Walter T.; Crossa, José

    2012-01-01

    Crop breeding programs using conventional approaches, as well as new biotechnological tools, rely heavily on data resulting from the evaluation of genotypes in different environmental conditions (agronomic practices, locations, and years). Statistical methods used for designing field and laboratory trials and for analyzing the data originating from those trials need to be accurate and efficient. The statistical analysis of multi-environment trials (MET) is useful for assessing genotype × environment interaction (GEI), mapping quantitative trait loci (QTLs), and studying QTL × environment interaction (QEI). Large populations are required for the scientific study of QEI, and for determining the association between molecular markers and quantitative trait variability. Therefore, appropriate control of local variability through efficient experimental design is of key importance. In this chapter we present and explain several classes of augmented designs useful for achieving control of variability and assessing genotype effects in a practical and efficient manner. A popular procedure for unreplicated designs is the one known as "systematically spaced checks." Augmented designs contain "c" check or standard treatments replicated "r" times, and "n" new treatments or genotypes included once (usually) in the experiment. PMID:22675304

  13. OPTIMIZATION OF TRANSESTERIFICATION DOUBLE STEP PROCESS (TDSP) FOR THE PRODUCTION OF BIODIESEL THROUGH DOEHLERT EXPERIMENTAL DESIGN

    OpenAIRE

    Ruschel, Carla Felippi Chiella; Ferrão, Marco Flôres; Santos, Francisco Paulo dos; Samios, Dimitrios

    2016-01-01

    In this work, a Doehlert experimental design was used to optimize the Transesterification Double Step Process (TDSP) for methyl soybean oil biodiesel production, which starts with a basic catalysis followed by an acidic catalysis. The conversion values were calculated from NMR spectra. Response surfaces were used to show the results of the interactions between the variables. The experimental design evaluated variables such as catalyst and alcohol amount for the basic catalysis and time and tem...

  14. Statistical issues for design and analysis of single-arm multi-stage phase II cancer clinical trials.

    Science.gov (United States)

    Jung, Sin-Ho

    2015-05-01

    Phase II trials have been very widely conducted and published every year in cancer clinical research. In spite of fast progress in design and analysis methods, the single-arm two-stage design is still the most popular for phase II cancer clinical trials. Because of their small sample sizes, statistical methods based on large-sample approximation are not appropriate for the design and analysis of phase II trials. As prospective clinical research, the analysis method of a phase II trial is predetermined at the design stage, and the trial is analyzed during and at the end of the study as planned by the design. The analysis method of a trial should be matched with the design method. For two-stage single-arm phase II trials, Simon's method has been the standard for choosing an optimal design, but the resulting data have often been analyzed and published ignoring the two-stage design aspect and the small sample sizes. In this article, we review analysis methods that exactly match the exact two-stage design method. We also discuss some statistical methods to improve the existing design and analysis methods for single-arm two-stage phase II trials. Copyright © 2015 Elsevier Inc. All rights reserved.
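
    For context, the operating characteristics of a Simon two-stage design follow directly from binomial probabilities. The sketch below computes the probability of early termination, type I error, and power for an example design (r1/n1 = 1/10, r/n = 5/29, often quoted for p0 = 0.10, p1 = 0.30); treat the specific numbers as an illustration rather than a recommendation.

```python
# Simon two-stage design: stop after stage 1 if <= r1 responses in n1
# patients; declare the drug promising if the total exceeds r after n.
from scipy.stats import binom

def simon_oc(p, r1, n1, r, n):
    n2 = n - n1
    pet = binom.cdf(r1, n1, p)                      # early termination prob.
    reject_h0 = sum(binom.pmf(x1, n1, p) * binom.sf(r - x1, n2, p)
                    for x1 in range(r1 + 1, n1 + 1))
    return pet, reject_h0

r1, n1, r, n = 1, 10, 5, 29
pet0, alpha = simon_oc(0.10, r1, n1, r, n)   # under H0: true rate 10%
_, power = simon_oc(0.30, r1, n1, r, n)      # under H1: true rate 30%
print(f"PET(H0) = {pet0:.3f}, type I error = {alpha:.3f}, power = {power:.3f}")
```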

  15. Measuring and Advancing Experimental Design Ability in an Introductory Course without Altering Existing Lab Curriculum.

    Science.gov (United States)

    Shanks, Ryan A; Robertson, Chuck L; Haygood, Christian S; Herdliksa, Anna M; Herdliska, Heather R; Lloyd, Steven A

    2017-01-01

    Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model's ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.

  16. Measuring and Advancing Experimental Design Ability in an Introductory Course without Altering Existing Lab Curriculum

    Directory of Open Access Journals (Sweden)

    Ryan A. Shanks

    2017-05-01

    Introductory biology courses provide an important opportunity to prepare students for future courses, yet existing cookbook labs, although important in their own way, fail to provide many of the advantages of semester-long research experiences. Engaging, authentic research experiences aid biology students in meeting many learning goals. Therefore, overlaying a research experience onto the existing lab structure allows faculty to overcome barriers involving curricular change. Here we propose a working model for this overlay design in an introductory biology course and detail a means to conduct this lab with minimal increases in student and faculty workloads. Furthermore, we conducted exploratory factor analysis of the Experimental Design Ability Test (EDAT) and uncovered two latent factors which provide valid means to assess this overlay model's ability to increase advanced experimental design abilities. In a pre-test/post-test design, we demonstrate significant increases in both basic and advanced experimental design abilities in an experimental and comparison group. We measured significantly higher gains in advanced experimental design understanding in students in the experimental group. We believe this overlay model and EDAT factor analysis contribute a novel means to conduct and assess the effectiveness of authentic research experiences in an introductory course without major changes to the course curriculum and with minimal increases in faculty and student workloads.

  17. Inverse wing design for the scaled supersonic experimental airplane with ensuring design constraints

    OpenAIRE

    Matsushima, Kisa; Iwamiya, Toshiyuki; Zhang, Wanqiu; 松島 紀佐; 岩宮 敏幸; Zhang, Wanqiu

    2007-01-01

    The aerodynamic shape of a wing for NAL (National Aerospace Laboratory)'s first SST (SuperSonic Transport) model has been designed by a supersonic inverse design method. This method handles wing-fuselage configurations and provides the wing-section geometry at every spanwise station for Navier-Stokes flowfields. The design target is an NLF (Natural Laminar Flow) wing at a freestream Mach number of M∞ = 2.0. The original system of the inverse design method has to be modified so that several design constraints can ...

  18. Estimation of design wave heights based on extreme value statistics for Kakinada coast, Bay of Bengal

    Digital Repository Service at National Institute of Oceanography (India)

    Chandramohan, P.; Nayak, B.U.; Raju, N.S.N.

    Statistical analyses for the long-term distribution of significant wave heights were performed using the Lognormal, Weibull, Gumbel and Fréchet distributions for waves measured off Kakinada, Andhra Pradesh, India from June 1983 to May 1984. Fréchet...
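
    The extreme-value step in such an analysis can be sketched briefly: fit one of the candidate distributions to annual-maximum significant wave heights and read off the design height for a chosen return period. The data below are synthetic, not the Kakinada measurements.

```python
# Gumbel fit and return-period design wave height on synthetic annual maxima.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(4)
annual_max_hs = gumbel_r.rvs(loc=4.0, scale=0.6, size=30, random_state=rng)

loc, scale = gumbel_r.fit(annual_max_hs)
return_period = 50.0  # years
design_hs = gumbel_r.ppf(1 - 1 / return_period, loc, scale)
print(f"50-year design wave height: {design_hs:.2f} m")
```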

  19. Development and Validation of a Rubric for Diagnosing Students' Experimental Design Knowledge and Difficulties.

    Science.gov (United States)

    Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. © 2014 A. P. Dasgupta et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  20. Statistically based sustainable re-design of stormwater overflow control systems in urban catchments

    Science.gov (United States)

    Ganora, Daniele; Isacco, Silvia; Claps, Pierluigi

    2017-04-01

    Control and reduction of pollution from stormwater overflow is a major concern for municipalities managing the quality of receiving water bodies according to the Water Framework Directive 2000/60/EC. In this regard, assessment studies of the potential pollution load from sewer networks recognize the need for adaptation and upgrade of existing drainage systems, which can be achieved with traditional water works (detention tanks, increased wastewater treatment plant capacity, etc.) or nature-based solutions (constructed wetlands, restored floodplains, etc.), sometimes used in combination. Nature-based solutions have recently received considerable attention as they are able to enhance urban and degraded environments while being more resilient and adaptable to climatic and anthropic changes than most traditional engineering works. On the other hand, restoration of the urban environment using natural absorbing surfaces requires diffuse interventions, high costs and a considerable amount of time. In this work we investigate how simple, economically sustainable and quick solutions to the problem at hand can be achieved through changes in the management rules where pumping stations play a role in sewer systems. In particular, we provide a statistically based framework for calibrating the management rules with the aim of improving the quality of overflows from sewer systems. Typical pumping rules favor a massive delivery of stormwater volumes to the wastewater treatment plant, requiring large storage tanks in the sewer network and heavy pumping power, and reducing the efficiency of the treatment plant due to pollutant dilution. In this study we show that it is possible to optimize the pumping rule in order to reduce the volumes pumped to the plant (thus saving energy) while simultaneously keeping the pollutant concentration high. On the other hand, larger low-concentration overflow volumes are released outside the sewer network with respect to the standard

  1. Statistical design and analysis for plant cover studies with multiple sources of observation errors

    Science.gov (United States)

    Wright, Wilson; Irvine, Kathryn M.; Warren, Jeffrey M .; Barnett, Jenny K.

    2017-01-01

    Effective wildlife habitat management and conservation requires understanding the factors influencing distribution and abundance of plant species. Field studies, however, have documented observation errors in visually estimated plant cover including measurements which differ from the true value (measurement error) and not observing a species that is present within a plot (detection error). Unlike the rapid expansion of occupancy and N-mixture models for analysing wildlife surveys, development of statistical models accounting for observation error in plants has not progressed quickly. Our work informs development of a monitoring protocol for managed wetlands within the National Wildlife Refuge System.Zero-augmented beta (ZAB) regression is the most suitable method for analysing areal plant cover recorded as a continuous proportion but assumes no observation errors. We present a model extension that explicitly includes the observation process thereby accounting for both measurement and detection errors. Using simulations, we compare our approach to a ZAB regression that ignores observation errors (naïve model) and an “ad hoc” approach using a composite of multiple observations per plot within the naïve model. We explore how sample size and within-season revisit design affect the ability to detect a change in mean plant cover between 2 years using our model.Explicitly modelling the observation process within our framework produced unbiased estimates and nominal coverage of model parameters. The naïve and “ad hoc” approaches resulted in underestimation of occurrence and overestimation of mean cover. The degree of bias was primarily driven by imperfect detection and its relationship with cover within a plot. Conversely, measurement error had minimal impacts on inferences. We found >30 plots with at least three within-season revisits achieved reasonable posterior probabilities for assessing change in mean plant cover.For rapid adoption and application, code
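
    To make the model concrete, a minimal zero-augmented beta density (without the observation-error extension developed in the paper) can be written as a two-part mixture: cover is exactly zero with probability 1 - psi, and Beta-distributed on (0, 1) otherwise. The mean-precision parameterisation and the toy data below are assumptions for illustration.

```python
# Minimal zero-augmented beta (ZAB) log-likelihood, no observation process.
import numpy as np
from scipy.stats import beta

def zab_loglik(y, psi, mu, phi):
    """psi: P(cover > 0); mu, phi: Beta mean and precision on (0, 1)."""
    y = np.asarray(y, dtype=float)
    a, b = mu * phi, (1 - mu) * phi              # mean/precision -> shapes
    ll = np.where(y == 0.0,
                  np.log(1 - psi),
                  np.log(psi) + beta.logpdf(np.clip(y, 1e-12, 1 - 1e-12), a, b))
    return ll.sum()

sample = [0.0, 0.0, 0.12, 0.35, 0.60]            # plot-level cover (toy data)
print(zab_loglik(sample, psi=0.6, mu=0.4, phi=5.0))
```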

  2. Optimization of fast disintegration tablets using pullulan as diluent by central composite experimental design

    Directory of Open Access Journals (Sweden)

    Dipil Patel

    2012-01-01

    Full Text Available The objective of this work was to apply central composite experimental design to investigate the main and interaction effects of formulation parameters in optimizing a novel fast disintegration tablet formulation using pullulan as diluent. A face-centered central composite experimental design was employed to optimize the fast disintegration tablet formulation. The variables studied were the concentrations of diluent (pullulan, X1), superdisintegrant (sodium starch glycolate, X2), and direct compression aid (spray-dried lactose, X3). Tablets were characterized for weight variation, thickness, disintegration time (Y1), and hardness (Y2). Good correlation between the predicted values and the experimental data of the optimized formulation confirmed the suitability of the methodology for optimizing fast disintegrating tablets using pullulan as a diluent.

  3. Optimization of fast disintegration tablets using pullulan as diluent by central composite experimental design.

    Science.gov (United States)

    Patel, Dipil; Chauhan, Musharraf; Patel, Ravi; Patel, Jayvadan

    2012-03-01

    The objective of this work was to apply central composite experimental design to investigate the main and interaction effects of formulation parameters in optimizing a novel fast disintegration tablet formulation using pullulan as diluent. A face-centered central composite experimental design was employed to optimize the fast disintegration tablet formulation. The variables studied were the concentrations of diluent (pullulan, X(1)), superdisintegrant (sodium starch glycolate, X(2)), and direct compression aid (spray dried lactose, X(3)). Tablets were characterized for weight variation, thickness, disintegration time (Y(1)), and hardness (Y(2)). Good correlation between the predicted values and the experimental data of the optimized formulation confirmed the suitability of the methodology for optimizing fast disintegrating tablets using pullulan as a diluent.
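
    A face-centered central composite design of this kind is easy to construct in coded units. A minimal sketch for the three factors studied here (the mapping of columns to factors follows the abstract; the number of center points is an assumption):

```python
import itertools
import numpy as np

def face_centered_ccd(k, n_center=3):
    """Face-centred central composite design (axial distance alpha = 1), coded units."""
    factorial = np.array(list(itertools.product((-1.0, 1.0), repeat=k)))
    axial = np.vstack([sign * row for sign in (-1.0, 1.0) for row in np.eye(k)])
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

# Columns: X1 = pullulan, X2 = sodium starch glycolate, X3 = spray-dried lactose
design = face_centered_ccd(3)
print(design.shape)  # (8 factorial + 6 axial + 3 center, 3) = (17, 3) runs
```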

  4. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    design. It incorporates a model driven approach to the experimental design that minimises the number of experiments to be performed, while still generating accurate values of kinetic parameters. The approach has been illustrated with the transketolase mediated asymmetric synthesis of L......Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments....... These can be both time-consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost effective generation of reliable kinetic models useful for bioconversion process...

  5. Application of Iterative Robust Model-based Optimal Experimental Design for the Calibration of Biocatalytic Models

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Gernaey, Krist V.; Ringborg, Rolf Hoffmeyer

    2017-01-01

    The aim of model calibration is to estimate unique parameter values from available experimental data, here applied to a biocatalytic process. The traditional approach of first gathering data followed by performing a model calibration is inefficient, since the information gathered during...... experimentation is not actively used to optimise the experimental design. By applying an iterative robust model-based optimal experimental design, the limited amount of data collected is used to design additional informative experiments. The algorithm is used here to calibrate the initial reaction rate of an ω......-transaminase catalysed reaction in a more accurate way. The parameter confidence region estimated from the Fisher Information Matrix is compared with the likelihood confidence region, which is a more accurate, but also a computationally more expensive method. As a result, an important deviation between both approaches...
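
    The core computation behind such model-based optimal design is scoring a candidate experiment by its Fisher Information Matrix, e.g. with the D-optimality criterion. The sketch below uses a simple Michaelis-Menten initial-rate model as an illustrative stand-in for the biocatalytic kinetics; all parameter values are assumptions:

```python
import numpy as np
from itertools import combinations

# Illustrative stand-in kinetics: v = Vmax*S/(Km + S), Gaussian error sigma.
Vmax, Km, sigma = 10.0, 2.0, 0.3

def fim(substrate_levels):
    """Fisher Information Matrix for (Vmax, Km) at the given design points."""
    F = np.zeros((2, 2))
    for S in substrate_levels:
        g = np.array([S / (Km + S),                 # dv/dVmax
                      -Vmax * S / (Km + S) ** 2])   # dv/dKm
        F += np.outer(g, g) / sigma**2
    return F

# D-optimality: pick the pair of substrate levels maximizing det(FIM).
candidates = np.linspace(0.25, 20.0, 80)
best = max(combinations(candidates, 2), key=lambda pts: np.linalg.det(fim(pts)))
print("D-optimal substrate pair:", best)  # one low (Km-scale), one saturating
```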

  6. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
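
    The binomial reasoning behind such a sampling design is compact: if tracers occur at a proportion p among kernels, a sample of n kernels captures at least one tracer with probability 1 - (1 - p)^n. A small sketch with hypothetical numbers:

```python
from math import log

# Hypothetical tracer concentration: 1 tracer per 10,000 kernels.
p = 1.0 / 10_000

def p_at_least_one(n, p=p):
    return 1.0 - (1.0 - p) ** n      # binomial: 1 - P(zero tracers in sample)

for n in (10_000, 25_000, 46_100):
    print(f"sample of {n:6d} kernels -> P(>=1 tracer) = {p_at_least_one(n):.3f}")

# Sample size for 99% confidence of catching at least one tracer:
print("n for 99% confidence:", log(0.01) / log(1.0 - p))  # ~46,050 kernels
```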

  7. Experimental and Theoretical Progress of Linear Collider Final Focus Design and ATF2 Facility

    CERN Document Server

    Seryi, Andrei; Zimmermann, Frank; Kubo, Kiyoshi; Kuroda, Shigeru; Okugi, Toshiyuki; Tauchi, Toshiaki; Terunuma, Nobuhiro; Urakawa, Junji; White, Glen; Woodley, Mark; Angal-Kalinin, Deepa

    2014-01-01

    In this brief overview we will reflect on the process of the design of the linear collider (LC) final focus (FF) optics, and will also describe the theoretical and experimental efforts on design and practical realisation of a prototype of the LC FF optics implemented in the ATF2 facility at KEK, Japan, presently being commissioned and operated.

  8. Overview of design development of FCC-hh Experimental Interaction Regions

    CERN Document Server

    Abelleira, Jose; Cruz Alaniz, Emilia; Van Riesen-Haupt, Leon; Benedikt, Michael; Besana, Maria Ilaria; Buffat, Xavier; Burkhardt, Helmut; Cerutti, Francesco; Langner, Andy Sven; Martin, Roman; Riegler, Werner; Schulte, Daniel; Tomas Garcia, Rogelio; Appleby, Robert Barrie; Rafique, Haroon; Barranco Garcia, Javier; Pieloni, Tatiana; Boscolo, Manuela; Collamati, Francesco; Nevay, Laurence James; Hofer, Michael

    2017-01-01

    The experimental interaction region (EIR) is one of the key areas that define the performance of the Future Circular Collider. In this overview we describe the status and evolution of the design of the EIR of the FCC-hh, focusing on the design of the optics, energy deposition in EIR elements, beam-beam effects, and machine-detector interface issues.

  9. Experimental and Statistical Study on Machinability of the Composite Materials with Metal Matrix Al/B4C/Graphite

    Science.gov (United States)

    Nas, Engin; Gökkaya, Hasan

    2017-10-01

    In this study, four types of Al/B4C/Graphite metal matrix composites (MMCs) were produced by means of a hot-pressing technique with reinforcement elements of B4C at 8 wt pct and graphite (nickel coated) at 0, 3, 5, and 7 wt pct. Machinability tests of the MMC materials thus produced were conducted using four different cutting speeds (100, 140, 180, and 220 m/min), three different feed rates (0.1, 0.15, and 0.20 mm/rev), and a fixed cutting depth (0.5 mm), and the effects of the cutting parameters on the average surface roughness were examined. After the machinability tests, the height of the built-up edge (BUE) formed on the cutting tools in relation to the cutting speed and feed rate was measured. The test matrix was designed according to a full factorial design, and the most important factors affecting the average surface roughness and the formation of the BUE were analyzed by analysis of variance (ANOVA). As a result of the analysis, it was found that the lowest surface roughness value was obtained with the 7 wt pct graphite MMC material, while the highest was obtained with the material without graphite powder. Based on the statistical analysis results, it was observed that the most important factor affecting average surface roughness was the type of MMC material, the second most effective factor was the feed rate, and the least effective factor was the cutting speed. Furthermore, it was found that the most important factor affecting the formation of the BUE was the type of MMC material, the second most effective factor was the cutting speed, and the least effective factor was the feed rate.
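
    The full factorial design plus ANOVA workflow described above translates directly into a few lines of Python. The sketch below builds a synthetic 4 x 4 x 3 factorial data set (the effect sizes are invented so that material dominates, then feed rate, then cutting speed, mirroring the reported ranking) and runs a type II ANOVA with statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic 4 materials x 4 speeds x 3 feeds full factorial, one run per cell.
grid = [(m, v, f) for m in "ABCD"
                  for v in (100, 140, 180, 220)
                  for f in (0.10, 0.15, 0.20)]
df = pd.DataFrame(grid, columns=["material", "speed", "feed"])
df["Ra"] = (0.8
            + df["material"].map({"A": 0.6, "B": 0.4, "C": 0.2, "D": 0.0})
            + 4.0 * df["feed"]          # moderate feed effect
            + 0.0005 * df["speed"]      # weak speed effect
            + rng.normal(0.0, 0.05, len(df)))

model = smf.ols("Ra ~ C(material) + C(speed) + C(feed)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # material largest SS, then feed, then speed
```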

  10. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology; On-line Control, including Sampling Inspection and Statistical Process Control; Off-line Control, with Data Analysis and Experimental Design; and fields related to Reliability. Experts with international reputations present their newest contributions.

  11. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2001-01-01

    The book is a collection of papers presented at the 5th International Workshop on Intelligent Statistical Quality Control in Würzburg, Germany. Contributions deal with methodology and successful industrial applications. They can be grouped into four categories: Sampling Inspection, Statistical Process Control, Data Analysis and Process Capability Studies, and Experimental Design.

  12. Visualizing Experimental Designs for Balanced ANOVA Models using Lisp-Stat

    Directory of Open Access Journals (Sweden)

    Philip W. Iversen

    2004-12-01

    Full Text Available The structure, or Hasse, diagram described by Taylor and Hilton (1981, American Statistician) provides a visual display of the relationships between factors for balanced complete experimental designs. Using the Hasse diagram, rules exist for determining the appropriate linear model, ANOVA table, expected mean squares, and F-tests in the case of balanced designs. This procedure has been implemented in Lisp-Stat using a software representation of the experimental design. The user can interact with the Hasse diagram to add, change, or delete factors and see the effect on the proposed analysis. The system has potential uses in teaching and consulting.

  13. Wind refrigeration : design and results of an experimental facility; Refrigeracion eolica: Diseno y resultados de una instalacion experimental

    Energy Technology Data Exchange (ETDEWEB)

    Beltran, R. G.; Talero, A.

    2004-07-01

    This article describes the experimental setup used to obtain design parameters for wind-driven refrigeration equipment. The system compressor is directly coupled to the windmill and will provide refrigeration to a community located in La Guajira in northern Colombia. The testing on the experimental installation assessed the refrigeration capacity that could be provided by an open-type commercial compressor coupled to the windmill axis. Power and torque requirements were evaluated for different windmill rotational speeds. An assessment of the local conditions at the installation site (wind speed, frequency, and preferred direction) was made based on measurements by the National Meteorological Institute and independent data from other sources. (Author)

  14. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  15. An Empirical Study of Parameter Estimation for Stated Preference Experimental Design

    Directory of Open Access Journals (Sweden)

    Fei Yang

    2014-01-01

    Full Text Available Stated preference experimental design can affect the reliability of parameter estimation in discrete choice models. Some scholars have proposed new experimental designs, such as the D-efficient and Bayesian D-efficient designs, but insufficient empirical research has been conducted on their effectiveness, and there has been little comparative analysis of the new designs against traditional designs. In this paper, a new metro connecting Chengdu and its satellite cities is taken as the research subject to demonstrate the validity of the D-efficient and Bayesian D-efficient designs. Comparisons between these new designs and an orthogonal design were made in terms of model fit and the standard deviations of the parameter estimates; the best model was then used to analyze travel choice behavior. The results indicate that the Bayesian D-efficient design works better than the D-efficient design. Some of the variables, including waiting time and arrival time, significantly affect people's choice behavior. The D-efficient and Bayesian D-efficient designs constructed for the MNL model yield reliable results in the ML model, but the ML model cannot exploit the theoretical advantages of these two designs. Finally, the metro is projected to handle over 40% of passenger flow once it is operated in the future.

  16. Neurath, Arntz, and ISOTYPE: the legacy in art, design, and statistics

    NARCIS (Netherlands)

    Jansen, W.|info:eu-repo/dai/nl/072692626

    2009-01-01

    In the first decades of the twentieth century, Otto Neurath and Gerd Arntz invented the ‘Vienna Method of Pictorial Statistics’ (Wiener Bildstatistik). The method was renamed in the late 1930s as ISOTYPE ― ‘I(nternational) S(ystem) O(f) TY(pographic) P(icture) E(ducation)’ ― and was used in the

  17. Static Numbers to Dynamic Statistics: Designing a Policy-Friendly Social Policy Indicator Framework

    Science.gov (United States)

    Ahn, Sang-Hoon; Choi, Young Jun; Kim, Young-Mi

    2012-01-01

    In line with the economic crisis and rapid socio-demographic changes, the interest in "social" and "well-being" indicators has been revived. Social indicator movements of the 1960s resulted in the establishment of social indicator statistical frameworks; that legacy has remained intact in many national governments and…

  18. Statistical Reform: Evidence-Based Practice, Meta-Analyses, and Single Subject Designs

    Science.gov (United States)

    Jenson, William R.; Clark, Elaine; Kircher, John C.; Kristjansson, Sean D.

    2007-01-01

    Evidence-based practice approaches to interventions have come of age and promise to provide a new standard of excellence for school psychologists. This article describes several definitions of evidence-based practice and the problems associated with traditional statistical analyses that rely on rejection of the null hypothesis for the…

  19. There Once Was a 9-Block ...--A Middle-School Design for Probability and Statistics

    Science.gov (United States)

    Abrahamson, Dor; Janusz, Ruth M.; Wilensky, Uri

    2006-01-01

    ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a "bar…
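
    The combinatorial space the unit is built on is small enough to enumerate directly; a few lines of Python reproduce the 512-block count and the distribution of green squares that the class assembles:

```python
from math import comb

# A 9-block is a 3x3 grid whose squares are each green or blue: 2**9 designs.
total = 2 ** 9
print("total 9-blocks:", total)  # 512

# Distribution of the number of green squares across all 512 blocks:
for k in range(10):
    print(f"{k} green: {comb(9, k):3d} blocks ({comb(9, k) / total:6.1%})")
```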

  20. Must a process be in statistical control before conducting designed experiments?

    NARCIS (Netherlands)

    Bisgaard, S.

    2008-01-01

    Fisher demonstrated three quarters of a century ago that the three key concepts of randomization, blocking, and replication make it possible to conduct experiments on processes that are not necessarily in a state of statistical control. However, even today there persists confusion about whether

  1. Informal Statistics Help Desk

    Science.gov (United States)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  2. Cooperative Experimental System Development - cooperative techniques beyond initial design and analysis

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1995-01-01

    , however, not limited to this development context; it may be applied to in-house or contract development as well. In system development, particularly in cooperative and experimental system development, we argue that it is necessary to analytically separate the abstract concerns, e.g. analysis, design......This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification-oriented methods usually assume that system design can...... be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques (including the Scandinavian Cooperative Design (CD) approaches) seldom go beyond the early analysis/design...

  3. Experimental Modelling of the Breakdown Voltage of Air Using Design of Experiments

    Directory of Open Access Journals (Sweden)

    REZOUGA, M.

    2009-02-01

    Full Text Available Many experimental and numerical studies have been devoted to the electric discharge of air, and some mathematical models have been proposed for the critical breakdown voltage. As the latter depends on several parameters, it is difficult to find a formula, theoretical or experimental, which considers many factors. The aim of this paper is to model the critical breakdown voltage in a "Sphere-Sphere" electrode system by using the methodology of experimental designs. Several factors were considered, such as geometrical factors (inter-electrode gap, diameter of the electrodes) and climatic factors (temperature, humidity). Two face-centred central composite (CCF) experimental designs were carried out, one for the geometrical factors and one for the climatic factors. The results obtained made it possible to propose mathematical models and to study the interactions between the various factors.
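
    A CCF design supports fitting a full second-order response surface by ordinary least squares. As a sketch for two coded factors (the design matrix is a standard two-factor CCF; the breakdown-voltage readings are invented for illustration):

```python
import numpy as np

# Two-factor CCF design in coded units (x1 = gap, x2 = electrode diameter)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],   # factorial points
              [-1, 0], [1, 0], [0, -1], [0, 1],     # face-centred axial points
              [0, 0], [0, 0]], dtype=float)         # centre replicates
# Invented breakdown-voltage readings [kV], for illustration only:
y = np.array([18.2, 30.5, 17.1, 28.8, 20.0, 29.5, 24.0, 22.5, 23.9, 24.3])

def quad_terms(X):
    """Full second-order model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
print("second-order model coefficients:", beta.round(3))
```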

  4. Preference option randomized design (PORD) for comparative effectiveness research: Statistical power for testing comparative effect, preference effect, selection effect, intent-to-treat effect, and overall effect.

    Science.gov (United States)

    Heo, Moonseong; Meissner, Paul; Litwin, Alain H; Arnsten, Julia H; McKee, M Diane; Karasz, Alison; McKinley, Paula; Rehm, Colin D; Chambers, Earle C; Yeh, Ming-Chin; Wylie-Rosett, Judith

    2017-01-01

    Comparative effectiveness research trials in real-world settings may require participants to choose between preferred intervention options. A randomized clinical trial with parallel experimental and control arms is straightforward and regarded as a gold-standard design, but by design it expects participants to comply with a randomly assigned intervention regardless of their preference. Therefore, the randomized clinical trial may impose impractical limitations when planning comparative effectiveness research trials. To accommodate participants' preferences when they are expressed, and to maintain randomization, we propose an alternative design that allows participants' preference after randomization, which we call a "preference option randomized design (PORD)". In contrast to other preference designs, which ask whether or not participants consent to the assigned intervention after randomization, the crucial feature of the preference option randomized design is its unique informed consent process before randomization. Specifically, the preference option randomized design consent process informs participants that they can opt out and switch to the other intervention only if, after randomization, they actively express the desire to do so. Participants who do not independently express an explicit alternate preference, or who assent to the randomly assigned intervention, are considered to not have an alternate preference. In sum, the preference option randomized design intends to maximize retention, minimize the possibility of forced assignment for any participant, and maintain randomization by allowing participants with no or equal preference to represent random assignments. This design scheme enables the definition of five effects that are interconnected through common design parameters (comparative, preference, selection, intent-to-treat, and overall/as-treated) to collectively guide decision making between interventions. Statistical power functions for testing

  5. Statistical Modelling and Characterization of Experimental mm-Wave Indoor Channels for Future 5G Wireless Communication Networks

    Science.gov (United States)

    Al-Samman, A. M.; Rahman, T. A.; Azmi, M. H.; Hindia, M. N.; Khan, I.; Hanafi, E.

    2016-01-01

    This paper presents an experimental characterization of millimeter-wave (mm-wave) channels in the 6.5 GHz, 10.5 GHz, 15 GHz, 19 GHz, 28 GHz and 38 GHz frequency bands in an indoor corridor environment. More than 4,000 power delay profiles were measured across the bands using an omnidirectional transmitter antenna and a highly directional horn receiver antenna for both co- and cross-polarized antenna configurations. This paper develops a new path-loss model to account for the frequency attenuation with distance, which we term the frequency attenuation (FA) path-loss model, introducing a frequency-dependent attenuation factor. The large-scale path loss was characterized based on both new and well-known path-loss models. A general and less complex method is also proposed to estimate the cross-polarization discrimination (XPD) factor of the close-in reference distance with XPD (CIX) and ABG with XPD (ABGX) path-loss models, avoiding the computational complexity of the minimum mean square error (MMSE) approach. Moreover, small-scale parameters such as root mean square (RMS) delay spread, mean excess (MN-EX) delay, dispersion factors and maximum excess (MAX-EX) delay were used to characterize the multipath channel dispersion. Multiple statistical distributions for RMS delay spread were also investigated. The results show that our proposed models are simpler and more physically based than other well-known models. The path-loss exponents for all studied models are smaller than that of the free-space model by values in the range of 0.1 to 1.4 for all measured frequencies. The RMS delay spread values varied between 0.2 ns and 13.8 ns, and the dispersion factor values were less than 1 for all measured frequencies. The exponential and Weibull probability distribution models best fit the RMS delay spread empirical distribution for all of the measured frequencies in all scenarios. PMID:27654703
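
    Fitting a close-in (CI) reference-distance path-loss model of the kind compared in this paper reduces to a one-parameter least-squares problem. The sketch below generates synthetic 28 GHz corridor measurements with an assumed exponent of 1.8 (consistent with the sub-free-space exponents reported) and recovers it:

```python
import numpy as np

rng = np.random.default_rng(7)

# Close-in (CI) model: PL(d) = FSPL(d0) + 10*n*log10(d/d0) + X_sigma
c, d0, f = 3e8, 1.0, 28e9
fspl_d0 = 20 * np.log10(4 * np.pi * d0 * f / c)        # free-space loss at 1 m

# Synthetic measurements with an assumed path-loss exponent n = 1.8:
d = rng.uniform(2.0, 30.0, 200)                        # link distances [m]
A = 10 * np.log10(d / d0)
pl = fspl_d0 + 1.8 * A + rng.normal(0.0, 2.5, d.size)  # 2.5 dB shadowing

# One-parameter least squares for the path-loss exponent:
n_hat = np.sum((pl - fspl_d0) * A) / np.sum(A**2)
print(f"estimated exponent n = {n_hat:.2f}")  # below 2, as reported in the paper
```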

  6. Optimal experimental design for improving the estimation of growth parameters of Lactobacillus viridescens from data under non-isothermal conditions.

    Science.gov (United States)

    Longhi, Daniel Angelo; Martins, Wiaslan Figueiredo; da Silva, Nathália Buss; Carciofi, Bruno Augusto Mattar; de Aragão, Gláucia Maria Falcão; Laurindo, João Borges

    2017-01-02

    In predictive microbiology, model parameters have been estimated using the sequential two-step modeling (TSM) approach, in which primary models are fitted to the microbial growth data, and secondary models are then fitted to the primary model parameters to represent their dependence on the environmental variables (e.g., temperature). The Optimal Experimental Design (OED) approach allows a reduction of the experimental workload and costs, and an improvement of model identifiability, because primary and secondary models are fitted simultaneously from non-isothermal data. Lactobacillus viridescens was selected for this study because it is a lactic acid bacterium of great interest for meat product preservation. The objectives of this study were to estimate the growth parameters of L. viridescens in culture medium with the TSM and OED approaches and to evaluate both the number of experimental data points and the time needed for each approach, as well as the confidence intervals of the model parameters. Experimental data for estimating the model parameters with the TSM approach were obtained at six temperatures (total experimental time of 3540 h and 196 experimental data points of microbial growth). Data for the OED approach were obtained from four optimal non-isothermal profiles (total experimental time of 588 h and 60 experimental data points of microbial growth), two profiles with increasing temperatures (IT) and two with decreasing temperatures (DT). The Baranyi and Roberts primary model and the square root secondary model were used to describe the microbial growth, in which the parameters b and Tmin (±95% confidence interval) were estimated from the experimental data. The parameters obtained from the TSM approach were b=0.0290 (±0.0020) [1/(h(0.5)°C)] and Tmin=-1.33 (±1.26) [°C], with R(2)=0.986 and RMSE=0.581, and the parameters obtained with the OED approach were b=0.0316 (±0.0013) [1/(h(0.5)°C)] and Tmin=-0.24 (±0.55) [°C], with R(2)=0.990 and RMSE=0.436. The parameters obtained from OED approach
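
    The square root secondary model used here, sqrt(mu_max) = b(T - Tmin), is linear in T after the square-root transform, so its parameters can be recovered by ordinary least squares. A minimal sketch, seeding synthetic data with the paper's OED estimates plus small invented noise (the temperature grid is an assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

# Square-root (Ratkowsky-type) secondary model: sqrt(mu_max) = b * (T - Tmin)
b_true, Tmin_true = 0.0316, -0.24                 # the paper's OED estimates
T = np.array([4.0, 8.0, 12.0, 16.0, 20.0, 25.0])  # assumed temperatures [deg C]
sqrt_mu = b_true * (T - Tmin_true) + rng.normal(0.0, 0.005, T.size)

# Linear least squares: sqrt(mu) = b*T - b*Tmin (slope = b, intercept = -b*Tmin)
slope, intercept = np.linalg.lstsq(np.column_stack([T, np.ones_like(T)]),
                                   sqrt_mu, rcond=None)[0]
b_hat, Tmin_hat = slope, -intercept / slope
print(f"b = {b_hat:.4f} 1/(h^0.5 degC), Tmin = {Tmin_hat:.2f} degC")
```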

  7. Designing of adaptive computer aided learning system of tasks for probabilistic statistical branch of mathematics

    Directory of Open Access Journals (Sweden)

    С Н Дворяткина

    2013-12-01

    Full Text Available This article focuses on the development of a model of an adaptive computer-aided learning system for tasks in the probabilistic and statistical branches of mathematics, based on ICT, which takes into account the shortcomings of modern educational systems: namely, that they are highly specialized, built on a rigid predefined structure, closed, static, focused on a target audience, and do not take into account the dynamic characteristics of the individual student.

  8. A case study on the design and development of minigames for research methods and statistics

    OpenAIRE

    P. Van Rosmalen; E.A. Boyle; J. Van der Baaren; A.I. Kärki; Ángel del Blanco Aguado

    2014-01-01

    Research methodology involves logical reasoning and critical thinking skills which are core competences in developing a more sophisticated understanding of the world. Acquiring expertise in research methods and statistics is not easy and poses a significant challenge for many students. The subject material is challenging because it is highly abstract and complex and requires the coordination of different but inter-related knowledge and skills that are all necessary to develop a coherent and u...

  9. Blended learning pedagogy designed for communication module among undergraduate nursing students: A quasi-experimental study.

    Science.gov (United States)

    Shorey, Shefaly; Kowitlawakul, Yanika; Devi, M Kamala; Chen, Hui-Chen; Soong, Swee Kit Alan; Ang, Emily

    2018-02-01

    Effective communication is important for nurse and patient outcomes. Nursing students often feel unprepared to communicate effectively with patients and other healthcare workers within the clinical environment. Blended learning pedagogy-based communication skills training can provide an alternative to traditional methods of teaching to enhance students' satisfaction and self-efficacy levels in communicating with others. To examine the effectiveness of blended learning pedagogy in a redesigned communication module among nursing undergraduates in enhancing their satisfaction levels and attitudes towards learning communication module as well as self-efficacy in communication. A single group pre-test and post-test quasi-experimental design was adopted. Data were collected from August 2016 to November 2016 from 124 nursing undergraduates from a leading nursing school. Blended learning pedagogy was adopted to redesign a communication module that offered a wide array of learning opportunities via face-to-face classroom and online sessions. Validated and reliable instruments were used to measure satisfaction levels with blended learning pedagogy, attitudes towards learning communication, and communication self-efficacy. Descriptive and inferential statistics were used to analyze the data. Participants had enhanced satisfaction levels with blended learning pedagogy, better attitudes in learning communication skills, and improved communication self-efficacies at posttest (week 13 of the semester) when compared with their pre-test scores (week one of the semester). Participants scored higher in the Blended Learning Satisfaction Scale, the Communication Skills Attitude Scale, and the communication skills subscale of the Nursing Students Self-Efficacy Scale. Blended learning pedagogy can be effectively used in facilitating communication modules and enhancing student outcomes among nursing undergraduates. The long-term effectiveness of using blended learning pedagogy in

  10. Experimental measurement and numerical analysis on resonant characteristics of piezoelectric disks with partial electrode designs.

    Science.gov (United States)

    Lin, Yu-Chih; Ma, Chien-Ching

    2004-08-01

    Three experimental techniques are used in this study to assess the influence of the electrode arrangement on the resonant characteristics of piezoceramic disks. These methods, including amplitude-fluctuation electronic speckle pattern interferometry (AF-ESPI), laser Doppler vibrometer-dynamic signal analyzer (LDV-DSA), and impedance analysis, are based on the measurement of full-field displacement, pointwise displacement, and electric impedance, respectively. In this study, one full electrode design and three nonsymmetrical partial electrode designs of piezoelectric disks are investigated. Because the clear fringe patterns measured by the AF-ESPI method appear only at resonant frequencies, both the resonant frequencies and the corresponding vibration mode shapes are successfully obtained at the same time for out-of-plane and in-plane motions. The second experimental method is impedance analysis, which is used to measure the resonant and antiresonant frequencies. In addition to these experimental methods, LDV-DSA is used to determine the resonant frequencies of the vibration modes with out-of-plane motion. From the experimental results, the dependence of the vibration frequencies and mode shapes on the electrode design is addressed. Numerical computations based on the finite element method are presented, and the results are compared with the experimental measurements. The effect of different electrode designs is more significant for the in-plane modes than for the out-of-plane modes.

  11. Development of the Neuron Assessment for Measuring Biology Students’ Use of Experimental Design Concepts and Representations

    Science.gov (United States)

    Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy J.

    2016-01-01

    Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students’ competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new “experimentation assessments,” 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. PMID:27146159

  12. Design of Experimental Data Publishing Software for Neutral Beam Injector on EAST

    Science.gov (United States)

    Zhang, Rui; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Zhang, Xiaodan; Wu, Deyun

    2015-02-01

    Neutral Beam Injection (NBI) is one of the most effective means of plasma heating. Experimental Data Publishing Software (EDPS) was developed to publish experimental data so that the NBI system can be monitored remotely. In this paper, the architecture and implementation of EDPS, including the design of the communication module and the web page display module, are presented. EDPS is developed based on the Browser/Server (B/S) model and works under the Linux operating system. Using the data source and communication mechanism of the NBI Control System (NBICS), EDPS publishes experimental data on the Internet.

  13. A statistical evaluation of the design and precision of the shrimp trawl survey off West Greenland

    DEFF Research Database (Denmark)

    Folmer, Ole; Pennington, M.

    2000-01-01

    Stocks of Pandalus borealis off West Greenland have been assessed using a research trawl survey since 1988. The survey has used a design of randomly placed stations, stratified (on depth where data were available, using small blocks elsewhere), with sampling effort proportional to stratum area. In some...... years, a two-stage adaptive sampling scheme was used to place more stations in strata with large first-stage variation in catches. The design of the survey was reviewed in 1998. The modifications in survey design suggested were to shorten tow duration, to pool strata so that effort could be allocated more...... efficiently, to put a higher proportion of stations in high-density areas, and to abandon two-stage sampling. All these changes were implemented for the 1998 survey, except that tow duration was reduced to 30 min at 25% of the stations. To analyze the efficiency of the present survey design, various...

  14. Experimental Investigations of Decentralised Control Design for The Stabilisation of Rotor-Gas Bearings

    DEFF Research Database (Denmark)

    Theisen, Lukas Roy Svane; Galeazzi, Roberto; Niemann, Hans Henrik

    2015-01-01

    directions. Hardening and softening P-lead controllers are designed based on the models experimentally identified, and salient features of both controllers are discussed. Both controllers are implemented and validated on the physical test rig. Experimental results confirm the validity of the proposed......-Box identification for the design of stabilising controllers, capable of enabling the active lubrication of the journal. The root locus analysis shows that two different control solutions are feasible for the damping of the first two eigenfrequencies of the rotor-gas bearing in the horizontal and vertical...

  15. Experimental designs for evaluation of genetic variability and selection of ancient grapevine varieties: a simulation study.

    Science.gov (United States)

    Gonçalves, E; St Aubyn, A; Martins, A

    2010-06-01

    Classical methodologies for grapevine selection used in the vine-growing world are generally based on comparisons among a small number of clones. This does not take advantage of the entire genetic variability within ancient varieties, and therefore limits selection challenges. Using the general principles of plant breeding and of quantitative genetics, we propose new breeding strategies, focussed on the conservation and quantification of genetic variability by performing a cycle of mass genotypic selection prior to clonal selection. To exploit a sufficiently large amount of genetic variability, initial selection trials must generally be very large. The use of experimental designs adequate for those field trials has been intensively recommended for numerous species. However, their use in initial trials of grapevines has not been studied. With the aim of identifying the most suitable experimental designs for the quantification of genetic variability and selection of ancient varieties, a study was carried out to assess, through simulation, the comparative efficiency of various experimental designs (randomized complete block design, alpha design, and row-column (RC) design). The results indicated greater efficiency for the alpha and RC designs, enabling more precise estimates of genotypic variance, greater precision in the prediction of genetic gain, and consequently greater efficiency in genotypic mass selection.

  16. Conceptual design of superconducting magnet systems for the Argonne Tokamak Experimental Power Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Wang, S.T.; Turner, L.R.; Mills, F.E.; DeMichele, D.W.; Smelser, P.; Kim, S.H.

    1976-01-01

    As an integral effort in the Argonne Tokamak Experimental Power Reactor Conceptual Design, the conceptual design of a 10-tesla, pure-tension superconducting toroidal-field (TF) coil system has been developed in sufficient detail to define a realistic design for the TF coil system that could be built based upon the current state of technology with minimum technological extrapolations. A conceptual design study of the superconducting ohmic-heating (OH) coils and the superconducting equilibrium-field (EF) coils was also completed. These conceptual designs are developed in sufficient detail, with clear information on high-current ac conductor design, cooling, venting provisions, coil structural support, and zero-loss poloidal coil cryostat design. Also investigated is the EF penetration into the blanket and shield.

  17. Economic Statistical Design of integrated X-bar-S control chart with Preventive Maintenance and general failure distribution.

    Science.gov (United States)

    Caballero Morales, Santiago Omar

    2013-01-01

    The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices for achieving high product quality, a low frequency of failures, and cost reduction in a production process. However, some aspects of their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully capture the variability of the production process. Second, many studies on the design of control charts consider only the economic aspect, while statistical restrictions must also be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, and reductions in the sampling frequency of units for testing under SPC.
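
    For reference, the X-bar-S chart limits that such a design optimizes are computed from subgroup means and standard deviations via the c4 unbiasing constant; a minimal sketch of the standard three-sigma limits (without the economic optimization):

```python
import numpy as np
from scipy.special import gammaln

def c4(n):
    """Unbiasing constant for the sample standard deviation."""
    return np.sqrt(2.0 / (n - 1)) * np.exp(gammaln(n / 2) - gammaln((n - 1) / 2))

def xbar_s_limits(subgroups):
    """Three-sigma X-bar and S chart limits from an (m, n) array of subgroups."""
    m, n = subgroups.shape
    xbarbar = subgroups.mean(axis=1).mean()          # grand mean
    sbar = subgroups.std(axis=1, ddof=1).mean()      # mean subgroup std dev
    a3 = 3.0 / (c4(n) * np.sqrt(n))
    b3 = max(0.0, 1.0 - 3.0 * np.sqrt(1.0 - c4(n) ** 2) / c4(n))
    b4 = 1.0 + 3.0 * np.sqrt(1.0 - c4(n) ** 2) / c4(n)
    return {"xbar": (xbarbar - a3 * sbar, xbarbar + a3 * sbar),
            "s": (b3 * sbar, b4 * sbar)}

rng = np.random.default_rng(3)
print(xbar_s_limits(rng.normal(10.0, 0.2, size=(25, 5))))  # 25 subgroups of 5
```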

  18. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing.

    Science.gov (United States)

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-12-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This wealth of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated the psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants of participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or to influence only specific aspects of participation.

  19. Rational design of vaccine targets and strategies for HIV: a crossroad of statistical physics, biology, and medicine

    Science.gov (United States)

    Chakraborty, Arup K.; Barton, John P.

    2017-03-01

    Vaccination has saved more lives than any other medical procedure. Pathogens have now evolved that have not succumbed to vaccination using the empirical paradigms pioneered by Pasteur and Jenner. Vaccine design strategies that are based on a mechanistic understanding of the pertinent immunology and virology are required to confront and eliminate these scourges. In this perspective, we describe just a few examples of work aimed at achieving this goal by bringing together approaches from statistical physics with biology and clinical research.

  20. Optimization of the Gas Turbine-Modular Helium Reactor using statistical methods to maximize performance without compromising system design margins

    Energy Technology Data Exchange (ETDEWEB)

    Lommers, L.J.; Parme, L.L.; Shenoy, A.S. [General Atomics, San Diego, CA (United States)

    1995-12-31

    This paper describes a statistical approach for determining the impact of system performance and design uncertainties on power plant performance. The objectives of this design approach are to ensure that adequate margin is provided, that excess margin is minimized, and that full advantage can be taken of unconsumed margin. It is applicable to any thermal system in which these factors are important. The method is demonstrated using the Gas Turbine-Modular Helium Reactor as an example. The quantitative approach described allows the characterization of plant performance and the specification of the system design requirements necessary to achieve the desired performance with high confidence. Performance variations due to design evolution, in-service degradation, and basic performance uncertainties are considered. The impact of all performance variabilities is combined using Monte Carlo analysis to predict the range of expected operation.
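
    The Monte Carlo step described above amounts to sampling each source of performance variability and propagating the combination. A minimal sketch in Python (the efficiency figure and uncertainty distributions are invented for illustration, not GT-MHR data):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# All figures below are illustrative assumptions, not GT-MHR data.
eta_nominal = 0.47                            # nominal plant efficiency
d_basic = rng.normal(0.0, 0.004, N)           # basic performance uncertainty
d_evolution = rng.uniform(-0.006, 0.002, N)   # design-evolution allowance
d_degradation = -rng.exponential(0.003, N)    # in-service degradation (one-sided)

eta = eta_nominal + d_basic + d_evolution + d_degradation
lo, hi = np.percentile(eta, [2.5, 97.5])
print(f"95% range of expected operation: {lo:.3f} .. {hi:.3f}")
print(f"P(eta >= 0.46) = {(eta >= 0.46).mean():.2f}")  # margin confidence
```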

  1. Optimization of the gas turbine-modular helium reactor using statistical methods to maximize performance without compromising system design margins

    Energy Technology Data Exchange (ETDEWEB)

    Lommers, L.J.; Parme, L.L.; Shenoy, A.S.

    1995-07-01

    This paper describes a statistical approach for determining the impact of system performance and design uncertainties on power plant performance. The objectives of this design approach are to ensure that adequate margin is provided, that excess margin is minimized, and that full advantage can be taken of unconsumed margin. It is applicable to any thermal system in which these factors are important. The method is demonstrated using the Gas Turbine Modular Helium Reactor as an example. The quantitative approach described allows the characterization of plant performance and the specification of the system design requirements necessary to achieve the desired performance with high confidence. Performance variations due to design evolution, in-service degradation, and basic performance uncertainties are considered. The impact of all performance variabilities is combined using Monte Carlo analysis to predict the range of expected operation.

  2. Statistical analysis in the design of nuclear fuel cells; Analisis estadistico en el diseno de celdas de combustible nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Castillo M, J. A.; Ortiz S, J. J.; Montes T, J. L.; Perusquia del Cueto, R., E-mail: alejandro.castillo@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2012-10-15

    This work presents the preliminary results of a statistical analysis carried out for the design of nuclear fuel cells. The analysis consists of verifying the behavior of a cell in relation to the frequency of the pins used in its design. In this preliminary study, the behavior of the infinite multiplication factor and the local power peaking factor was analyzed. The analysis was carried out using a previously established group of enriched-uranium pins, varying the frequency with which the pins are used in the design. The CASMO-IV code was used to carry out the study. The designs obtained are for the different axial zones of a fuel assembly. An equilibrium cycle of unit 1 of the Laguna Verde nuclear power plant was used as reference. The results of the present work were obtained with systems already available, in which the heuristic techniques of ant colonies, neural networks, and a hybrid of scatter search and path relinking had already been implemented. The results show that it is possible to design nuclear fuel cells with good performance if a statistical behavior of the frequency of the pins used is taken into account. (Author)

  3. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    Science.gov (United States)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters from basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model with regard to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data of completed NASA missions.
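
    A common top-down alternative to parametric costing is a power-law cost estimating relationship fitted in log-log space. A minimal sketch on synthetic data (the masses, costs, and coefficients are invented, not COMPASS values):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic archive: cost ~ a * (dry mass)^b with lognormal scatter.
mass = rng.uniform(200.0, 3000.0, 40)                   # dry mass [kg], invented
cost = 0.8 * mass**0.7 * rng.lognormal(0.0, 0.15, 40)   # cost [$M], invented

# Fit the power law in log-log space:
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
print(f"cost ~ {np.exp(log_a):.2f} * mass^{b:.2f}")     # recovers a~0.8, b~0.7
```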

  4. A Quasi-Experimental Study on Using Short Stories: Statistical and Inferential Analyses on the Non-English Major University Students' Speaking and Writing Achievements

    Science.gov (United States)

    Iman, Jaya Nur

    2017-01-01

    This research was conducted to find out whether or not using short stories significantly improves speaking and writing achievement. A quasi-experimental study with a non-equivalent pretest-posttest control group design, or comparison group design, was used in this research. The population of this research was all first-semester undergraduate…

  5. Quantification of pore size distribution using diffusion NMR: experimental design and physical insights.

    Science.gov (United States)

    Katz, Yaniv; Nevo, Uri

    2014-04-28

    Pulsed field gradient (PFG) diffusion NMR experiments are sensitive to restricted diffusion within porous media and can thus reveal essential microstructural information about the confining geometry. Optimal design methods for inverse problems select preferred experimental settings to improve the quality of parameter estimation. However, in pore size distribution (PSD) estimation using NMR methods, as in other ill-posed problems, optimal design strategies and criteria are scarce. We formulate here a new optimization framework for ill-posed problems. This framework is suitable for optimizing PFG experiments for probing geometries that are solvable by the Multiple Correlation Function approach. The framework is based on a heuristic methodology designed to select experimental sets which balance lowering the inherent ill-posedness against increasing the NMR signal intensity. This method also selects favorable discrete pore sizes used for PSD estimation. Numerical simulations performed demonstrate that using this framework greatly improves the sensitivity of PFG experimental sets to the pores' sizes. The optimization also sheds light on significant features of the preferred experimental sets. Increasing the gradient strength and varying multiple experimental parameters is found to be preferable for reducing the ill-posedness. We further evaluate the amount of pore size information that can be obtained by wisely selecting the duration of the diffusion and mixing times. Finally, we discuss the ramifications of using single PFG or double PFG sequences for PSD estimation. In conclusion, the above optimization method can serve as a useful tool for experimenters interested in quantifying the PSDs of different specimens. Moreover, the applicability of the suggested optimization framework extends far beyond the field of PSD estimation in diffusion NMR, and reaches the design of sampling schemes for other ill-posed problems.
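
    The ill-posedness at issue is the usual one for smooth integral kernels: many different PSDs reproduce the measured signal almost equally well, so the inversion must be stabilized. A schematic Python sketch using Tikhonov regularization (the kernel below is an invented smooth stand-in, not the Multiple Correlation Function kernel):

```python
import numpy as np

rng = np.random.default_rng(0)

# Schematic ill-posed inverse d = K w: recover a pore-size weight vector w
# from noisy signals. K is an invented smooth kernel for illustration.
sizes = np.linspace(1.0, 10.0, 40)            # candidate pore sizes [um]
q = np.linspace(0.05, 1.0, 25)                # schematic gradient settings
K = np.exp(-np.outer(q**2, sizes**2) / 40.0)  # smooth -> nearly collinear columns

w_true = np.exp(-0.5 * ((sizes - 4.0) / 0.8) ** 2)  # narrow PSD around 4 um
d = K @ w_true + rng.normal(0.0, 1e-3, q.size)

# Tikhonov regularization via an augmented least-squares system:
lam = 1e-2
A = np.vstack([K, np.sqrt(lam) * np.eye(sizes.size)])
b = np.concatenate([d, np.zeros(sizes.size)])
w_hat = np.linalg.lstsq(A, b, rcond=None)[0]
print("recovered PSD peak [um]:", sizes[np.argmax(w_hat)])  # should sit near 4
```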

  6. Predicting the ballistic strength of aramid fiber composites by implementing full factorial experimental design

    OpenAIRE

    Dimeski, Dimko; Srebrenkoska, Vineta

    2014-01-01

    The purpose of the study is to predict the ballistic strength of hard aramid fiber/phenolic ballistic composites by implementing a full factorial experimental design. When designing ballistic composites, two factors are most important: the ballistic strength and the weight of the protection. The ultimate target is to achieve the required ballistic strength with the lowest possible weight of protection. The hard ballistic aramid/phenolic composites were made by open mold...

  7. Experimental Analyses for The Mechanical Behavior of Pressed All-Ceramic Molar Crowns with Anatomical Design

    OpenAIRE

    Porojan Liliana; Porojan Sorin; Rusu Lucian; Boloş Adrian; Savencu Cristina

    2017-01-01

    Ceramic restorations show considerable variation in strength and structural reliability with regard to the type of material and design characteristics. The fracture of ceramics occurs with little or no plastic deformation, with cracks propagating in an unstable manner under applied tensile stresses. The aim of the study was to carry out experimental analyses of pressed monolithic ceramic crowns with anatomical design used in the posterior areas in order to understand their mechanical behavior befo...

  8. Experimental Design and Validation of an Accelerated Random Vibration Fatigue Testing Methodology

    OpenAIRE

    Yu Jiang (Center for Statistical and Theoretical Condensed Matter Physics, Zhejiang Normal University, Jinhua City, Zhejiang Province 321004, China); Gun Jin Yun; Li Zhao; Junyong Tao

    2015-01-01

    A novel accelerated random vibration fatigue test methodology and strategy are proposed, which can generate an experimental test plan that significantly reduces the test time and the sample size. Based on theoretical analysis and a fatigue damage model, several groups of random vibration fatigue tests were designed and conducted with the aim of investigating the effects of both Gaussian and non-Gaussian random excitation on vibration fatigue. First, stress responses at a weak point of a ...

  9. Fermentation-Assisted Extraction of Isothiocyanates from Brassica Vegetable Using Box-Behnken Experimental Design

    Directory of Open Access Journals (Sweden)

    Amit K. Jaiswal

    2016-11-01

    Full Text Available Recent studies showed that Brassica vegetables are rich in numerous health-promoting compounds such as carotenoids, polyphenols, flavonoids, and glucosinolates (GLS), as well as isothiocyanates (ITCs), and are involved in health promotion upon consumption. ITCs are breakdown products of GLS, and typically used in the food industry as a food preservative and colouring agent. They are also used in the pharmaceutical industry due to their several pharmacological properties such as antibacterial, antifungal, antiprotozoal, anti-inflammatory, and chemoprotective effects, etc. Due to their widespread application in food and pharmaceuticals, the present study was designed to extract ITCs from York cabbage. In order to optimise the fermentation-assisted extraction process for maximum yield of ITCs from York cabbage, a Box-Behnken design (BBD) combined with response surface methodology (RSM) was applied. Additionally, the GLS content of York cabbage was quantified and the effect of lactic acid bacteria (LAB) on GLS was evaluated. A range of GLS such as glucoraphanin, glucoiberin, glucobrassicin, sinigrin, gluconapin, neoglucobrassicin and 4-methoxyglucobrassicin were identified and quantified in fresh York cabbage. The experimental data obtained were fitted to a second-order polynomial equation using multiple regression analysis, and also examined by appropriate statistical methods. LAB facilitated the degradation of GLS, and the consequent formation of breakdown products such as ITCs. Results showed that the solid-to-liquid (S/L) ratio, fermentation time and agitation rate had a significant effect on the yield of ITCs (2.2 times increment). The optimum fermentation conditions to achieve a higher ITCs extraction yield were: S/L ratio of 0.25 w/v, fermentation time of 36 h, and agitation rate of 200 rpm. The obtained yields of ITCs (45.62 ± 2.13 μM sulforaphane equivalent (SFE)/mL) were comparable to the optimised conditions, indicating the accuracy of the model

  10. Fermentation-Assisted Extraction of Isothiocyanates from Brassica Vegetable Using Box-Behnken Experimental Design.

    Science.gov (United States)

    Jaiswal, Amit K; Abu-Ghannam, Nissreen

    2016-11-04

    Recent studies showed that Brassica vegetables are rich in numerous health-promoting compounds such as carotenoids, polyphenols, flavonoids, and glucosinolates (GLS), as well as isothiocyanates (ITCs) and are involved in health promotion upon consumption. ITCs are breakdown products of GLS, and typically used in the food industry as a food preservative and colouring agent. They are also used in the pharmaceutical industry due to their several pharmacological properties such as antibacterial, antifungal, antiprotozoal, anti-inflammatory, and chemoprotective effects, etc. Due to their widespread application in food and pharmaceuticals, the present study was designed to extract ITCs from York cabbage. In order to optimise the fermentation-assisted extraction process for maximum yield of ITCs from York cabbage, Box-Behnken design (BBD) combined with response surface methodology (RSM) was applied. Additionally, the GLS content of York cabbage was quantified and the effect of lactic acid bacteria (LAB) on GLS was evaluated. A range of GLS such as glucoraphanin, glucoiberin, glucobrassicin, sinigrin, gluconapin, neoglucobrassicin and 4-methoxyglucobrassicin were identified and quantified in fresh York cabbage. The experimental data obtained were fitted to a second-order polynomial equation using multiple regression analysis, and also examined by appropriate statistical methods. LAB facilitated the degradation of GLS, and the consequent formation of breakdown products such as ITCs. Results showed that the solid-to-liquid (S/L) ratio, fermentation time and agitation rate had a significant effect on the yield of ITCs (2.2 times increment). The optimum fermentation conditions to achieve a higher ITCs extraction yield were: S/L ratio of 0.25 w/v, fermentation time of 36 h, and agitation rate of 200 rpm. The obtained yields of ITCs (45.62 ± 2.13 μM sulforaphane equivalent (SFE)/mL) were comparable to the optimised conditions, indicating the accuracy of the model for the
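
    A three-factor Box-Behnken design places runs at the midpoints of the cube's edges plus center points, avoiding the extreme factor combinations. A minimal sketch of the 15-run coded layout (the column-to-factor mapping follows the abstract; the number of center points is an assumption):

```python
import itertools
import numpy as np

def box_behnken(k=3, n_center=3):
    """Box-Behnken design in coded units: +/-1 on each factor pair, rest at 0."""
    runs = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product((-1.0, 1.0), repeat=2):
            row = np.zeros(k)
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend(np.zeros(k) for _ in range(n_center))
    return np.array(runs)

# Columns: X1 = S/L ratio, X2 = fermentation time, X3 = agitation rate
print(box_behnken(3).shape)  # (12 edge runs + 3 center points, 3) = (15, 3)
```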

  11. Statistical controversies in clinical research: early-phase adaptive design for combination immunotherapies.

    Science.gov (United States)

    Wages, N A; Slingluff, C L; Petroni, G R

    2017-04-01

    In recent years, investigators have asserted that the 3 + 3 design lacks flexibility, making its use in modern early-phase trial settings, such as combinations and/or biological agents, inefficient. More innovative approaches are required to address contemporary research questions, such as those posed in trials involving immunotherapies. We describe the implementation of an adaptive design for identifying an optimal treatment regimen, defined by low toxicity and high immune response, in an early-phase trial of a melanoma helper peptide vaccine plus novel adjuvant combinations. Operating characteristics demonstrate the ability of the method to effectively recommend optimal regimens in a high percentage of trials with reasonable sample sizes. The proposed design is a practical, early-phase, adaptive method for use with combined immunotherapy regimens. This design can be applied more broadly to early-phase combination studies, as it was used in an ongoing study of two small molecule inhibitors in relapsed/refractory mantle cell lymphoma.

  12. An introduction to design-based research with an example from statistics education

    NARCIS (Netherlands)

    Bakker, Arthur; van Eerde, Henriette

    2015-01-01

    This chapter arose from the need to introduce researchers, including Master and PhD students, to design-based research (DBR). In Part 1 we address key features of DBR and differences from other research approaches. We also describe the meaning of validity and reliability in DBR and discuss how they

  13. Comparison of statistical methods, type of articles and study design used in selected Pakistani medical journals in 1998 and 2007.

    Science.gov (United States)

    Rao, Masood Hussain; Khan, Nazeer

    2010-09-01

    To compare the statistical methods, types of articles and study designs used in the 1998 and 2007 articles of leading indexed and non-indexed medical journals of Pakistan. Six leading medical journals of Pakistan were selected for this study: (1) JCPSP, (2) JPMA, (3) JAMC, (4) PJMS, (5) PJMR and (6) PAFMJ. A total of 1057 articles were reviewed to achieve this objective: 366 from 1998 and 691 from 2007. Original articles contributed the largest share (65.6%), followed by case reports (24.8%); the contribution of case reports rose from 20.5% in 1998 to 27.1% in 2007. There was no statistically significant difference between indexed and non-indexed journals in the types of statistical methods used in either 1998 or 2007. In total, 749 articles were categorised as original articles or short communications; among them, 51% mentioned a study design, and 67.3% of those designs were correct for the respective methodology. In 1998, 202 (74%) articles used no statistics or only descriptive statistics, while in 2007, 239 (50.2%) articles did the same. A reader familiar with the t-test and contingency tables could have understood 97.4% of the scientific articles in 1998; this percentage dropped to 83.0% in 2007. The quality of reported methods and the use of biostatistics in the six leading Pakistani medical journals improved from 1998 to 2007 but still lags behind that of Western medical journals.

  14. How allele frequency and study design affect association test statistics with misrepresentation errors.

    Science.gov (United States)

    Escott-Price, Valentina; Ghodsi, Mansoureh; Schmidt, Karl Michael

    2014-04-01

    We evaluate the effect of genotyping errors on the type-I error of a general association test based on genotypes, showing that, in the presence of errors in the case and control samples, the test statistic asymptotically follows a scaled non-central $\chi^2$ distribution. We give explicit formulae for the scaling factor and non-centrality parameter for the symmetric allele-based genotyping error model and for additive and recessive disease models. They show how genotyping errors can lead to a significantly higher false-positive rate, growing with sample size, compared with the nominal significance levels. The strength of this effect depends very strongly on the population distribution of the genotype, with a pronounced effect in the case of rare alleles, and a great robustness against error in the case of large minor allele frequency. We also show how these results can be used to correct $p$-values.
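
    The $p$-value correction mentioned in the last sentence follows directly from the distributional result: if the test statistic $T$ is asymptotically $c$ times a non-central $\chi^2$ variable with non-centrality $\lambda$, tail probabilities can be evaluated on the rescaled statistic $T/c$. A minimal sketch, with assumed values of $c$ and $\lambda$ standing in for the paper's explicit formulae:

      # Sketch: correct a p-value when the association statistic follows a
      # scaled non-central chi-square distribution, T ~ c * chi2_df(lambda).
      # c and lam are illustrative; in practice they would come from the
      # paper's formulae for the assumed error and disease models.
      from scipy.stats import chi2, ncx2

      t_obs = 12.5   # observed test statistic
      df = 1         # degrees of freedom of the association test
      c = 1.15       # assumed scaling factor induced by genotyping errors
      lam = 0.8      # assumed non-centrality parameter

      p_naive = chi2.sf(t_obs, df)               # nominal p-value, ignores errors
      p_corrected = ncx2.sf(t_obs / c, df, lam)  # P(c * chi2_df(lam) > t_obs)
      print(p_naive, p_corrected)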

  15. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    estimator of the IRR. The difference between the OR and the IRR is reflected in the p-value of the null hypothesis of no exposure effect. For multiple testing scenarios, e.g. in a GWAS, these differences in estimators imply a change in comparison between the null hypotheses for different sampling schemes of controls... method provides a simple goodness-of-fit test by comparing the observed SFS with the expected SFS under a given model of population size changes. By the use of Monte Carlo estimation the expected time between coalescent events can be estimated and the expected SFS can thereby be evaluated. Using... the classical chi-square statistics we are able to infer single-parameter models. Multiple-parameter models, e.g. multiple epochs, are harder to identify. By introducing the inference of population size back in time as an inverse problem, the second procedure applies the theory of smoothing splines to infer...
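
    The goodness-of-fit idea above (comparing the observed site frequency spectrum with the SFS expected under a demographic model) can be illustrated with the classical chi-square statistic. The counts below are invented, and the expected SFS uses the standard constant-size neutral shape (proportional to 1/i) rather than the thesis's Monte Carlo estimates:

      # Sketch: chi-square goodness-of-fit test of an observed site
      # frequency spectrum (SFS) against the SFS expected under a
      # demographic model. Counts are illustrative; the expectation is the
      # constant-size neutral SFS, not a Monte Carlo coalescent estimate.
      import numpy as np
      from scipy.stats import chisquare

      observed = np.array([312, 148, 95, 71, 60, 48, 43, 39, 36])  # sites with derived-allele count i
      i = np.arange(1, len(observed) + 1)

      weights = 1.0 / i
      expected = weights / weights.sum() * observed.sum()  # neutral 1/i shape

      stat, p = chisquare(observed, f_exp=expected)  # df = classes - 1
      print(stat, p)  # a small p suggests the demographic model fits poorly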

  16. Designing image segmentation studies: Statistical power, sample size and reference standard quality.

    Science.gov (United States)

    Gibson, Eli; Hu, Yipeng; Huisman, Henkjan J; Barratt, Dean C

    2017-12-01

    Segmentation algorithms are typically evaluated by comparison to an accepted reference standard. The cost of generating accurate reference standards for medical image segmentation can be substantial. Since the study cost and the likelihood of detecting a clinically meaningful difference in accuracy both depend on the size and on the quality of the study reference standard, balancing these trade-offs supports the efficient use of research resources. In this work, we derive a statistical power calculation that enables researchers to estimate the appropriate sample size to detect clinically meaningful differences in segmentation accuracy (i.e. the proportion of voxels matching the reference standard) between two algorithms. Furthermore, we derive a formula to relate reference standard errors to their effect on the sample sizes of studies using lower-quality (but potentially more affordable and practically available) reference standards. The accuracy of the derived sample size formula was estimated through Monte Carlo simulation, demonstrating, with 95% confidence, a predicted statistical power within 4% of simulated values across a range of model parameters. This corresponds to sample size errors of less than 4 subjects and errors in the detectable accuracy difference of less than 0.6%. The applicability of the formula to real-world data was assessed using bootstrap resampling simulations for pairs of algorithms from the PROMISE12 prostate MR segmentation challenge data set. The model predicted the simulated power for the majority of algorithm pairs within 4% for simulated experiments using a high-quality reference standard and within 6% for simulated experiments using a low-quality reference standard. A case study, also based on the PROMISE12 data, illustrates using the formulae to evaluate whether to use a lower-quality reference standard in a prostate segmentation study. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
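
    The kind of calculation derived in this paper can be sketched, in much simplified form, with a normal-approximation sample-size formula for paired per-subject accuracy differences. This is a generic stand-in, not the paper's derived formula, and every number below is an assumption:

      # Sketch: subjects needed to detect a mean segmentation-accuracy
      # difference between two algorithms, via a normal-approximation power
      # calculation on paired per-subject differences. Numbers illustrative.
      from scipy.stats import norm

      delta = 0.02   # clinically meaningful accuracy difference (2% of voxels)
      sigma = 0.05   # assumed SD of per-subject accuracy differences
      alpha = 0.05   # two-sided significance level
      power = 0.80   # target statistical power

      z_a = norm.ppf(1 - alpha / 2)
      z_b = norm.ppf(power)
      n = ((z_a + z_b) * sigma / delta) ** 2
      print(round(n))  # required sample size for the paired comparison

    Intuitively, a lower-quality reference standard adds noise to the measured accuracies, inflating the variance term and hence the required sample size; that inflation is the trade-off the paper's second formula quantifies.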

  17. Whither Instructional Design and Teacher Training? The Need for Experimental Research

    Science.gov (United States)

    Gropper, George L.

    2015-01-01

    This article takes a contrarian position: an "instructional design" or "teacher training" model, because of the sheer number of its interconnected parameters, is too complex to assess or to compare with other models. Models may not be the way to go just yet. This article recommends instead prior experimental research on limited…

  18. The Impact of the Hawthorne Effect in Experimental Designs in Educational Research. Final Report.

    Science.gov (United States)

    Cook, Desmond L.

    Project objectives included (1) establishing a body of knowledge concerning the role of the Hawthorne effect in experimental designs in educational research, (2) assessing the influence of the Hawthorne effect on educational experiments conducted under varying conditions of control, (3) identifying the major components comprising the effect, and…

  19. Sensitivity-based approach to optimal experimental design in a receptor trafficking and down regulation model

    Science.gov (United States)

    Casey, Fergal; Waterfall, Joshua; Gutenkunst, Ryan; Brown, Kevin; Myers, Christopher; Sethna, James

    2006-03-01

    We apply the ideas of optimal experimental design to systems biology models: minimizing a design criterion based on the average variance of predictions, we suggest new experiments that need to be performed to optimally test a given biological hypothesis. The estimated variance in predictions is derived from the sensitivities of protein and chemical species in the model to changes in reaction rates. The sensitivities also allow us to determine which interactions in the biological network dominate the system behavior. To test the design principles, we have developed a differential equation model incorporating the processes of endocytosis, recycling and degradation of activated epidermal growth factor (EGF) receptor in a mammalian cell line. Recent experimental work has discovered mutant proteins that cause receptor accumulation and a prolonged growth signal. Our model is optimized to fit this mutant experimental data and wild-type data for a variety of experimental conditions. Of biological interest is the effect on surface and internalized receptor levels after the overexpression or inactivation of regulator proteins in the network: the optimal design method allows us to fine-tune the conditions to best predict the behavior of these unknown components of the system.
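
    The core computation in this design approach (propagating parameter uncertainty through sensitivities to obtain prediction variances) can be sketched as follows. The matrices are random placeholders standing in for the sensitivities of an ODE model such as the receptor-trafficking model, and the covariance approximation assumes least-squares fitting with a known measurement variance:

      # Sketch: average prediction variance as an optimal-design criterion,
      # computed from sensitivity matrices. Random matrices stand in for
      # the sensitivities of a fitted systems-biology ODE model.
      import numpy as np

      rng = np.random.default_rng(0)
      n_data, n_params, n_pred = 40, 6, 10

      J = rng.normal(size=(n_data, n_params))  # d(residuals)/d(parameters)
      S = rng.normal(size=(n_pred, n_params))  # d(predictions)/d(parameters)
      sigma2 = 0.1                             # assumed measurement variance

      # Gauss-Newton approximation to the parameter covariance.
      C = sigma2 * np.linalg.inv(J.T @ J)

      # Variance of each prediction, diag(S C S^T), and its average.
      pred_var = np.einsum('ip,pq,iq->i', S, C, S)
      print(pred_var.mean())  # candidate experiments minimising this are preferred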

  20. Bias Corrections for Standardized Effect Size Estimates Used with Single-Subject Experimental Designs

    Science.gov (United States)

    Ugille, Maaike; Moeyaert, Mariola; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim

    2014-01-01

    A multilevel meta-analysis can combine the results of several single-subject experimental design studies. However, the estimated effects are biased if the effect sizes are standardized and the number of measurement occasions is small. In this study, the authors investigated 4 approaches to correct for this bias. First, the standardized effect…
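
    For context, the classical small-sample correction for a standardized effect size is Hedges' correction factor, which shrinks the estimate when few measurement occasions are available. The sketch below applies that correction to invented data; it is not necessarily one of the four approaches investigated in this study:

      # Sketch: Hedges' small-sample bias correction of a standardized mean
      # difference. Data are illustrative; the correction factor multiplies
      # Cohen's d, which is biased upward in small samples.
      import numpy as np

      baseline = np.array([3.0, 2.5, 3.2, 2.8, 3.1])   # few occasions
      treatment = np.array([4.1, 4.4, 3.9, 4.6, 4.2])

      n1, n2 = len(baseline), len(treatment)
      sp = np.sqrt(((n1 - 1) * baseline.var(ddof=1) +
                    (n2 - 1) * treatment.var(ddof=1)) / (n1 + n2 - 2))
      d = (treatment.mean() - baseline.mean()) / sp    # Cohen's d

      df = n1 + n2 - 2
      g = d * (1 - 3 / (4 * df - 1))                   # Hedges' g
      print(d, g)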