WorldWideScience

Sample records for significantly improved statistics

  1. Statistically significant relational data mining

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second are statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  2. Statistical Significance for Hierarchical Clustering

    Science.gov (United States)

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990

  3. Statistical significance versus clinical relevance.

    Science.gov (United States)

    van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G

    2017-04-01

    In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
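
    As a worked illustration of the distinction drawn above, the following sketch computes a P-value and a 95% CI for a two-group comparison and states the correct reading of each. All numbers are hypothetical and numpy/scipy are assumed available; this is an illustrative sketch, not part of the cited paper.

      # Hypothetical two-group comparison: p-value vs. confidence interval.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      treatment = rng.normal(5.2, 2.0, 40)   # assumed true effect of 0.7 units
      control = rng.normal(4.5, 2.0, 40)

      t, p = stats.ttest_ind(treatment, control)
      diff = treatment.mean() - control.mean()
      se = np.sqrt(treatment.var(ddof=1) / 40 + control.var(ddof=1) / 40)
      ci = (diff - 1.96 * se, diff + 1.96 * se)

      # Correct reading: if the true difference were zero, a result this
      # extreme or more would occur about 100*p percent of the time; the CI
      # conveys the plausible magnitude of the effect, which p alone does not.
      print(f"p = {p:.3f}, difference = {diff:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")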

  4. Statistical significance of cis-regulatory modules

    Directory of Open Access Journals (Sweden)

    Smith Andrew D

    2007-01-01

    Background: It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results: We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion: The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM) and MODSTORM software.

  5. The thresholds for statistical and clinical significance

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore … of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance …

  6. The insignificance of statistical significance testing

    Science.gov (United States)

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.

  7. Swiss solar power statistics 2007 - Significant expansion

    International Nuclear Information System (INIS)

    Hostettler, T.

    2008-01-01

    This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted, as are the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed and an overview for the period 1989 to 2007 is presented, which includes figures on the number of photovoltaic plants in service and installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak power (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland as far as solar power is concerned is briefly discussed, as are international developments.

  8. Significant Statistics: Viewed with a Contextual Lens

    Science.gov (United States)

    Tait-McCutcheon, Sandi

    2010-01-01

    This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…

  9. Statistical methods for quality improvement

    National Research Council Canada - National Science Library

    Ryan, Thomas P

    2011-01-01

    ...."-TechnometricsThis new edition continues to provide the most current, proven statistical methods for quality control and quality improvementThe use of quantitative methods offers numerous benefits...

  10. Increasing the statistical significance of entanglement detection in experiments.

    Science.gov (United States)

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  11. Increasing the statistical significance of entanglement detection in experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)

    2010-07-01

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.

  12. Testing the Difference of Correlated Agreement Coefficients for Statistical Significance

    Science.gov (United States)

    Gwet, Kilem L.

    2016-01-01

    This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…

  13. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
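
    A minimal sketch of the permutation approach the authors start from (their conditional procedure additionally conditions on the spread of the observed histogram, which is not reproduced here). The shared random factor inducing correlation among features, and all sizes, are hypothetical.

      # Schematic permutation test for the largest of many correlated test
      # statistics; correlation is induced by a shared latent factor.
      import numpy as np

      rng = np.random.default_rng(1)
      n, m = 50, 200                      # 50 subjects, 200 correlated features
      labels = np.array([0] * 25 + [1] * 25)
      shared = rng.normal(size=(n, 1))    # shared factor induces correlation
      data = 0.7 * shared + rng.normal(size=(n, m))

      def max_abs_t(x, y):
          # two-sample t statistic per feature; return the largest magnitude
          t = (x.mean(0) - y.mean(0)) / np.sqrt(x.var(0, ddof=1) / len(x)
                                                + y.var(0, ddof=1) / len(y))
          return np.abs(t).max()

      observed = max_abs_t(data[labels == 0], data[labels == 1])
      null = []
      for _ in range(1000):               # permute sample units to simulate the null
          perm = rng.permutation(labels)
          null.append(max_abs_t(data[perm == 0], data[perm == 1]))
      p_global = np.mean(np.array(null) >= observed)
      print(f"global p-value for the largest statistic: {p_global:.3f}")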

  14. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    This article raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice … We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators …

  15. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Background: Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results: Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquirement. Conclusion: The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
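
    A toy version of the randomly-selected-segment idea, assuming a simulated chromosome and a hypothetical GC-rich candidate window; the real method tests richer composition features than GC content and adds a refinement phase.

      # Toy Monte-Carlo test in the spirit of Design-Island: compare the GC
      # content of a candidate window against windows drawn at random
      # positions of the same chromosome (all sequence data simulated).
      import random

      random.seed(2)
      genome = "".join(random.choice("ACGT") for _ in range(100_000))
      window = "GC" * 500                  # hypothetical GC-rich candidate, length 1000

      def gc(seq):
          return (seq.count("G") + seq.count("C")) / len(seq)

      L = len(window)
      null = []
      for _ in range(2000):                # random segments give the null distribution
          i = random.randrange(len(genome) - L)
          null.append(gc(genome[i:i + L]))

      obs = gc(window)
      p = (1 + sum(g >= obs for g in null)) / (1 + len(null))  # add-one Monte-Carlo p
      print(f"observed GC = {obs:.2f}, Monte-Carlo p = {p:.4f}")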

  16. Your Chi-Square Test Is Statistically Significant: Now What?

    Science.gov (United States)

    Sharpe, Donald

    2015-01-01

    Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…
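
    The first of the four approaches, calculating residuals, can be sketched on a hypothetical 2x3 table; adjusted standardized residuals with |value| > 1.96 flag the cells that drive the omnibus result.

      # Follow up a significant chi-square test with adjusted standardized
      # residuals (table values are hypothetical).
      import numpy as np
      from scipy.stats import chi2_contingency

      table = np.array([[30, 10, 20],
                        [15, 25, 20]])
      chi2, p, dof, expected = chi2_contingency(table)

      # Haberman adjusted residuals: (observed - expected) scaled by the
      # cell's estimated standard deviation under independence.
      n = table.sum()
      row = table.sum(1, keepdims=True) / n
      col = table.sum(0, keepdims=True) / n
      resid = (table - expected) / np.sqrt(expected * (1 - row) * (1 - col))
      print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
      print(np.round(resid, 2))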

  17. Statistical Significance and Effect Size: Two Sides of a Coin.

    Science.gov (United States)

    Fan, Xitao

    This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…

  18. Reporting effect sizes as a supplement to statistical significance ...

    African Journals Online (AJOL)

    The purpose of the article is to review the statistical significance reporting practices in reading instruction studies and to provide guidelines for when to calculate and report effect sizes in educational research. A review of six readily accessible (online) and accredited journals publishing research on reading instruction ...

  19. Test for the statistical significance of differences between ROC curves

    International Nuclear Information System (INIS)

    Metz, C.E.; Kronman, H.B.

    1979-01-01

    A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. Performance of the statistic for data arising from a limited number of experimental trials was evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions.
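
    The statistic can be sketched directly from the description above; the binormal parameter estimates (intercept a and slope b on normal-deviate axes) and their covariance matrices below are hypothetical stand-ins for maximum likelihood output.

      # Two-degree-of-freedom chi-square for comparing two fitted binormal
      # ROC curves from independent data sets (hypothetical estimates).
      import numpy as np
      from scipy.stats import chi2

      a1, b1 = 1.50, 0.90        # curve 1 parameters on normal-deviate axes
      a2, b2 = 1.10, 0.85        # curve 2 parameters
      cov1 = np.array([[0.020, 0.004], [0.004, 0.010]])
      cov2 = np.array([[0.025, 0.005], [0.005, 0.012]])

      d = np.array([a1 - a2, b1 - b2])
      S = cov1 + cov2            # independent curves: covariance matrices add
      x2 = d @ np.linalg.solve(S, d)   # quadratic form d' S^-1 d
      p = chi2.sf(x2, df=2)
      print(f"chi-square = {x2:.2f}, p = {p:.3f}")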

  20. Improved model for statistical alignment

    Energy Technology Data Exchange (ETDEWEB)

    Miklos, I.; Toroczkai, Z. (Zoltan)

    2001-01-01

    The statistical approach to molecular sequence evolution involves the stochastic modeling of the substitution, insertion and deletion processes. Substitution has been modeled in a reliable way for more than three decades by using finite Markov-processes. Insertion and deletion, however, seem to be more difficult to model, and the recent approaches cannot acceptably deal with multiple insertions and deletions. A new method based on a generating function approach is introduced to describe the multiple insertion process. The presented algorithm computes the approximate joint probability of two sequences in O(l^3) running time, where l is the geometric mean of the sequence lengths.

  1. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.

  2. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper. PMID:25878958

  3. Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies

    International Nuclear Information System (INIS)

    Weber, K.H.

    1993-01-01

    In stochastic damages, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality) are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are interesting: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant, and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests and tests which are not bound to the supposition of normal distribution are of particular interest, such as: - χ²-independence test (test in contingency tables); - Fisher-Yates-test; - trend test according to Cochran; - rank correlation test given by Spearman. These tests are explained in terms of selected epidemiologic data, e.g. of leukaemia clusters, of the cancer mortality of the Japanese A-bomb survivors especially in the low dose range as well as on the sample of the cancer mortality in the high background area in Yangjiang (China).
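
    Most of the named distribution-free tests are available directly in scipy; a brief sketch on hypothetical epidemiologic data follows (the trend test according to Cochran is omitted, since scipy offers no direct equivalent).

      # Chi-square independence, Fisher(-Yates) exact, and Spearman rank
      # correlation tests on hypothetical tables.
      import numpy as np
      from scipy import stats

      exposed = np.array([12, 388])        # cases, non-cases in exposed group
      unexposed = np.array([5, 395])
      table = np.vstack([exposed, unexposed])

      chi2, p_chi2, _, _ = stats.chi2_contingency(table)   # chi-square test
      odds, p_fisher = stats.fisher_exact(table)           # Fisher-Yates test

      dose = [0, 1, 2, 3]                                  # dose groups
      rate = [0.010, 0.012, 0.015, 0.021]                  # observed frequencies
      rho, p_spearman = stats.spearmanr(dose, rate)        # Spearman rank correlation

      print(f"chi2 p = {p_chi2:.3f}, Fisher p = {p_fisher:.3f}, "
            f"Spearman rho = {rho:.2f} (p = {p_spearman:.3f})")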

  4. Systematic reviews of anesthesiologic interventions reported as statistically significant

    DEFF Research Database (Denmark)

    Imberger, Georgina; Gluud, Christian; Boylan, John

    2015-01-01

    … statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may … RESULTS: From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number …

  5. Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.

    Science.gov (United States)

    Kramer, Karen L; Veile, Amanda; Otárola-Castillo, Erik

    2016-01-01

    Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.

  6. Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.

    Directory of Open Access Journals (Sweden)

    Karen L Kramer

    Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.

  7. A tutorial on hunting statistical significance by chasing N

    Directory of Open Access Journals (Sweden)

    Denes Szucs

    2016-09-01

    There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases Type I error, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition to avoid, detect and criticise some typical problems, here I systematically illustrate the large impact of some easy-to-implement, and so perhaps frequent, data dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post-hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies, the second approach 'hacks' the number of variables in the analysis. I demonstrate the high number of false positive findings generated by these techniques with data from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented data dredging steps can easily lead to having 20-50% or more false positives.
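
    The first technique, repeatedly checking for significance while adding participants, is easy to simulate. In this sketch (hypothetical design: a peek after every five added participants per group, up to ten peeks) the Type I error rate inflates well beyond the nominal 5%.

      # Optional stopping under a true null: repeated testing while adding
      # participants inflates the false positive rate.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      false_pos = 0
      n_sims = 2000
      for _ in range(n_sims):
          a = list(rng.normal(size=10))    # both groups drawn from the same
          b = list(rng.normal(size=10))    # distribution: the null is true
          significant = False
          for _ in range(10):              # peek after every 5 added per group
              if stats.ttest_ind(a, b).pvalue < 0.05:
                  significant = True
                  break
              a.extend(rng.normal(size=5))
              b.extend(rng.normal(size=5))
          false_pos += significant
      print(f"false positive rate with optional stopping: {false_pos / n_sims:.1%}")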

  8. Conducting tests for statistically significant differences using forest inventory data

    Science.gov (United States)

    James A. Westfall; Scott A. Pugh; John W. Coulston

    2013-01-01

    Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...

  9. Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks

    Science.gov (United States)

    2016-04-26

    Systems, Statistics & Management Science, University of Alabama, USA. 1 DISTRIBUTION A: Distribution approved for public release. Contents 1 Summary 5...13 5 Application to Real Networks 18 5.1 2012 FBS Football Schedule Network... football schedule network. . . . . . . . . . . . . . . . . . . . . . 21 14 Stem plot of degree-ordered vertices versus the degree for college football

  10. Improving Fuel Statistics for Danish Aviation

    DEFF Research Database (Denmark)

    Winther, M.

    This report contains fuel use figures for Danish civil aviation broken down into domestic and international numbers from 1985 to 2000, using a refined fuel split procedure and official fuel sale totals. The results from two different models are used. The NERI (National Environmental Research Institute) model estimates the fuel use per flight for all flights leaving Danish airports in 1998, while the annual Danish CORINAIR inventories are based on improved LTO/aircraft type statistics. A time series of fuel use from 1985 to 2000 is also shown for flights between Denmark and Greenland/the Faroe Islands, obtained with the NERI model. In addition a complete overview of the aviation fuel use from the two latter areas is given, based on fuel sale information from Statistics Greenland and Statistics Faroe Islands, and fuel use data from airline companies. The fuel use figures are presented on a level …

  11. Static quarks with improved statistical precision

    International Nuclear Information System (INIS)

    Della Morte, M.; Duerr, S.; Molke, H.; Heitger, J.

    2003-09-01

    We present a numerical study for different discretisations of the static action, concerning cut-off effects and the growth of statistical errors with Euclidean time. An error reduction by an order of magnitude can be obtained with respect to the Eichten-Hill action, for time separations up to 2 fm, keeping discretization errors small. The best actions lead to a big improvement on the precision of the quark mass M_b and F_{B_s} in the static approximation. (orig.)

  12. After statistics reform: Should we still teach significance testing?

    NARCIS (Netherlands)

    A. Hak (Tony)

    2014-01-01

    In the longer term null hypothesis significance testing (NHST) will disappear because p-values are not informative and not replicable. Should we continue to teach in the future the procedures of then abolished routines (i.e., NHST)? Three arguments are discussed for not teaching NHST in …

  13. Carotid endarterectomy significantly improves postoperative laryngeal sensitivity.

    Science.gov (United States)

    Hammer, Georg Philipp; Tomazic, Peter Valentin; Vasicek, Sarah; Graupp, Matthias; Gugatschka, Markus; Baumann, Anneliese; Konstantiniuk, Peter; Koter, Stephan Herwig

    2016-11-01

    … sensory threshold on the operated-on side (6.08 ± 2.02 mm Hg) decreased significantly at the 6-week follow-up, even in relation to the preoperative measure (P = .022). With the exception of one patient with permanent unilateral vocal fold immobility, no signs of nerve injury were detected. In accordance with previous reports, injuries to the recurrent laryngeal nerve during CEA seem to be rare. In most patients, postoperative symptoms (globus, dysphagia, dysphonia) and signs fade within a few weeks without any specific therapeutic intervention. This study shows an improved long-term postoperative superior laryngeal nerve function with regard to laryngopharyngeal sensitivity. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  15. Distinguishing between statistical significance and practical/clinical meaningfulness using statistical inference.

    Science.gov (United States)

    Wilkinson, Michael

    2014-03-01

    Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
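
    The magnitude-based approach lends itself to a short sketch: under a normal approximation, and with a hypothetical estimate, standard error, and smallest practically important effect, one can compute the chance that the true effect is beneficial or harmful. This is a simplified illustration of the idea, not the full published procedure.

      # Magnitude-based inference sketch: chance that the true effect exceeds
      # a smallest practically important value (all inputs hypothetical).
      from math import erf, sqrt

      def norm_cdf(x):
          # standard normal CDF via the error function
          return 0.5 * (1 + erf(x / sqrt(2)))

      estimate, se = 0.9, 0.5    # observed effect and its standard error
      threshold = 0.2            # smallest practically important effect
      p_beneficial = 1 - norm_cdf((threshold - estimate) / se)
      p_harmful = norm_cdf((-threshold - estimate) / se)
      print(f"P(beneficial) = {p_beneficial:.2f}, P(harmful) = {p_harmful:.2f}")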

  16. Confidence intervals permit, but don't guarantee, better inference than statistical significance testing

    Directory of Open Access Journals (Sweden)

    Melissa Coulson

    2010-07-01

    A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs but, for full benefit, such highly desirable statistical reform requires also that researchers interpret CIs without recourse to NHST.

  17. Intensive inpatient treatment for bulimia nervosa: Statistical and clinical significance of symptom changes.

    Science.gov (United States)

    Diedrich, Alice; Schlegl, Sandra; Greetfeld, Martin; Fumi, Markus; Voderholzer, Ulrich

    2018-03-01

    This study examines the statistical and clinical significance of symptom changes during an intensive inpatient treatment program with a strong psychotherapeutic focus for individuals with severe bulimia nervosa. 295 consecutively admitted bulimic patients were administered the Structured Interview for Anorexic and Bulimic Syndromes-Self-Rating (SIAB-S), the Eating Disorder Inventory-2 (EDI-2), the Brief Symptom Inventory (BSI), and the Beck Depression Inventory-II (BDI-II) at treatment intake and discharge. Results indicated statistically significant symptom reductions with large effect sizes regarding severity of binge eating and compensatory behavior (SIAB-S), overall eating disorder symptom severity (EDI-2), overall psychopathology (BSI), and depressive symptom severity (BDI-II) even when controlling for antidepressant medication. The majority of patients showed either reliable (EDI-2: 33.7%, BSI: 34.8%, BDI-II: 18.1%) or even clinically significant symptom changes (EDI-2: 43.2%, BSI: 33.9%, BDI-II: 56.9%). Patients with clinically significant improvement were less distressed at intake and less likely to suffer from a comorbid borderline personality disorder when compared with those who did not improve to a clinically significant extent. Findings indicate that intensive psychotherapeutic inpatient treatment may be effective in about 75% of severely affected bulimic patients. For the remaining non-responding patients, inpatient treatment might be improved through an even stronger focus on the reduction of comorbid borderline personality traits.

  18. Improving Statistical Literacy in Schools in Australia

    OpenAIRE

    Trewin, Dennis

    2005-01-01

    We live in the information age. Statistical thinking is a life skill that all Australian children should have. The Statistical Society of Australia (SSAI) and the Australian Bureau of Statistics (ABS) have been working on a strategy to ensure Australian school children acquire a sufficient understanding and appreciation of how data can be acquired and used so they can make informed judgements in their daily lives, as children and then as adults. There is another motive for our work …

  19. Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.

    Science.gov (United States)

    Breunig, Nancy A.

    Despite the increasing criticism of statistical significance testing by researchers, particularly in the publication of the 1994 American Psychological Association's style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…

  20. Testing statistical significance scores of sequence comparison methods with structure similarity

    Directory of Open Access Journals (Sweden)

    Leunissen Jack AM

    2006-10-01

    Background: In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results: All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion: The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
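
    The Monte-Carlo Z-score idea in miniature: score a real alignment, then score the same query against shuffled versions of the subject. The identity score below is a placeholder standing in for a real Smith-Waterman implementation, and the sequences are hypothetical.

      # Monte-Carlo Z-score for a pairwise comparison (stdlib only).
      import random

      random.seed(4)

      def score(a, b):
          # placeholder similarity score (identity count); a real application
          # would call a Smith-Waterman implementation here
          return sum(x == y for x, y in zip(a, b))

      query = "MKTAYIAKQRQISFVKSHFSRQ"
      subject = "MKTAYIAKQRQISFVKSHFSRN"

      s_obs = score(query, subject)
      shuffled_scores = []
      for _ in range(500):                 # null scores from shuffled subjects
          sh = list(subject)
          random.shuffle(sh)
          shuffled_scores.append(score(query, "".join(sh)))

      mean = sum(shuffled_scores) / len(shuffled_scores)
      var = sum((s - mean) ** 2 for s in shuffled_scores) / (len(shuffled_scores) - 1)
      z = (s_obs - mean) / var ** 0.5
      print(f"observed = {s_obs}, shuffle mean = {mean:.1f}, Z = {z:.1f}")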

  1. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  2. Fundamentals of modern statistical methods substantially improving power and accuracy

    CERN Document Server

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques - even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included …

  3. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    … improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
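
    An a priori power analysis of the kind recommended above can be sketched with statsmodels; all inputs are hypothetical.

      # Sample size per group for a two-sample t-test, solved from the
      # desired power, alpha, and expected effect size.
      from statsmodels.stats.power import TTestIndPower

      n_per_group = TTestIndPower().solve_power(
          effect_size=0.5,   # expected Cohen's d
          alpha=0.05,        # Type I error rate
          power=0.80,        # 1 - Type II error rate
      )
      print(f"required sample size per group: {n_per_group:.0f}")  # about 64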

  4. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
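
    A minimal "what if" sketch in Python rather than Excel or R: holding a hypothetical effect size fixed and varying the sample size shows how significance status flips on n alone.

      # Same effect size, different n: the p-value crosses 0.05 purely as a
      # function of sample size (two-sample t-test, equal group sizes).
      from scipy import stats

      d = 0.30                      # fixed hypothetical effect size (Cohen's d)
      for n in (20, 50, 100, 200):  # per-group sample size
          t = d * (n / 2) ** 0.5    # t statistic implied by d and n
          p = 2 * stats.t.sf(t, df=2 * n - 2)
          print(f"n = {n:3d} per group: t = {t:.2f}, p = {p:.3f}")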

  5. Self-assessed performance improves statistical fusion of image labels

    International Nuclear Information System (INIS)

    Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.

    2014-01-01

    . Statistical fusion resulted in statistically indistinguishable performance from self-assessed weighted voting. The authors developed a new theoretical basis for using self-assessed performance in the framework of statistical fusion and demonstrated that the combined sources of information (both statistical assessment and self-assessment) yielded statistically significant improvement over the methods considered separately. Conclusions: The authors present the first systematic characterization of self-assessed performance in manual labeling. The authors demonstrate that self-assessment and statistical fusion yield similar, but complementary, benefits for label fusion. Finally, the authors present a new theoretical basis for combining self-assessments with statistical label fusion

  6. Self-assessed performance improves statistical fusion of image labels

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2014-03-15

    . Statistical fusion resulted in statistically indistinguishable performance from self-assessed weighted voting. The authors developed a new theoretical basis for using self-assessed performance in the framework of statistical fusion and demonstrated that the combined sources of information (both statistical assessment and self-assessment) yielded statistically significant improvement over the methods considered separately. Conclusions: The authors present the first systematic characterization of self-assessed performance in manual labeling. The authors demonstrate that self-assessment and statistical fusion yield similar, but complementary, benefits for label fusion. Finally, the authors present a new theoretical basis for combining self-assessments with statistical label fusion.

  7. THE FLUORBOARD A STATISTICALLY BASED DASHBOARD METHOD FOR IMPROVING SAFETY

    International Nuclear Information System (INIS)

    PREVETTE, S.S.

    2005-01-01

    The FluorBoard is a statistically based dashboard method for improving safety. Fluor Hanford has achieved significant safety improvements--including more than an 80% reduction in OSHA cases per 200,000 hours--during its work at the US Department of Energy's Hanford Site in Washington state. The massive project on the former nuclear materials production site is considered one of the largest environmental cleanup projects in the world. Fluor Hanford's safety improvements were achieved by a committed partnering of workers, managers, and statistical methodology. Safety achievements at the site have been due to a systematic approach to safety. This includes excellent cooperation between the field workers, the safety professionals, and management through OSHA Voluntary Protection Program principles. Fluor corporate values are centered around safety, and safety excellence is important for every manager in every project. In addition, Fluor Hanford has utilized a rigorous approach to using its safety statistics, based upon Dr. Shewhart's control charts, and Dr. Deming's management and quality methods.
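
    In the spirit of the approach described, a Shewhart u-chart for OSHA-recordable cases per 200,000 hours can be sketched as follows; all monthly figures are hypothetical, not Fluor Hanford data.

      # Shewhart u-chart: monthly case rates with 3-sigma control limits.
      import numpy as np

      cases = np.array([9, 7, 8, 11, 6, 7, 5, 8, 4, 5, 3, 4])  # cases per month
      hours = np.full(12, 400_000.0)                           # hours worked per month
      units = hours / 200_000.0                                # exposure units

      u = cases / units                  # rate per 200,000 hours
      u_bar = cases.sum() / units.sum()  # center line
      ucl = u_bar + 3 * np.sqrt(u_bar / units)
      lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / units), 0)

      for month, (rate, hi, lo) in enumerate(zip(u, ucl, lcl), 1):
          flag = " <-- signal" if (rate > hi or rate < lo) else ""
          print(f"month {month:2d}: rate {rate:.1f} (limits {lo:.1f}-{hi:.1f}){flag}")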

  8. Statistical reviewers improve reporting in biomedical articles: a randomized trial.

    Directory of Open Access Journals (Sweden)

    Erik Cobo

    2007-03-01

    Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of either adding a statistical peer reviewer or suggesting the use of checklists such as CONSORT or STARD to clinical reviewers, or both. Interventions were defined as (1) the addition of a statistical reviewer to the clinical peer review process, and (2) suggesting reporting guidelines to reviewers, with "no statistical expert" and "no checklist" as controls. The two interventions were crossed in a 2x2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide an 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and final post peer review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review, and 129 were randomized. Of those, 14 that were lost to follow-up showed no differences in initial quality to the followed-up papers. Hence, 115 were included in the main analysis, with 16 rejected for publication after peer review. 21 (18.3%) of the 115 included papers were interventions, 46 (40.0%) were longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6-24.4). The effect of suggesting a guideline to the …

  9. Improving cerebellar segmentation with statistical fusion

    Science.gov (United States)

    Plassard, Andrew J.; Yang, Zhen; Prince, Jerry L.; Claassen, Daniel O.; Landman, Bennett A.

    2016-03-01

    The cerebellum is a somatotopically organized central component of the central nervous system well known to be involved with motor coordination and increasingly recognized roles in cognition and planning. Recent work in multiatlas labeling has created methods that offer the potential for fully automated 3-D parcellation of the cerebellar lobules and vermis (which are organizationally equivalent to cortical gray matter areas). This work explores the trade-offs of using different statistical fusion techniques and post hoc optimizations in two datasets with distinct imaging protocols. We offer a novel fusion technique by extending the ideas of the Selective and Iterative Method for Performance Level Estimation (SIMPLE) to a patch-based performance model. We demonstrate the effectiveness of our algorithm, Non-Local SIMPLE, for segmentation of a mixed population of healthy subjects and patients with severe cerebellar anatomy. Under the first imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard segmentation techniques. In the second imaging protocol, we show that Non-Local SIMPLE outperforms previous gold standard techniques but is outperformed by a non-locally weighted vote with the deeper population of atlases available. This work advances the state of the art in open source cerebellar segmentation algorithms and offers the opportunity for routinely including cerebellar segmentation in magnetic resonance imaging studies that acquire whole brain T1-weighted volumes with approximately 1 mm isotropic resolution.

  10. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    Science.gov (United States)

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…

  11. Measuring individual significant change on the Beck Depression Inventory-II through IRT-based statistics.

    NARCIS (Netherlands)

    Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.

    2013-01-01

    Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual

  12. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
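    A minimal sketch of the core idea -- a distance statistic plus resampling -- is easy to prototype. The code below is not the paper's implementation: it uses a generic pooled two-sample bootstrap on synthetic data (the paper resamples individual cloud-object histograms within size categories), with the Euclidean distance as the test statistic; numpy is assumed to be available.

```python
import numpy as np

def euclidean_distance(h1, h2):
    """Euclidean distance between two (normalized) histograms."""
    return np.sqrt(np.sum((h1 - h2) ** 2))

def bootstrap_histogram_test(samples_a, samples_b, bins, n_boot=2000, seed=0):
    """Pooled two-sample bootstrap: is the distance between the two summary
    histograms larger than expected if both samples came from one population?"""
    rng = np.random.default_rng(seed)
    hist = lambda x: np.histogram(x, bins=bins, density=True)[0]
    d_obs = euclidean_distance(hist(samples_a), hist(samples_b))

    pooled = np.concatenate([samples_a, samples_b])
    n_a = len(samples_a)
    hits = 0
    for _ in range(n_boot):
        resampled = rng.choice(pooled, size=len(pooled), replace=True)
        d = euclidean_distance(hist(resampled[:n_a]), hist(resampled[n_a:]))
        hits += d >= d_obs
    # add-one correction keeps the estimated p-value away from exactly zero
    return d_obs, (hits + 1) / (n_boot + 1)

# Synthetic stand-ins for two cloud-object parameter samples
rng = np.random.default_rng(42)
a = rng.normal(0.0, 1.0, 5000)
b = rng.normal(0.2, 1.0, 4000)
d, p = bootstrap_histogram_test(a, b, bins=np.linspace(-4, 4, 33))
print(f"Euclidean distance = {d:.4f}, bootstrap p-value = {p:.4f}")
```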

  13. Health significance and statistical uncertainty. The value of P-value.

    Science.gov (United States)

    Consonni, Dario; Bertazzi, Pier Alberto

    2017-10-27

    The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P≤0.05" ("statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. Our aim is to show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters: the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistical significance or not). When reporting statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
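    As an illustration of the recommended reporting style, the hedged sketch below computes a risk ratio with a Wald-type 95% confidence interval on the log scale; the cohort counts are invented for the example.

```python
import math

def risk_ratio_ci(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Risk ratio with a Wald-type 95% CI computed on the log scale."""
    rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    se_log_rr = math.sqrt(1 / cases_exp - 1 / n_exp
                          + 1 / cases_unexp - 1 / n_unexp)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Invented cohort: 30/1000 exposed cases vs. 20/1000 unexposed cases
rr, lo, hi = risk_ratio_ci(30, 1000, 20, 1000)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
# Report the whole interval; do not reduce it to "does it cross 1?"
```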

  14. Statistical vs. Economic Significance in Economics and Econometrics: Further comments on McCloskey & Ziliak

    DEFF Research Database (Denmark)

    Engsted, Tom

    I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot ... be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model ...

  15. Statistically significant changes in ground thermal conditions of alpine Austria during the last decade

    Science.gov (United States)

    Kellerer-Pirklbauer, Andreas

    2016-04-01

    Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding of the effects of present climate change on the distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c. 60 ground temperature (surface and near-surface) monitoring sites which are located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and at longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trends during the observation period, and regional patterns of change. Calculations and analyses of several different temperature-related parameters were accomplished. At an annual scale, a region-wide statistically significant warming during the observation period was revealed by e.g. an increase in mean annual temperature values (mean, maximum) or the significant lowering of the surface frost number (F+). At a seasonal scale, in most cases no significant trend of any temperature-related parameter was revealed for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season reveals in general a significant warming, as confirmed by several different temperature-related parameters such as e.g. mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis, August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise from temperature anomalies (e.g. the warm winter 2006/07) or substantial variations in the winter

  16. Codon Deviation Coefficient: A novel measure for estimating codon usage bias and its statistical significance

    KAUST Repository

    Zhang, Zhang

    2012-03-22

    Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. © 2012 Zhang et al; licensee BioMed Central Ltd.
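    The published CDC formula is not reproduced here, but the two ingredients the abstract names -- an expectation built from position-specific nucleotide composition, and a bootstrap significance assessment -- can be sketched as follows. This is a simplified, hypothetical stand-in (a Euclidean deviation instead of the paper's measure), assuming numpy is available.

```python
import numpy as np
from itertools import product

BASES = "ACGT"

def codon_deviation(codons):
    """Euclidean deviation between observed codon frequencies and those
    expected from position-specific nucleotide composition alone
    (a simplified stand-in for the published CDC formula)."""
    n = len(codons)
    pos_freq = [{b: sum(c[i] == b for c in codons) / n for b in BASES}
                for i in range(3)]
    triplets = ["".join(t) for t in product(BASES, repeat=3)]
    expected = np.array([pos_freq[0][c[0]] * pos_freq[1][c[1]] * pos_freq[2][c[2]]
                         for c in triplets])
    observed = np.array([codons.count(c) / n for c in triplets])
    return np.sqrt(((observed - expected) ** 2).sum())

def cub_pvalue(codons, n_boot=500, seed=0):
    """Bootstrap significance: how often do codons drawn from the
    composition-only null deviate at least as much as the observed set?"""
    rng = np.random.default_rng(seed)
    n = len(codons)
    d_obs = codon_deviation(codons)
    pos_p = [[sum(c[i] == b for c in codons) / n for b in BASES]
             for i in range(3)]
    hits = 0
    for _ in range(n_boot):
        cols = [rng.choice(list(BASES), size=n, p=pos_p[i]) for i in range(3)]
        null = ["".join(t) for t in zip(*cols)]
        hits += codon_deviation(null) >= d_obs
    return d_obs, (hits + 1) / (n_boot + 1)

# Toy gene with a pronounced codon preference
codons = ["GCA"] * 50 + ["GCC"] * 30 + ["GAA"] * 20
d, p = cub_pvalue(codons)
print(f"deviation = {d:.3f}, bootstrap p-value = {p:.4f}")
```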

  17. Codon Deviation Coefficient: a novel measure for estimating codon usage bias and its statistical significance

    Directory of Open Access Journals (Sweden)

    Zhang Zhang

    2012-03-01

    Full Text Available Abstract Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions.

  18. Inhaler Reminders Significantly Improve Asthma Patients' Use of Controller Medications

    Science.gov (United States)

    Inhaler reminders significantly improve asthma patients' use of controller medications. Published online: July 22, ... the burden and risk of asthma, but many patients do not use them regularly. This poor adherence ...

  19. Significant improvement in the thermal annealing process of optical resonators

    Science.gov (United States)

    Salzenstein, Patrice; Zarubin, Mikhail

    2017-05-01

    Thermal annealing performed during fabrication improves the surface roughness of optical resonators and reduces stresses at the periphery of their surface, thus allowing higher Q-factors. After a preliminary realization, the design of the oven and the electronic method were significantly improved thanks to nichrome resistance-alloy wires and chopped basalt fibers for thermal insulation during the annealing process. Q-factors can then be improved.

  20. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  1. Improving statistical reasoning theoretical models and practical implications

    CERN Document Server

    Sedlmeier, Peter

    1999-01-01

    This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects that are more stable over time than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.

  2. Statistics Refresher for Molecular Imaging Technologists, Part 2: Accuracy of Interpretation, Significance, and Variance.

    Science.gov (United States)

    Farrell, Mary Beth

    2018-06-01

    This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic, the higher the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around the mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution, taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being
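    For concreteness, the sketch below computes percent agreement and the Cohen κ-statistic for two hypothetical readers; the ratings are invented for the example.

```python
def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two readers' categorical ratings."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # chance agreement from each reader's marginal frequencies
    p_chance = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                   for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

# Two hypothetical readers interpreting 10 scans as positive/negative
r1 = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "pos", "neg", "neg"]
r2 = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos", "pos", "neg"]
agree = sum(a == b for a, b in zip(r1, r2)) / len(r1)
print(f"percent agreement: {agree:.0%}")          # 80%
print(f"Cohen kappa:       {cohen_kappa(r1, r2):.2f}")  # 0.60, chance-corrected
```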

  3. Statistical significance of trends in monthly heavy precipitation over the US

    KAUST Repository

    Mahajan, Salil

    2011-05-11

    Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
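    A minimal sketch of the Monte Carlo idea (not the paper's code, which also includes a parametric variant and observational data): Kendall's τ of the series against time is the trend statistic, and the null distribution is built by randomly reordering the series, which destroys any temporal ordering. Assumes numpy and scipy.

```python
import numpy as np
from scipy import stats

def mc_trend_test(series, n_mc=2000, seed=0):
    """Monte Carlo significance of a monotonic trend, using Kendall's tau
    of the series against time as the test statistic."""
    rng = np.random.default_rng(seed)
    t = np.arange(len(series))
    tau_obs, _ = stats.kendalltau(t, series)
    hits = 0
    for _ in range(n_mc):
        # null: reorder the series at random, removing any trend
        tau, _ = stats.kendalltau(t, rng.permutation(series))
        hits += abs(tau) >= abs(tau_obs)
    return tau_obs, (hits + 1) / (n_mc + 1)

# Synthetic monthly heavy-precipitation series with a weak upward trend
rng = np.random.default_rng(1)
y = 50 + 0.3 * np.arange(60) + rng.gumbel(0, 8, size=60)
tau, p = mc_trend_test(y)
print(f"Kendall tau = {tau:.3f}, Monte Carlo p-value = {p:.4f}")
```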

  4. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  5. Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.

    Science.gov (United States)

    Deegear, James

    This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and on literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…

  6. Is statistical significance clinically important?--A guide to judge the clinical relevance of study findings

    NARCIS (Netherlands)

    Sierevelt, Inger N.; van Oldenrijk, Jakob; Poolman, Rudolf W.

    2007-01-01

    In this paper we describe several issues that influence the reporting of statistical significance in relation to clinical importance, since misinterpretation of p values is a common issue in orthopaedic literature. Orthopaedic research is tormented by the risks of false-positive (type I error) and

  7. Statistical Significance of the Contribution of Variables to the PCA Solution: An Alternative Permutation Strategy

    Science.gov (United States)

    Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.

    2011-01-01

    In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…

  8. Statistical significance versus clinical importance: trials on exercise therapy for chronic low back pain as example.

    NARCIS (Netherlands)

    van Tulder, M.W.; Malmivaara, A.; Hayden, J.; Koes, B.

    2007-01-01

    STUDY DESIGN. Critical appraisal of the literature. OBJECTIVES. The objective of this study was to assess if results of back pain trials are statistically significant and clinically important. SUMMARY OF BACKGROUND DATA. There seems to be a discrepancy between conclusions reported by authors and

  9. P-Value, a true test of statistical significance? a cautionary note ...

    African Journals Online (AJOL)

    While it was not the intention of the founders of significance testing and hypothesis testing to have the two ideas intertwined as if they were complementary, the inconvenient marriage of the two practices into one coherent, convenient, incontrovertible and misinterpreted practice has dotted our standard statistics textbooks and ...

  10. From classroom to online teaching: experiences in improving statistics education

    Directory of Open Access Journals (Sweden)

    Anne Porter

    2003-01-01

    Full Text Available This study used reflective practitioner methodology to investigate how to improve the quality of statistical education. During the study, this methodology, curricula, pedagogical practices, assessment and a framework for learning to learn statistics were all developed as means of improving the quality of statistical education. Also documented was the move from being a classroom teacher of statistics to a teacher who is developing learning resources for online delivery to students. For a classroom teacher, flexible delivery has meant drawing on the sights, sounds, movement, quiet and live shows. By contrast, the online teacher feels the constraints of translating activity based programs to technologically based programs. As more students have chosen to rely on online materials rather than classroom activities, the focus of improving quality has been extended to the enrichment of online resources, so that the learning experience is not second to that of the classroom.

  11. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
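    As an illustration of the SPC mechanics, the sketch below derives Shewhart X-bar control limits from rational subgroups using the range method; the soap-weight data are simulated (the article's actual data are not reproduced here), and d2 = 2.326 is the standard bias-correction constant for subgroups of size 5.

```python
import numpy as np

def xbar_limits(subgroups):
    """Shewhart X-bar chart limits from rational subgroups (range method)."""
    subgroups = np.asarray(subgroups)
    n = subgroups.shape[1]
    rbar = (subgroups.max(axis=1) - subgroups.min(axis=1)).mean()
    d2 = 2.326                         # bias-correction constant for n = 5
    center = subgroups.mean()
    sigma_hat = rbar / d2              # estimated process standard deviation
    ucl = center + 3 * sigma_hat / np.sqrt(n)
    lcl = center - 3 * sigma_hat / np.sqrt(n)
    return center, lcl, ucl

# 20 simulated subgroups of 5 soap-weight measurements (grams)
rng = np.random.default_rng(7)
data = rng.normal(100.0, 0.8, size=(20, 5))
data[12] += 2.0                        # one simulated assignable cause
center, lcl, ucl = xbar_limits(data)
means = data.mean(axis=1)
flags = [i for i, m in enumerate(means) if m > ucl or m < lcl]
print(f"CL={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, out of control: {flags}")
```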

  12. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating ... the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement to NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis ...

  13. Thresholds for statistical and clinical significance in systematic reviews with meta-analytic methods

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are being insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most...

  14. Training directionally selective motion pathways can significantly improve reading efficiency

    Science.gov (United States)

    Lawton, Teri

    2004-06-01

    This study examined whether perceptual learning at early levels of visual processing would facilitate learning at higher levels of processing. This was examined by determining whether training the motion pathways by practicing left-right movement discrimination, as found previously, would improve the reading skills of inefficient readers significantly more than another computer game, a word discrimination game, or the reading program offered by the school. This controlled validation study found that practicing left-right movement discrimination (rapidly) for 5-10 minutes twice a week for 15 weeks doubled reading fluency and significantly improved all reading skills by more than one grade level, whereas inefficient readers in the control groups barely improved on these reading skills. In contrast to previous studies of perceptual learning, these experiments show that perceptual learning of direction discrimination significantly improved reading skills determined at higher levels of cognitive processing, thereby being generalized to a new task. The deficits in reading performance and attentional focus experienced by the person who struggles when reading are suggested to result from an information overload, stemming from timing deficits in the direction-selectivity network proposed by Russell De Valois et al. (2000), that go away following practice on direction discrimination.

  15. Educational Statistics and School Improvement. Statistics and the Federal Role in Education.

    Science.gov (United States)

    Hawley, Willis D.

    This paper focuses on how educational statistics might better serve the quest for educational improvement in elementary and secondary schools. A model for conceptualizing the sources and processes of school productivity is presented. The Learning Productivity Model suggests that school outcomes are the consequence of the interaction of five…

  16. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    Science.gov (United States)

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data-set management; MS peak identification and indexing; and high-dimensional differential peak analysis with false discovery rate (FDR) control for the concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports large-scale online uploading and analysis of MS data with a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.

  17. Significant Improvement of Catalytic Efficiencies in Ionic Liquids

    International Nuclear Information System (INIS)

    Song, Choong Eui; Yoon, Mi Young; Choi, Doo Seong

    2005-01-01

    The use of ionic liquids as reaction media can confer many advantages upon catalytic reactions over reactions in organic solvents. In ionic liquids, catalysts having polar or ionic character can easily be immobilized without additional structural modification, and thus the ionic solutions containing the catalyst can easily be separated from the reagents and reaction products and then be reused. More interestingly, switching from an organic solvent to an ionic liquid often results in a significant improvement in catalytic performance (e.g., rate acceleration, (enantio)selectivity improvement and an increase in catalyst stability). In this review, some recent interesting results which nicely demonstrate this positive 'ionic liquid effect' on catalysis are discussed

  18. Statistical significance estimation of a signal within the GooFit framework on GPUs

    Directory of Open Access Journals (Sweden)

    Cristella Leonardo

    2017-01-01

    Full Text Available In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in the ROOT/RooFit and the GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.
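    The essence of a toy Monte Carlo significance estimate can be conveyed with a drastically simplified counting-experiment version. GooFit performs unbinned likelihood fits on GPUs; the sketch below only simulates background-only pseudo-experiments with numpy/scipy and converts the tail probability to a Gaussian significance, with invented event counts.

```python
import numpy as np
from scipy import stats

def toy_mc_significance(n_obs, b_expected, n_toys=1_000_000, seed=0):
    """Toy-MC p-value for observing n_obs events over an expected
    background b_expected, converted to a one-sided significance (Z)."""
    rng = np.random.default_rng(seed)
    toys = rng.poisson(b_expected, n_toys)   # background-only pseudo-experiments
    p = (np.sum(toys >= n_obs) + 1) / (n_toys + 1)
    z = stats.norm.isf(p)                    # Gaussian-equivalent sigma
    return p, z

# Invented counting experiment: 35 events seen over an expected background of 20
p, z = toy_mc_significance(n_obs=35, b_expected=20.0)
print(f"p-value = {p:.2e}  ->  {z:.2f} sigma")
```

Note that with a fixed number of toys the smallest resolvable p-value is about 1/n_toys, which is exactly why high-statistics toy generation benefits so much from GPU parallelism.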

  19. Statistical significance of theoretical predictions: A new dimension in nuclear structure theories (I)

    International Nuclear Information System (INIS)

    DUDEK, J; SZPAK, B; FORNAL, B; PORQUET, M-G

    2011-01-01

    In this and the follow-up article we briefly discuss what we believe represents one of the most serious problems in contemporary nuclear structure: the question of the statistical significance of parametrizations of nuclear microscopic Hamiltonians and the implied predictive power of the underlying theories. In the present Part I, we introduce the main lines of reasoning of the so-called Inverse Problem Theory, an important sub-field of contemporary applied mathematics, illustrated here with the example of the Nuclear Mean-Field Approach.

  20. Bedtime Blood Pressure Chronotherapy Significantly Improves Hypertension Management.

    Science.gov (United States)

    Hermida, Ramón C; Ayala, Diana E; Fernández, José R; Mojón, Artemio; Crespo, Juan J; Ríos, María T; Smolensky, Michael H

    2017-10-01

    Consistent evidence from numerous studies substantiates that the asleep blood pressure (BP) mean derived from ambulatory BP monitoring (ABPM) is both an independent and a stronger predictor of cardiovascular disease (CVD) risk than daytime clinic BP measurements or the ABPM-determined awake or 24-hour BP means. Hence, cost-effective adequate control of sleep-time BP is of marked clinical relevance. Timing the ingestion of hypertension medications of 6 different classes and their combinations according to circadian rhythms significantly improves BP control, particularly sleep-time BP, and reduces adverse effects. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. [Improvement of Phi bodies stain and its clinical significance].

    Science.gov (United States)

    Gong, Xu-Bo; Lu, Xing-Guo; Yan, Li-Juan; Xiao, Xi-Bin; Wu, Dong; Xu, Gen-Bo; Zhang, Xiao-Hong; Zhao, Xiao-Ying

    2009-02-01

    The aim of this study was to improve the staining method for hydroperoxidase (HPO), to analyze the morphologic features of Phi bodies and to evaluate the clinical application of this method. 128 bone marrow or peripheral blood smears from patients with myeloid and lymphoid malignancies were stained with the improved HPO stain, and Phi bodies were observed to determine their detection rates in different leukemias. 69 acute myeloid leukemia (AML) specimens were chosen randomly, and the positive rate and number of Phi bodies were compared between the improved HPO stain and the POX stain based on the same substrate, 3,3'-diaminobenzidine. The results showed that the shape of bundle-like Phi bodies was variable, long or short, while the nubbly Phi bodies often presented as oval and smooth. Club-like Phi bodies were found in M(3). The detection rates of bundle-like Phi bodies in AML M(1)-M(5) were 42.9% (6/14), 83.3% (15/18), 92.0% (23/25), 52.3% (11/21), 33.3% (5/15) respectively, and those of nubbly Phi bodies were 28.6% (4/14), 66.7% (12/18), 11.1% (3/25), 33.3% (7/21), 20.0% (3/15) respectively. The detection rate of bundle-like Phi bodies in M(3) was significantly higher than that in the (M(1) + M(2)) or (M(4) + M(5)) groups. The detection rate of nubbly Phi bodies in the (M(1) + M(2)) group was higher than that in the M(3) group. In conclusion, with the improved staining method the HPO stain becomes simple, the detection rate of Phi bodies is higher than with the previous method, the positive granules are more obvious, and the results are stable. This improved method plays an important role in differentiating AML from ALL, subtyping AML, and evaluating therapeutic results.

  2. Confidence Intervals: From tests of statistical significance to confidence intervals, range hypotheses and substantial effects

    Directory of Open Access Journals (Sweden)

    Dominic Beaulieu-Prévost

    2006-03-01

    Full Text Available For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or not of the null hypothesis. However, more than 300 articles have demonstrated that this method is problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and implausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to a NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power by an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
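    The three-value logic the abstract describes reduces to a simple comparison between the confidence interval and a pre-declared minimal substantial effect, as in this sketch (the threshold and intervals are invented for the example):

```python
def ci_verdict(ci_low, ci_high, minimal_effect):
    """Three-value logic on a confidence interval, given the smallest
    effect size declared substantively meaningful in advance."""
    if ci_low >= minimal_effect:
        return "probable presence of a substantial effect"
    if ci_high < minimal_effect:
        return "probable absence of a substantial effect"
    return "probabilistic undetermination (CI straddles the threshold)"

# Standardized mean differences with 95% CIs, minimal effect d = 0.2
print(ci_verdict(0.35, 0.80, 0.2))   # probable presence
print(ci_verdict(-0.10, 0.15, 0.2))  # probable absence
print(ci_verdict(0.05, 0.45, 0.2))   # undetermined
```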

  3. Examining reproducibility in psychology : A hybrid method for combining a statistically significant original study and a replication

    NARCIS (Netherlands)

    Van Aert, R.C.M.; Van Assen, M.A.L.M.

    2018-01-01

    The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter

  4. A Note on Comparing the Power of Test Statistics at Low Significance Levels.

    Science.gov (United States)

    Morris, Nathan; Elston, Robert

    2011-01-01

    It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as the level α = 5 × 10^-8, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
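    The point is easy to reproduce numerically. The sketch below compares the power of 1-df and 2-df chi-square tests carrying the same noncentrality (signal strength) at α = 0.05 and at α = 5 × 10^-8, using scipy's noncentral chi-square distribution; the noncentrality value is arbitrary, and varying it shows how the comparison shifts with the alpha level.

```python
from scipy import stats

def chi2_power(alpha, df, ncp):
    """Power of a chi-square test with `df` degrees of freedom at level
    `alpha`, for noncentrality parameter `ncp`."""
    critical = stats.chi2.isf(alpha, df)        # rejection threshold
    return stats.ncx2.sf(critical, df, ncp)     # P(reject | signal)

# Same underlying signal, 1-df vs. 2-df tests, at two alpha levels
for alpha in (0.05, 5e-8):
    p1 = chi2_power(alpha, df=1, ncp=30)
    p2 = chi2_power(alpha, df=2, ncp=30)
    print(f"alpha={alpha:g}: power(1 df)={p1:.3f}, power(2 df)={p2:.3f}")
```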

  5. Statistically significant faunal differences among Middle Ordovician age, Chickamauga Group bryozoan bioherms, central Alabama

    Energy Technology Data Exchange (ETDEWEB)

    Crow, C.J.

    1985-01-01

    Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.

  6. Ceramic Composite Intermediate Temperature Stress-Rupture Properties Improved Significantly

    Science.gov (United States)

    Morscher, Gregory N.; Hurst, Janet B.

    2002-01-01

    Silicon carbide (SiC) composites are considered to be potential materials for future aircraft engine parts such as combustor liners. It is envisioned that on the hot side (inner surface) of the combustor liner, composites will have to withstand temperatures in excess of 1200 °C for thousands of hours in oxidizing environments. This is a severe condition; however, an equally severe, if not more detrimental, condition exists on the cold side (outer surface) of the combustor liner. Here, the temperatures are expected to be on the order of 800 to 1000 °C under high tensile stress because of thermal gradients and attachment of the combustor liner to the engine frame (the hot side will be under compressive stress, a less severe stress state for ceramics). Since these composites are not oxides, they oxidize. The worst form of oxidation for strength reduction occurs at these intermediate temperatures, where the boron nitride (BN) interphase oxidizes first, causing the formation of a glass layer that strongly bonds the fibers to the matrix. When the fibers strongly bond to the matrix or to one another, the composite loses toughness and strength and becomes brittle. To increase the intermediate-temperature stress-rupture properties, researchers must modify the BN interphase. With the support of the Ultra-Efficient Engine Technology (UEET) Program, significant improvements were made as state-of-the-art SiC/SiC composites were developed during the Enabling Propulsion Materials (EPM) program. Three approaches were found to improve the intermediate-temperature stress-rupture properties: fiber spreading, high-temperature silicon (Si)-doped boron nitride (BN), and outside-debonding BN.

  7. Estimates of statistical significance for comparison of individual positions in multiple sequence alignments

    Directory of Open Access Journals (Sweden)

    Sadreyev Ruslan I

    2004-08-01

    Full Text Available Abstract Background: Profile-based analysis of multiple sequence alignments (MSA) allows for accurate comparison of protein families. Here, we address the problems of detecting statistically confident dissimilarities between (1) an MSA position and a set of predicted residue frequencies, and (2) between two MSA positions. These problems are important for (i) evaluation and optimization of methods predicting residue occurrence at protein positions; (ii) detection of potentially misaligned regions in automatically produced alignments and their further refinement; and (iii) detection of sites that determine functional or structural specificity in two related families. Results: For problems (1) and (2), we propose analytical estimates of P-value and apply them to the detection of significant positional dissimilarities in various experimental situations. (a) We compare structure-based predictions of residue propensities at a protein position to the actual residue frequencies in the MSA of homologs. (b) We evaluate our method by the ability to detect erroneous position matches produced by an automatic sequence aligner. (c) We compare MSA positions that correspond to residues aligned by automatic structure aligners. (d) We compare MSA positions that are aligned by high-quality manual superposition of structures. Detected dissimilarities reveal shortcomings of the automatic methods for residue frequency prediction and alignment construction. For the high-quality structural alignments, the dissimilarities suggest sites of potential functional or structural importance. Conclusion: The proposed computational method is of significant potential value for the analysis of protein families.

  8. Determining coding CpG islands by identifying regions significant for pattern statistics on Markov chains.

    Science.gov (United States)

    Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior

    2011-09-23

    Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
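    The exact tail computation the method starts from can be sketched with a small dynamic program over (position, last base, CpG count). The sketch below is a generic illustration with a uniform transition matrix, not the genome-trained chain or the paper's greedy island-selection step; numpy is assumed.

```python
import numpy as np

BASES = "ACGT"
IDX = {b: i for i, b in enumerate(BASES)}

def cpg_count_tail(p_init, trans, length, observed):
    """Exact P(#CpG >= observed) for sequences of `length` generated by a
    first-order Markov chain with initial distribution p_init (4,) and
    transition matrix trans (4, 4)."""
    # prob[b, k] = P(last base is b and k CpGs seen so far)
    prob = np.zeros((4, length))
    for b in range(4):
        prob[b, 0] = p_init[b]
    for _ in range(length - 1):
        new = np.zeros_like(prob)
        for prev in range(4):
            for nxt in range(4):
                # a C -> G step adds one CpG; shift the count axis by `gain`
                gain = 1 if (prev == IDX["C"] and nxt == IDX["G"]) else 0
                new[nxt, gain:] += prob[prev, :length - gain] * trans[prev, nxt]
        prob = new
    count_dist = prob.sum(axis=0)           # marginalize over the last base
    return count_dist[observed:].sum()

# Uniform chain: every transition 0.25, expected CpG count is (L-1)/16
uniform = np.full((4, 4), 0.25)
p = cpg_count_tail(np.full(4, 0.25), uniform, length=200, observed=25)
print(f"P(#CpG >= 25 in 200 bp) = {p:.3e}")
```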

  9. Improved statistical confirmation of margins for setpoints and transients

    International Nuclear Information System (INIS)

    Nutt, W.T.

    2001-01-01

    Framatome ANP Richland, Inc. has developed an integrated, automated, statistical methodology for Pressurized Water Reactors (PWRs). Margins for transients and calculated trips are confirmed using several new applications of probability theory. The methods used for combining statistics reduce the conservatisms inherent in conventional methods and avoid the numerical limitations and time constraints imposed by Monte Carlo techniques. The new methodology represents the state of the art in the treatment of uncertainties for reactor protection systems. It all but eliminates concerns with the calculated trips for PWRs and, by improving the margin for all transients, will allow for far more aggressive peaking limits and fuel management schemes. The automated nature of the bulk of this process saves Framatome ANP time and effort, minimizes the potential for errors and makes the analysis for all cycles and plants consistent. The enhanced margins remove analytical limitations from the customer and allow for more economical operation of the plant. (authors)

  10. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Advanced statistics to improve the physical interpretation of atomization processes

    International Nuclear Information System (INIS)

    Panão, Miguel R.O.; Radu, Lucian

    2013-01-01

    Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are eventually associated with multiple atomization mechanisms. ► The spray is described by the drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, whose best fitting is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account eventual multimodality and heterogeneities in drop size distributions. Therefore, it provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in the heterogeneous data. This improves the physical interpretation of atomization processes. Moreover, it also overcomes the limitations induced by analyzing spray droplet characteristics through moments alone, particularly the hindering of different natures of droplet formation. Finally, the method is applied to physically interpret a case-study based on multijet atomization processes

  12. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry.

    Science.gov (United States)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo

    2018-06-05

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of sequenced microbes is complicating correct microbial identification, even in a simple sample, due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 publicly available MS/MS data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html .

  13. Can We Use Polya’s Method to Improve Students’ Performance in the Statistics Classes?

    Directory of Open Access Journals (Sweden)

    Indika Wickramasinghe

    2015-01-01

    Full Text Available In this study, Polya's problem-solving method was introduced in a statistics class in an effort to enhance students' performance. The method was taught in one of two introductory-level statistics classes taught by the same instructor, and a comparison was made between the performances of the two classes. The results indicate a significant improvement in the students' performance in the class in which Polya's method was introduced.

  14. Indirectional statistics and the significance of an asymmetry discovered by Birch

    International Nuclear Information System (INIS)

    Kendall, D.G.; Young, G.A.

    1984-01-01

    Birch (1982, Nature, 298, 451) reported an apparent 'statistical asymmetry of the Universe'. The authors here develop 'indirectional analysis' as a technique for investigating statistical effects of this kind and conclude that the reported effect (whatever may be its origin) is strongly supported by the observations. The estimated pole of the asymmetry is at RA 13h 30m, Dec. -37deg. The angular error in its estimation is unlikely to exceed 20-30deg. (author)

  15. Confounding and Statistical Significance of Indirect Effects: Childhood Adversity, Education, Smoking, and Anxious and Depressive Symptomatology

    Directory of Open Access Journals (Sweden)

    Mashhood Ahmed Sheikh

    2017-08-01

    mediate the association between childhood adversity and ADS in adulthood. However, when education was excluded as a mediator-response confounding variable, the indirect effect of childhood adversity on ADS in adulthood was statistically significant (p < 0.05). This study shows that a careful inclusion of potential confounding variables is important when assessing mediation.

  16. Methylphenidate significantly improves declarative memory functioning of adults with ADHD.

    NARCIS (Netherlands)

    Verster, J.C.; Bekker, E.M.; Kooij, J.J.; Buitelaar, J.K.; Verbaten, M.N.; Volkerts, E.R.; Olivier, B.

    2010-01-01

    BACKGROUND: Declarative memory deficits are common in untreated adults with attention-deficit hyperactivity disorder (ADHD), but limited evidence exists to support improvement after treatment with methylphenidate. The objective of this study was to examine the effects of methylphenidate on memory

  17. Improving GEFS Weather Forecasts for Indian Monsoon with Statistical Downscaling

    Science.gov (United States)

    Agrawal, Ankita; Salvi, Kaustubh; Ghosh, Subimal

    2014-05-01

    established between the principal components and observed rainfall over the training period, and predictions are obtained for the testing period. The validations show large improvements in the correlation coefficient between observed and predicted data (0.25 to 0.55). The results speak in favour of the statistical downscaling methodology, which shows the capability to reduce the gap between observed data and predictions. A detailed study applying different downscaling techniques is required to quantify the improvements in predictions.

  18. Evaluation of significantly modified water bodies in Vojvodina by using multivariate statistical techniques

    Directory of Open Access Journals (Sweden)

    Vujović Svetlana R.

    2013-01-01

    Full Text Available This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to getting better information about water quality and the design of a monitoring network for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water quality data set of natural water bodies obtained during the 2010 year of monitoring of 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, summing more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounts for 28% of the total variance and represents the hydrochemical dimension of water quality. The second factor, F2, accounts for 18% of the total variance and may be taken as a factor of water eutrophication. The third factor, F3, accounts for 17% of the total variance and represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounts for 13% of the total variance and may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
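    A compact sketch of the FA/PCA part of such an analysis, with synthetic data standing in for the 33 sites × 13 parameters and numpy only: eigen-decompose the correlation matrix, keep factors with eigenvalue > 1, and flag loadings above the 0.75 "strong" threshold.

```python
import numpy as np

def pca_factors(X):
    """PCA on the correlation matrix; loadings are eigenvectors scaled by
    sqrt(eigenvalue), as is conventional in factor analysis."""
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]          # sort factors by variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    return eigvals, eigvecs * np.sqrt(eigvals)

# Synthetic stand-in: 33 sites x 13 water-quality parameters, 4 latent factors
rng = np.random.default_rng(3)
X = rng.normal(size=(33, 4)) @ rng.normal(size=(4, 13)) \
    + rng.normal(scale=0.5, size=(33, 13))
eigvals, loadings = pca_factors(X)
kept = eigvals > 1                             # Kaiser criterion, as in the study
print(f"factors retained: {kept.sum()}")
print(f"variance explained: {eigvals[kept].sum() / eigvals.sum():.0%}")
print("parameters loading strongly (>0.75) on F1:",
      np.where(np.abs(loadings[:, 0]) > 0.75)[0].tolist())
```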

  19. Accelerator driven reactors, - the significance of the energy distribution of spallation neutrons on the neutron statistics

    Energy Technology Data Exchange (ETDEWEB)

    Fhager, V

    2000-01-01

In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We approach the requested results in two steps. First, models of the number dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy

  20. Accelerator driven reactors, - the significance of the energy distribution of spallation neutrons on the neutron statistics

    International Nuclear Information System (INIS)

    Fhager, V.

    2000-01-01

In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We approach the requested results in two steps. First, models of the number dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy

  1. Creating a Middle Grades Environment that Significantly Improves Student Achievement

    Science.gov (United States)

    L'Esperance, Mark E.; Lenker, Ethan; Bullock, Ann; Lockamy, Becky; Mason, Cathy

    2013-01-01

    This article offers an overview of the framework that Sampson County Public Schools (North Carolina) used to critically reflect on the current state of their middle grades schools. The article also highlights the changes that resulted from the district-wide analysis and the ways in which these changes led to a significant increase in the academic…

  2. Robust statistical methods for significance evaluation and applications in cancer driver detection and biomarker discovery

    DEFF Research Database (Denmark)

    Madsen, Tobias

    2017-01-01

    In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...

  3. Low-dose vaporized cannabis significantly improves neuropathic pain.

    Science.gov (United States)

    Wilsey, Barth; Marcotte, Thomas; Deutsch, Reena; Gouaux, Ben; Sakai, Staci; Donaghe, Haylee

    2013-02-01

    We conducted a double-blind, placebo-controlled, crossover study evaluating the analgesic efficacy of vaporized cannabis in subjects, the majority of whom were experiencing neuropathic pain despite traditional treatment. Thirty-nine patients with central and peripheral neuropathic pain underwent a standardized procedure for inhaling medium-dose (3.53%), low-dose (1.29%), or placebo cannabis with the primary outcome being visual analog scale pain intensity. Psychoactive side effects and neuropsychological performance were also evaluated. Mixed-effects regression models demonstrated an analgesic response to vaporized cannabis. There was no significant difference between the 2 active dose groups' results (P > .7). The number needed to treat (NNT) to achieve 30% pain reduction was 3.2 for placebo versus low-dose, 2.9 for placebo versus medium-dose, and 25 for medium- versus low-dose. As these NNTs are comparable to those of traditional neuropathic pain medications, cannabis has analgesic efficacy with the low dose being as effective a pain reliever as the medium dose. Psychoactive effects were minimal and well tolerated, and neuropsychological effects were of limited duration and readily reversible within 1 to 2 hours. Vaporized cannabis, even at low doses, may present an effective option for patients with treatment-resistant neuropathic pain. The analgesia obtained from a low dose of delta-9-tetrahydrocannabinol (1.29%) in patients, most of whom were experiencing neuropathic pain despite conventional treatments, is a clinically significant outcome. In general, the effect sizes on cognitive testing were consistent with this minimal dose. As a result, one might not anticipate a significant impact on daily functioning. Published by Elsevier Inc.
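
    The NNT figures follow from simple arithmetic: NNT is the reciprocal of the absolute difference in response rates between two arms. A tiny sketch, with hypothetical 30%-pain-reduction response rates chosen only so that the result reproduces an NNT near 3.2:

        def nnt(p_treat: float, p_control: float) -> float:
            """Number needed to treat: reciprocal of the absolute risk difference."""
            return 1.0 / (p_treat - p_control)

        # e.g. 57% vs 26% of patients achieving a 30% pain reduction (made-up rates)
        print(round(nnt(0.57, 0.26), 1))   # -> 3.2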

  4. Low Dose Vaporized Cannabis Significantly Improves Neuropathic Pain

    Science.gov (United States)

    Wilsey, Barth; Marcotte, Thomas D.; Deutsch, Reena; Gouaux, Ben; Sakai, Staci; Donaghe, Haylee

    2013-01-01

    We conducted a double-blind, placebo-controlled, crossover study evaluating the analgesic efficacy of vaporized cannabis in subjects, the majority of whom were experiencing neuropathic pain despite traditional treatment. Thirty-nine patients with central and peripheral neuropathic pain underwent a standardized procedure for inhaling either medium dose (3.53%), low dose (1.29%), or placebo cannabis with the primary outcome being VAS pain intensity. Psychoactive side-effects, and neuropsychological performance were also evaluated. Mixed effects regression models demonstrated an analgesic response to vaporized cannabis. There was no significant difference between the two active dose groups’ results (p>0.7). The number needed to treat (NNT) to achieve 30% pain reduction was 3.2 for placebo vs. low dose, 2.9 for placebo vs. medium dose, and 25 for medium vs. low dose. As these NNT are comparable to those of traditional neuropathic pain medications, cannabis has analgesic efficacy with the low dose being, for all intents and purposes, as effective a pain reliever as the medium dose. Psychoactive effects were minimal and well-tolerated, and neuropsychological effects were of limited duration and readily reversible within 1–2 hours. Vaporized cannabis, even at low doses, may present an effective option for patients with treatment-resistant neuropathic pain. PMID:23237736

  5. Statistical Analysis and Evaluation of the Depth of the Ruts on Lithuanian State Significance Roads

    Directory of Open Access Journals (Sweden)

    Erinijus Getautis

    2011-04-01

Full Text Available The aim of this work is to gather information about the rut depth on national flexible-pavement roads, to determine its statistical dispersion indices and to assess its compliance with the applicable requirements. An analysis of scientific works on the appearance of ruts in asphalt and their influence on driving is presented in this work. Dynamic models of ruts in asphalt are presented in the work as well. Experimental data on the dispersion of rut depth on the Lithuanian national highway Vilnius-Kaunas are presented. Conclusions are formulated and presented. Article in Lithuanian

  6. Statistical determination of significant curved I-girder bridge seismic response parameters

    Science.gov (United States)

    Seo, Junwon

    2013-06-01

Curved steel bridges are commonly used at interchanges in transportation networks and more of these structures continue to be designed and built in the United States. Though the use of these bridges continues to increase in locations that experience high seismicity, the effects of curvature and other parameters on their seismic behaviors have been neglected in current risk assessment tools. These tools can evaluate the seismic vulnerability of a transportation network using fragility curves. One critical component of fragility curve development for curved steel bridges is the completion of sensitivity analyses that help identify influential parameters related to their seismic response. In this study, an accessible inventory of existing curved steel girder bridges located primarily in the Mid-Atlantic United States (MAUS) was used to establish statistical characteristics used as inputs for a seismic sensitivity study. Critical seismic response quantities were captured using 3D nonlinear finite element models. Influential parameters from these quantities were identified using statistical tools that incorporate experimental Plackett-Burman Design (PBD), which included Pareto optimal plots and prediction profiler techniques. The findings revealed that the influential parameters included the number of spans, radius of curvature, maximum span length, girder spacing, and cross-frame spacing. These parameters showed varying levels of influence on the critical bridge response.
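
    A hedged sketch of a Plackett-Burman screening analysis of the kind described: an 8-run, two-level design with main effects ranked for a Pareto-style reading. The factor names, true sensitivities and the response are invented for illustration, not taken from the study:

        import numpy as np

        row = np.array([1, 1, 1, -1, 1, -1, -1])              # standard 8-run PB generator
        design = np.array([np.roll(row, i) for i in range(7)])
        design = np.vstack([design, -np.ones(7, dtype=int)])  # all-minus run -> 8 runs x 7 factors

        factors = ["n_spans", "radius", "max_span", "girder_sp", "xframe_sp", "dummy6", "dummy7"]
        rng = np.random.default_rng(9)
        true_eff = np.array([0.1, 1.5, 0.9, 0.3, 0.6, 0.0, 0.0])   # made-up sensitivities
        response = design @ true_eff + rng.normal(0, 0.2, 8)       # e.g. a peak drift quantity

        effects = design.T @ response / 4                  # main effect = mean(+) - mean(-)
        for name, eff in sorted(zip(factors, effects), key=lambda t: -abs(t[1])):
            print(f"{name:10s} {eff:+.2f}")                # largest |effect| first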

  7. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
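
    As a sketch of the contrast between the MSE criterion and an ITL-style criterion, the following fits a linear model both by least squares and by maximum correntropy (one Gaussian-kernel ITL criterion) on synthetic heavy-tailed errors. The data, kernel width and learning rate are illustrative assumptions, not the report's implementation:

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 3))                      # stand-in NWP-derived features
        y = X @ np.array([0.5, -0.2, 0.8]) + 0.1 * rng.standard_t(df=2, size=500)

        w_mse = np.linalg.lstsq(X, y, rcond=None)[0]       # classical MSE fit

        w = np.zeros(3)                                    # maximum-correntropy fit
        sigma, lr = 0.5, 0.1
        for _ in range(500):
            e = y - X @ w
            k = np.exp(-e**2 / (2 * sigma**2))             # Gaussian kernel downweights outliers
            w += lr * (X.T @ (k * e)) / (len(y) * sigma**2)

        print("MSE weights:        ", np.round(w_mse, 2))
        print("correntropy weights:", np.round(w, 2))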

  8. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    Science.gov (United States)

    Gourrion, J.; Szekely, T.

    2016-02-01

Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit the next version of the Copernicus CMEMS dataset.
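
    The core QC rule reduces to a comparison against local historical extremes. A minimal sketch, with hypothetical per-cell extremes and observations (the operational method's regional reference statistics are more elaborate):

        import numpy as np

        def qc_flags(obs, hist_min, hist_max):
            """True where an observation lies outside the local historical extremes."""
            return (obs < hist_min) | (obs > hist_max)

        # hypothetical historical extremes for three grid cells, and one profile's values
        hist_min = np.array([1.8, 2.0, 2.2])
        hist_max = np.array([14.5, 13.9, 12.8])
        obs = np.array([3.1, 15.7, 2.2])

        print(qc_flags(obs, hist_min, hist_max))   # -> [False  True False]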

  9. Development of free statistical software enabling researchers to calculate confidence levels, clinical significance curves and risk-benefit contours

    International Nuclear Information System (INIS)

    Shakespeare, T.P.; Mukherjee, R.K.; Gebski, V.J.

    2003-01-01

Confidence levels, clinical significance curves, and risk-benefit contours are tools improving analysis of clinical studies and minimizing misinterpretation of published results; however, no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has 4 spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data, and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any
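
    A hedged Python re-expression of the "confidence level" idea (not the authors' spreadsheet): the one-sided confidence that Treatment A beats Treatment B by at least a user-specified margin, computed from a 2-by-2 table via a normal approximation to the difference in proportions:

        from math import sqrt
        from statistics import NormalDist

        def confidence_a_better(sA, nA, sB, nB, delta=0.0):
            """Confidence that A's success rate exceeds B's by at least `delta`."""
            pA, pB = sA / nA, sB / nB
            se = sqrt(pA * (1 - pA) / nA + pB * (1 - pB) / nB)
            return NormalDist().cdf((pA - pB - delta) / se)

        # e.g. 60/100 vs 50/100 survivors: confidence A is better at all, and by >= 10%
        print(round(confidence_a_better(60, 100, 50, 100), 3))        # ~0.92
        print(round(confidence_a_better(60, 100, 50, 100, 0.10), 3))  # ~0.50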

  10. An improved Fuzzy Kappa statistic that accounts for spatial autocorrelation

    NARCIS (Netherlands)

    Hagen - Zanker, A.H.

    2009-01-01

    The Fuzzy Kappa statistic expresses the agreement between two categorical raster maps. The statistic goes beyond cell-by-cell comparison and gives partial credit to cells based on the categories found in the neighborhood. When matching categories are found at shorter distances the agreement is

  11. The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data

    Science.gov (United States)

    Gregory, Kelvin

    2006-01-01

    The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…

  12. Improved statistical method for temperature and salinity quality control

    Science.gov (United States)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.

  13. Childhood-compared to adolescent-onset bipolar disorder has more statistically significant clinical correlates.

    Science.gov (United States)

    Holtzman, Jessica N; Miller, Shefali; Hooshmand, Farnaz; Wang, Po W; Chang, Kiki D; Hill, Shelley J; Rasgon, Natalie L; Ketter, Terence A

    2015-07-01

The strengths and limitations of considering childhood- and adolescent-onset bipolar disorder (BD) separately versus together remain to be established. We assessed this issue. BD patients referred to the Stanford Bipolar Disorder Clinic during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD Affective Disorders Evaluation. Patients with childhood- and adolescent-onset were compared to those with adult-onset for 7 unfavorable bipolar illness characteristics with replicated associations with early-onset patients. Among 502 BD outpatients, those with childhood- (<13 years, N=110) and adolescent- (13-18 years, N=218) onset had significantly higher rates for 4/7 unfavorable illness characteristics, including lifetime comorbid anxiety disorder, at least ten lifetime mood episodes, lifetime alcohol use disorder, and prior suicide attempt, than those with adult-onset (>18 years, N=174). Childhood- but not adolescent-onset BD patients also had significantly higher rates of first-degree relative with mood disorder, lifetime substance use disorder, and rapid cycling in the prior year. Patients with pooled childhood/adolescent- compared to adult-onset had significantly higher rates for 5/7 of these unfavorable illness characteristics, while patients with childhood- compared to adolescent-onset had significantly higher rates for 4/7 of these unfavorable illness characteristics. The Caucasian, insured, suburban, low substance abuse, American specialty clinic-referred sample limits generalizability. Onset age is based on retrospective recall. Childhood- compared to adolescent-onset BD was more robustly related to unfavorable bipolar illness characteristics, so pooling these groups attenuated such relationships. Further study is warranted to determine the extent to which adolescent-onset BD represents an intermediate phenotype between childhood- and adult-onset BD. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
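
    The point can be made concrete with an exact (Clopper-Pearson) upper confidence bound: even two observed errors pin down the error probability quite tightly. The trial count below is a hypothetical illustration:

        from scipy.stats import beta

        # 2 decoding errors observed in 10 million decoding trials (made-up numbers)
        k, n = 2, 10**7
        # exact one-sided upper bound on the true error probability
        upper95 = beta.ppf(0.95, k + 1, n - k)
        print(f"95% upper bound: {upper95:.2e}")   # about 6.3e-07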

  15. Improving statistical reasoning: theoretical models and practical implications

    National Research Council Canada - National Science Library

    Sedlmeier, Peter

    1999-01-01

Statistical literacy, the art of drawing reasonable inferences from an abundance of numbers provided daily by...

  16. Improved custom statistics visualization for CA Performance Center data

    CERN Document Server

    Talevi, Iacopo

    2017-01-01

The main goal of my project is to understand and experiment with the possibilities that CA Performance Center (CA PC) offers for creating custom applications to display stored information through interesting visual means, such as maps. In particular, I have re-written some of the network statistics web pages so that they fetch data from the new statistics modules in CA PC, which has its own API, and stop using the RRD data.

  17. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  18. Statistical Significance of the Maximum Hardness Principle Applied to Some Selected Chemical Reactions.

    Science.gov (United States)

    Saha, Ranajit; Pan, Sudip; Chattaraj, Pratim K

    2016-11-05

    The validity of the maximum hardness principle (MHP) is tested in the cases of 50 chemical reactions, most of which are organic in nature and exhibit anomeric effect. To explore the effect of the level of theory on the validity of MHP in an exothermic reaction, B3LYP/6-311++G(2df,3pd) and LC-BLYP/6-311++G(2df,3pd) (def2-QZVP for iodine and mercury) levels are employed. Different approximations like the geometric mean of hardness and combined hardness are considered in case there are multiple reactants and/or products. It is observed that, based on the geometric mean of hardness, while 82% of the studied reactions obey the MHP at the B3LYP level, 84% of the reactions follow this rule at the LC-BLYP level. Most of the reactions possess the hardest species on the product side. A 50% null hypothesis is rejected at a 1% level of significance.

  19. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  20. Sigsearch: a new term for post hoc unplanned search for statistically significant relationships with the intent to create publishable findings.

    Science.gov (United States)

    Hashim, Muhammad Jawad

    2010-09-01

    Post-hoc secondary data analysis with no prespecified hypotheses has been discouraged by textbook authors and journal editors alike. Unfortunately no single term describes this phenomenon succinctly. I would like to coin the term "sigsearch" to define this practice and bring it within the teaching lexicon of statistics courses. Sigsearch would include any unplanned, post-hoc search for statistical significance using multiple comparisons of subgroups. It would also include data analysis with outcomes other than the prespecified primary outcome measure of a study as well as secondary data analyses of earlier research.

  1. Improved air ventilation rate estimation based on a statistical model

    International Nuclear Information System (INIS)

    Brabec, M.; Jilek, K.

    2004-01-01

    A new approach to air ventilation rate estimation from CO measurement data is presented. The approach is based on a state-space dynamic statistical model, allowing for quick and efficient estimation. Underlying computations are based on Kalman filtering, whose practical software implementation is rather easy. The key property is the flexibility of the model, allowing various artificial regimens of CO level manipulation to be treated. The model is semi-parametric in nature and can efficiently handle time-varying ventilation rate. This is a major advantage, compared to some of the methods which are currently in practical use. After a formal introduction of the statistical model, its performance is demonstrated on real data from routine measurements. It is shown how the approach can be utilized in a more complex situation of major practical relevance, when time-varying air ventilation rate and radon entry rate are to be estimated simultaneously from concurrent radon and CO measurements
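
    A minimal sketch of the state-space idea: during a decay phase, log-concentration falls with a slope equal to minus the ventilation rate, so a local-linear-trend Kalman filter can track a time-varying rate. All data, noise levels and tuning below are hypothetical, not the paper's model:

        import numpy as np

        rng = np.random.default_rng(3)
        dt, n = 1.0, 120
        k_true = 0.5 + 0.2 * np.sin(np.linspace(0, 3, n))        # slowly varying rate (1/h)
        logC = np.cumsum(-k_true * dt) + rng.normal(0, 0.02, n)  # noisy log-concentration

        x = np.array([logC[0], -0.5])                 # state: [ln C, slope]
        P = np.eye(2)
        F = np.array([[1.0, dt], [0.0, 1.0]])         # local linear trend transition
        Q = np.diag([1e-6, 1e-4])                     # slope drift -> time-varying rate
        H = np.array([[1.0, 0.0]])
        R = 0.02**2

        k_est = []
        for z in logC:
            x, P = F @ x, F @ P @ F.T + Q                      # predict
            S = H @ P @ H.T + R
            K = P @ H.T / S
            x = x + (K * (z - H @ x)).ravel()                  # update with measurement
            P = (np.eye(2) - K @ H) @ P
            k_est.append(-x[1])                                # ventilation rate = -slope

        print("final estimate vs truth:", round(k_est[-1], 2), round(k_true[-1], 2))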

  2. ClusterSignificance: A bioconductor package facilitating statistical analysis of class cluster separations in dimensionality reduced data

    DEFF Research Database (Denmark)

    Serviss, Jason T.; Gådin, Jesper R.; Eriksson, Per

    2017-01-01

, e.g. genes in a specific pathway, alone can separate samples into these established classes. Despite this, the evaluation of class separations is often subjective and performed via visualization. Here we present the ClusterSignificance package; a set of tools designed to assess the statistical significance of class separations downstream of dimensionality reduction algorithms. In addition, we demonstrate the design and utility of the ClusterSignificance package and utilize it to determine the importance of long non-coding RNA expression in the identity of multiple hematological malignancies.
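
    ClusterSignificance itself is an R/Bioconductor package; the following is a hedged Python analogue of the underlying idea only: score how well known class labels separate in a dimensionality-reduced space, then compare that score against a label-permutation null:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.metrics import silhouette_score

        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(0, 1, (30, 50)),      # class 0 expression profiles (made up)
                       rng.normal(1, 1, (30, 50))])     # class 1, shifted
        labels = np.array([0] * 30 + [1] * 30)

        Z = PCA(n_components=2).fit_transform(X)        # dimensionality reduction step
        observed = silhouette_score(Z, labels)          # separation score for true labels

        null = [silhouette_score(Z, rng.permutation(labels)) for _ in range(999)]
        p = (1 + sum(s >= observed for s in null)) / 1000
        print(f"separation score {observed:.2f}, permutation p = {p:.3f}")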

  3. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  4. Using robust statistics to improve neutron activation analysis results

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Ticianelli, Regina B.; Figueiredo, Ana Maria G.

    2011-01-01

Neutron activation analysis (NAA) is an analytical technique where an unknown sample is submitted to a neutron flux in a nuclear reactor, and its elemental composition is calculated by measuring the induced activity produced. By using the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined in several different ways, either using more than one gamma ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentration in one of them was calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element were then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion of both the performance of each statistical tool and the best choice of peaks for each element. (author)
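
    A hedged sketch of one of the robust means named above, the Normalized Residuals technique: the most discrepant value has its uncertainty inflated until its normalized residual falls below an acceptance threshold. The data, threshold and stopping rule below are illustrative, not the paper's exact implementation:

        import numpy as np

        def normalized_residuals_mean(x, u, r_max=1.8, max_iter=50):
            """Weighted mean with iterative uncertainty inflation (illustrative)."""
            x = np.asarray(x, float)
            u = np.asarray(u, float).copy()
            for _ in range(max_iter):
                w = 1.0 / u**2
                W = w.sum()
                mean = (w * x).sum() / W
                r = (x - mean) * np.sqrt(w * W / (W - w))   # normalized residuals
                i = np.argmax(np.abs(r))
                if abs(r[i]) <= r_max:
                    break
                u[i] *= abs(r[i]) / r_max                   # inflate the worst uncertainty
            w = 1.0 / u**2
            return (w * x).sum() / w.sum(), 1.0 / np.sqrt(w.sum())

        x = np.array([10.1, 10.3, 9.9, 12.8])   # one discrepant peak result (made up)
        u = np.array([0.2, 0.2, 0.2, 0.3])
        print(normalized_residuals_mean(x, u))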

  5. A history of industrial statistics and quality and efficiency improvement

    NARCIS (Netherlands)

    de Mast, J.; Coleman, S.; Greenfield, T.; Stewardson, D.; Montgomery, D.C.

    2008-01-01

The twentieth century witnessed incredible increases in product quality, while in the same period product prices dropped dramatically. These important improvements in quality and efficiency in industry were the result of innovations in management and engineering. But these developments were

  6. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  7. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  8. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  9. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data.

    Science.gov (United States)

    Kim, Sung-Min; Choi, Yosoon

    2017-06-18

To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
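
    A hedged sketch of the Getis-Ord Gi* computation for point samples, using simple binary distance-band weights with the focal point included (the study's exact weighting scheme is not specified here); the coordinates and contents are simulated:

        import numpy as np

        def gi_star(coords, values, band):
            """Getis-Ord Gi* z-scores with binary distance-band weights (self included)."""
            coords = np.asarray(coords, float)
            x = np.asarray(values, float)
            n = len(x)
            xbar = x.mean()
            s = np.sqrt((x**2).mean() - xbar**2)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
            W = (d <= band).astype(float)          # neighbours within the distance band
            sw = W.sum(axis=1)
            num = W @ x - xbar * sw
            den = s * np.sqrt((n * (W**2).sum(axis=1) - sw**2) / (n - 1))
            return num / den                       # large positive z => hot spot

        rng = np.random.default_rng(5)
        pts = rng.uniform(0, 100, (40, 2))         # hypothetical sample locations (m)
        conc = rng.lognormal(3.0, 0.5, 40)         # hypothetical element contents
        z = gi_star(pts, conc, band=20.0)
        print("hot spot candidates (z > 1.96):", np.where(z > 1.96)[0])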

  10. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data

    Directory of Open Access Journals (Sweden)

    Sung-Min Kim

    2017-06-01

Full Text Available To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.

  11. Thermosensitive Hydrogel Mask Significantly Improves Skin Moisture and Skin Tone; Bilateral Clinical Trial

    Directory of Open Access Journals (Sweden)

    Anna Quattrone

    2017-06-01

Full Text Available Objective: A temperature-sensitive state-changing hydrogel mask was used in this study. Once it comes into contact with the skin and reaches the body temperature, it uniformly and quickly releases the active compounds, which possess moisturizing, anti-oxidant, anti-inflammatory and regenerative properties. Methods: An open label clinical trial was conducted to evaluate the effects of the test product on skin hydration, skin tone and skin ageing. Subjects applied the product to one side of their face and underwent Corneometer® and Chromameter measurements, visual assessment of facial skin ageing and facial photography. All assessments and Self-Perception Questionnaires (SPQ) were performed at baseline, after the first application of the test product and after four applications. Results: After a single treatment we observed an increase in skin moisturisation, an improvement of skin tone/luminosity and a reduction in signs of ageing, all statistically significant. After four applications a further improvement in all measured parameters was recorded. These results were confirmed by the subjects' own perceptions, as reported in the SPQ both after one and four applications. Conclusion: The hydrogel mask tested in this study is very effective in improving skin hydration, skin radiance and luminosity, in encouraging an even skin tone and in reducing skin pigmentation.

  12. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  13. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  14. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  15. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)

  16. Improved Statistical Model Of 10.7-cm Solar Radiation

    Science.gov (United States)

    Vedder, John D.; Tabor, Jill L.

    1993-01-01

Improved mathematical model simulates short-term fluctuations of the flux of 10.7-cm-wavelength solar radiation during a 91-day averaging period. Called the "F10.7 flux", it is important as a measure of solar activity and because it is highly correlated with the ultraviolet radiation that causes fluctuations in the heating and density of the upper atmosphere. The F10.7 flux is easily measurable at the surface of the Earth.

  17. The Importance of Integrating Clinical Relevance and Statistical Significance in the Assessment of Quality of Care--Illustrated Using the Swedish Stroke Register.

    Directory of Open Access Journals (Sweden)

    Anita Lindmark

Full Text Available When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical

  18. The Importance of Integrating Clinical Relevance and Statistical Significance in the Assessment of Quality of Care--Illustrated Using the Swedish Stroke Register.

    Science.gov (United States)

    Lindmark, Anita; van Rompaye, Bart; Goetghebeur, Els; Glader, Eva-Lotta; Eriksson, Marie

    2016-01-01

When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical confidence when
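
    A minimal sketch of the benchmarking rule described: flag a hospital as outlying only when its risk exceeds a clinically chosen benchmark at a required one-sided confidence level. This uses a plain binomial test as a stand-in for the case-mix adjusted comparison; all counts and thresholds are made up:

        from scipy.stats import binomtest

        def outlying(events, n, benchmark_risk, confidence=0.80):
            """True if risk > benchmark at the given one-sided confidence level."""
            res = binomtest(events, n, benchmark_risk, alternative="greater")
            return res.pvalue < 1 - confidence

        pop_risk = 0.30
        benchmark = 1.25 * pop_risk            # clinically relevant deviation: +25%
        print(outlying(55, 120, benchmark))    # True: small hospital, clear excess
        print(outlying(390, 1200, benchmark))  # False: large hospital, below benchmark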

  19. Intelligent system for statistically significant expertise knowledge on the basis of the model of self-organizing nonequilibrium dissipative system

    Directory of Open Access Journals (Sweden)

    E. A. Tatokchin

    2017-01-01

Full Text Available The development of modern educational technologies, driven by the broad introduction of computer testing and the growth of distance education, makes a revision of methods for examining pupils necessary. This work shows the need for a transition to mathematical, subjectivity-free criteria for the examination of knowledge. The article reviews the problems arising in this task and offers approaches to their solution. The greatest attention is paid to the problem of objectively transforming the expert's rated estimates onto the scale of student estimates. In general, the discussion of this question concludes that the solution lies in the creation of specialized intelligent systems. The basis for constructing the intelligent system is a mathematical model of a self-organizing nonequilibrium dissipative system, here a group of students. The article assumes that the dissipative character of the system is provided by the constant influx of new test items from the expert, and the nonequilibrium character by the individual psychological characteristics of the students in the group. As a result, the system must self-organize into stable patterns. Such a pattern makes it possible, relying on large amounts of data, to obtain a statistically significant assessment of a student. To justify the proposed approach, the work presents a statistical analysis of the results of testing a large sample of students (>90). The conclusions from this analysis allowed the development of an intelligent system for statistically significant examination of student performance. It is based on a data clustering algorithm (k-means) over three key parameters. It is shown that this approach allows the creation of a dynamic and objective expertise evaluation.
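
    A hedged sketch of the clustering step named in the abstract, k-means over three parameters; the parameters and data below are invented stand-ins for the student response records:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)
        # three hypothetical per-student parameters:
        # fraction correct, mean response time (s), mean item difficulty
        students = np.column_stack([
            rng.beta(5, 2, 100),
            rng.gamma(4.0, 8.0, 100),
            rng.uniform(0.0, 1.0, 100),
        ])
        Xz = (students - students.mean(0)) / students.std(0)   # scale before clustering

        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xz)
        print("cluster sizes:", np.bincount(km.labels_))
        print("cluster centers (standardized):", np.round(km.cluster_centers_, 2))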

  20. An On-Chip RBC Deformability Checker Significantly Improves Velocity-Deformation Correlation

    Directory of Open Access Journals (Sweden)

    Chia-Hung Dylan Tsai

    2016-10-01

    Full Text Available An on-chip deformability checker is proposed to improve the velocity–deformation correlation for red blood cell (RBC) evaluation. RBC deformability has been found to be related to human diseases, and can be evaluated based on RBC velocity through a microfluidic constriction, as in conventional approaches. The correlation between transit velocity and amount of deformation provides statistical information about RBC deformability. However, such correlations are usually only moderate, or even weak, in practical evaluations due to the limited range of RBC deformation. To solve this issue, we implemented three constrictions of different widths in the proposed checker, so that three different deformation regions can be applied to RBCs. By considering cell responses from the three regions as a whole, we practically extend the range of cell deformation in the evaluation, resolving the issue of the limited range of RBC deformation. RBCs from five volunteer subjects were tested using the proposed checker. The results show that the correlation between cell deformation and transit velocity is significantly improved by the proposed deformability checker. The absolute values of the correlation coefficients are increased from an average of 0.54 to 0.92. The effects of cell size, shape and orientation on the evaluation are discussed according to the experimental results. The proposed checker is expected to be useful for RBC evaluation in medical practice.
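
    The pooling idea, widening the range of deformation to strengthen the velocity-deformation correlation, can be demonstrated numerically. A toy sketch with synthetic data for three constriction widths; all numbers are illustrative, not the paper's measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    r = lambda x, y: np.corrcoef(x, y)[0, 1]

    pooled_d, pooled_v = [], []
    for centre in (2.0, 4.0, 6.0):               # three deformation regions
        d = centre + rng.uniform(-0.5, 0.5, 50)  # narrow range in one constriction
        v = 0.8 * d + rng.normal(0, 0.6, 50)     # velocity ~ deformation + noise
        print(f"region {centre}: r = {r(d, v):.2f}")   # only moderate correlation
        pooled_d.extend(d)
        pooled_v.extend(v)

    # Combining all three regions widens the deformation range, so the same
    # noise level now hides much less of the signal
    print(f"pooled:     r = {r(np.array(pooled_d), np.array(pooled_v)):.2f}")
    ```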

  1. Radiotherapy is associated with significant improvement in local and regional control in Merkel cell carcinoma

    International Nuclear Information System (INIS)

    Kang, Susan H; Haydu, Lauren E; Goh, Robin Yeong Hong; Fogarty, Gerald B

    2012-01-01

    Merkel cell carcinoma (MCC) is a rare tumour of skin. This study is a retrospective audit of patients with MCC from St Vincent’s and Mater Hospital, Sydney, Australia. The aim of this study was to investigate the influence of radiotherapy (RT) on the local and regional control of MCC lesions and the survival of patients with MCC. The databases in anatomical pathology, RT and surgery were searched for patients with a diagnosis of MCC between 1996 and 2007. Patient, tumour and treatment characteristics were collected and analysed. Univariate survival analysis of categorical variables was conducted with the Kaplan-Meier method together with the log-rank test for statistical significance. Continuous variables were assessed using the Cox regression method. Multivariate analysis was performed for significant univariate results. Sixty-seven patients were found. The sixty-two who were stage I-III and were treated with radical intent were analysed; 68% were male. The median age was 74 years. Forty-two cases (68%) were stage I or II, and 20 cases (32%) were stage III. For the subset of 42 stage I and II patients, those that had RT to their primary site had a 2-year local recurrence free survival of 89% compared with 36% for patients not receiving RT (p<0.001). The cumulative 2-year regional recurrence free survival for patients having adjuvant regional RT was 84% compared with 43% for patients not receiving this treatment (p<0.001). Immune status at initial surgery was a significant predictor for overall survival (OS) and MCC-specific survival (MCCSS). In a multivariate analysis combining macroscopic size (mm) and immune status at initial surgery, only immune status remained a significant predictor of overall survival (HR=2.096, 95% CI: 1.002-4.385, p=0.049). RT is associated with significant improvement in local and regional control in Merkel cell carcinoma. Immunosuppression is an important factor in overall survival.
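
    The survival methodology named here, Kaplan-Meier estimation with a log-rank test, can be reproduced with the lifelines package. A minimal sketch on made-up recurrence times, not the study's records.

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(2)
    # Hypothetical months to local recurrence (or censoring), RT vs. no RT
    t_rt, e_rt = rng.exponential(60, 40), rng.integers(0, 2, 40)
    t_no, e_no = rng.exponential(20, 20), rng.integers(0, 2, 20)

    km = KaplanMeierFitter()
    km.fit(t_rt, event_observed=e_rt, label="RT")
    print(km.predict(24))  # estimated 2-year recurrence-free survival, RT group

    res = logrank_test(t_rt, t_no, event_observed_A=e_rt, event_observed_B=e_no)
    print(res.p_value)     # log-rank comparison of the two groups
    ```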

  2. Geometric Positioning Accuracy Improvement of ZY-3 Satellite Imagery Based on Statistical Learning Theory

    Directory of Open Access Journals (Sweden)

    Niangang Jiao

    2018-05-01

    Full Text Available With the increasing demand for high-resolution remote sensing images for mapping and monitoring the Earth’s environment, geometric positioning accuracy improvement plays a significant role in the image preprocessing step. Based on statistical learning theory, we propose a new method to improve the geometric positioning accuracy without ground control points (GCPs). Multi-temporal images from the ZY-3 satellite are tested, and the bias-compensated rational function model (RFM) is applied as the block adjustment model in our experiment. An easy and stable weight strategy and the fast iterative shrinkage-thresholding algorithm (FISTA), which is widely used in the field of compressive sensing, are improved and utilized to define the normal equation matrix and to solve it. Then, the residual errors after traditional block adjustment are acquired and tested with the newly proposed inherent error compensation model based on statistical learning theory. The final results indicate that the geometric positioning accuracy of ZY-3 satellite imagery can be improved greatly with our proposed method.
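
    FISTA itself is compact enough to sketch. The following is the textbook algorithm for an L1-regularised least-squares problem; the paper applies an improved variant to the block-adjustment normal equations, which is not reproduced here.

    ```python
    import numpy as np

    def fista(A, b, lam=0.1, n_iter=200):
        """Minimise 0.5 * ||Ax - b||^2 + lam * ||x||_1 with FISTA."""
        L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
        x = y = np.zeros(A.shape[1])
        t = 1.0
        for _ in range(n_iter):
            grad = A.T @ (A @ y - b)
            z = y - grad / L              # gradient step
            x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0)  # soft threshold
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
            x, t = x_new, t_new
        return x

    A = np.random.default_rng(3).normal(size=(50, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.5, 1.0]
    print(np.round(fista(A, A @ x_true, lam=0.5), 2))  # recovers the sparse vector
    ```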

  3. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    Science.gov (United States)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In order to obtain cassava starch films with mechanical properties improved relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as a filling material of the biofilm. Glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested by the correlation established with the experimental data through the following statistical analysis: the Pareto chart. The modified clay was the factor of greatest statistical significance on the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
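
    For a complete 2³ design, the main effects behind a Pareto analysis are simple contrasts. A minimal sketch with a standard coded design matrix and hypothetical tensile-strength responses (not the paper's data):

    ```python
    import itertools
    import numpy as np

    # Coded levels (-1/+1) for starch, glycerol and modified clay, full 2^3 design
    design = np.array(list(itertools.product([-1, 1], repeat=3)))
    tensile = np.array([5.1, 7.9, 4.2, 6.8, 5.5, 8.4, 4.0, 7.1])  # hypothetical

    factors = ["starch", "glycerol", "clay"]
    # Main effect = mean response at the +1 level minus mean response at -1
    effects = {f: design[:, i] @ tensile / 4 for i, f in enumerate(factors)}

    # Pareto ordering: largest absolute effect first
    for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name:8s} main effect = {eff:+.2f}")
    ```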

  4. National Emergency Preparedness and Response: Improving for Incidents of National Significance

    National Research Council Canada - National Science Library

    Clayton, Christopher M

    2006-01-01

    The national emergency management system needs significant improvement in its contingency planning and in early consolidation of effort and coordination between federal, state, and local agencies...

  5. The distribution of P-values in medical research articles suggested selective reporting associated with statistical significance.

    Science.gov (United States)

    Perneger, Thomas V; Combescure, Christophe

    2017-07-01

    Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P < 0.001). Irregularities noted included an excess of P-values equal to 1 and about twice as many P-values less than 0.05 compared with those more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
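
    The two reference distributions used in this comparison are easy to simulate. A quick sketch contrasting the uniform null with a decreasing alternative, here generated from a shifted-mean one-sided test; purely illustrative.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    n = 100_000

    p_null = rng.uniform(size=n)        # uniform on (0, 1) under the null
    z = rng.normal(loc=1.5, size=n)     # alternative: a true effect is present
    p_alt = 1 - norm.cdf(z)             # one-sided p-values, piled up near 0

    for label, p in [("null", p_null), ("alternative", p_alt)]:
        print(f"{label:11s} share of p < 0.05: {(p < 0.05).mean():.3f}, "
              f"share of p < 0.001: {(p < 0.001).mean():.3f}")
    ```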

  6. Statistically significant dependence of the Xaa-Pro peptide bond conformation on secondary structure and amino acid sequence

    Directory of Open Access Journals (Sweden)

    Leitner Dietmar

    2005-04-01

    Full Text Available Background A reliable prediction of the Xaa-Pro peptide bond conformation would be a useful tool for many protein structure calculation methods. We have analyzed the Protein Data Bank and show that the combined use of sequential and structural information has a predictive value for the assessment of the cis versus trans peptide bond conformation of Xaa-Pro within proteins. For the analysis of the data sets, different statistical methods such as the calculation of the Chou-Fasman parameters and occurrence matrices were used. Furthermore, we analyzed the relationship between the relative solvent accessibility and the relative occurrence of prolines in the cis and in the trans conformation. Results One of the main results of the statistical investigations is the ranking of the secondary structure and sequence information with respect to the prediction of the Xaa-Pro peptide bond conformation. We observed a significant impact of secondary structure information on the occurrence of the Xaa-Pro peptide bond conformation, while the sequence information of amino acids neighboring proline is of little predictive value for the conformation of this bond. Conclusion In this work, we present an extensive analysis of the occurrence of the cis and trans proline conformation in proteins. Based on the data set, we derived patterns and rules for a possible prediction of the proline conformation. Upon adoption of the Chou-Fasman parameters, we are able to derive statistically relevant correlations between the secondary structure of amino acid fragments and the Xaa-Pro peptide bond conformation.
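
    Chou-Fasman-style parameters are normalised frequency ratios, which makes the ranking step easy to mimic. A tiny sketch computing cis propensities from hypothetical counts; the numbers are not the paper's Protein Data Bank statistics.

    ```python
    # Propensity = (share of a context among cis cases) / (its share overall)
    counts = {                    # (secondary structure, conformation): occurrences
        ("helix", "cis"): 12,  ("helix", "trans"): 480,
        ("coil",  "cis"): 88,  ("coil",  "trans"): 920,
    }

    total = sum(counts.values())
    cis_total = sum(v for (ss, c), v in counts.items() if c == "cis")

    for ss in ("helix", "coil"):
        ss_total = counts[(ss, "cis")] + counts[(ss, "trans")]
        propensity = (counts[(ss, "cis")] / cis_total) / (ss_total / total)
        print(f"cis propensity in {ss}: {propensity:.2f}")  # >1 favours cis
    ```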

  7. Improving suicide mortality statistics in Tarragona (Catalonia, Spain) between 2004-2012.

    Science.gov (United States)

    Barbería, Eneko; Gispert, Rosa; Gallo, Belén; Ribas, Gloria; Puigdefàbregas, Anna; Freitas, Adriana; Segú, Elena; Torralba, Pilar; García-Sayago, Francisco; Estarellas, Aina

    2016-07-20

    Monitoring and preventing suicidal behaviour requires, among other data, knowing suicide deaths precisely. They often appear under-reported or misclassified in the official mortality statistics. The aim of this study is to analyse the under-reporting found in the suicide mortality statistics of Tarragona (a province of Catalonia, Spain). The analysis takes into account all suicide deaths that occurred in the Tarragona Area of the Catalan Institute of Legal Medicine and Forensic Sciences (TA-CILMFS) between 2004 and 2012. The sources of information were the death data files of the Catalan Mortality Register, as well as the Autopsies Files of the TA-CILMFS. Suicide rates and socio-demographic profiles were statistically compared between the suicide initially reported and the final one. The mean percentage of non-reported cases in the period was 16.2%, with a minimum percentage of 2.2% in 2005 and a maximum of 26.8% in 2009. The crude mortality rate by suicide rose from 6.6 to 7.9 per 100,000 inhabitants once forensic data were incorporated. Small differences were detected between the socio-demographic profile of the suicide initially reported and the final one. Supplementary information was obtained on the suicide method, which revealed a significant increase in poisoning and suicides involving trains. An exhaustive review of suicide deaths data from forensic sources has led to an improvement in the under-reported statistical information. It also improves the knowledge of the method of suicide and personal characteristics. Copyright © 2016 SEP y SEPB. Publicado por Elsevier España, S.L.U. All rights reserved.

  8. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
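
    The flaw being described, and the classical statistic it concerns, can be seen directly with SciPy, which implements Fisher's method (the ordered joint-tail statistic proposed here is not a SciPy built-in).

    ```python
    from scipy.stats import combine_pvalues

    balanced = [0.20, 0.25, 0.30, 0.35]
    one_tiny = [1e-6, 0.60, 0.70, 0.80]   # a single small p-value dominates

    for pvals in (balanced, one_tiny):
        stat, p = combine_pvalues(pvals, method="fisher")  # -2*sum(log p) ~ chi2(2k)
        print(f"{pvals} -> combined p = {p:.4g}")
    ```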

  9. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    International Nuclear Information System (INIS)

    Xu, Shiyu; Chen, Ying; Lu, Jianping; Zhou, Otto

    2015-01-01

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications

  10. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu [Department of Electrical and Computer Engineering, Southern Illinois University Carbondale, Carbondale, Illinois 62901 (United States); Lu, Jianping; Zhou, Otto [Department of Physics and Astronomy and Curriculum in Applied Sciences and Engineering, University of North Carolina Chapel Hill, Chapel Hill, North Carolina 27599 (United States)

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.

  11. Frameworks for improvement: clinical audit, the plan-do-study-act cycle and significant event audit.

    Science.gov (United States)

    Gillam, Steve; Siriwardena, A Niroshan

    2013-01-01

    This is the first in a series of articles about quality improvement tools and techniques. We explore common frameworks for improvement, including the model for improvement and its application to clinical audit, plan-do-study-act (PDSA) cycles and significant event analysis (SEA), examining the similarities and differences between these and providing examples of each.

  12. Waste Minimization Improvements Achieved Through Six Sigma Analysis Result In Significant Cost Savings

    International Nuclear Information System (INIS)

    Mousseau, Jeffrey D.; Jansen, John R.; Janke, David H.; Plowman, Catherine M.

    2003-01-01

    Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a black belt and yellow belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results.

  13. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Science.gov (United States)

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40%, and five were predicted by M8-MSc in 13%, of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  14. Statistical analysis of textural features for improved classification of oral histopathological images.

    Science.gov (United States)

    Muthu Rama Krishnan, M; Shah, Pratik; Chakraborty, Chandan; Ray, Ajoy K

    2012-04-01

    The objective of this paper is to provide an improved technique which can assist oncopathologists in the correct screening of oral precancerous conditions, especially oral submucous fibrosis (OSF), with significant accuracy on the basis of collagen fibres in the sub-epithelial connective tissue. The proposed scheme is composed of collagen fibre segmentation, textural feature extraction and selection, screening performance enhancement under Gaussian transformation, and finally classification. In this study, collagen fibres are segmented on R, G, B color channels using a back-propagation neural network from 60 normal and 59 OSF histological images, followed by histogram specification to reduce the stain intensity variation. Henceforth, textural features of the collagen area are extracted using fractal approaches, viz., differential box counting and the Brownian motion curve. Feature selection is done using the Kullback-Leibler (KL) divergence criterion, and the screening performance is evaluated based on various statistical tests to confirm Gaussian nature. Here, the screening performance is enhanced under Gaussian transformation of the non-Gaussian features using a hybrid distribution. Moreover, the routine screening is designed based on two statistical classifiers, viz., Bayesian classification and support vector machines (SVM), to classify normal and OSF. It is observed that SVM with a linear kernel function provides better classification accuracy (91.64%) as compared to the Bayesian classifier. The addition of fractal features of collagen under Gaussian transformation improves the Bayesian classifier's performance from 80.69% to 90.75%. Results are studied and discussed.
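
    The classifier comparison reported here follows a standard pattern. A minimal sketch of a linear-kernel SVM with cross-validated accuracy, on synthetic features standing in for the collagen texture measures:

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    # Synthetic stand-ins for fractal texture features: 60 normal, 59 OSF
    X = np.vstack([rng.normal(0.0, 1.0, (60, 4)), rng.normal(1.2, 1.0, (59, 4))])
    y = np.array([0] * 60 + [1] * 59)

    clf = SVC(kernel="linear")
    scores = cross_val_score(clf, X, y, cv=5)  # cross-validated accuracy
    print(f"mean accuracy: {scores.mean():.3f}")
    ```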

  15. Introduction of a Journal Excerpt Activity Improves Undergraduate Students' Performance in Statistics

    Science.gov (United States)

    Rabin, Laura A.; Nutter-Upham, Katherine E.

    2010-01-01

    We describe an active learning exercise intended to improve undergraduate students' understanding of statistics by grounding complex concepts within a meaningful, applied context. Students in a journal excerpt activity class read brief excerpts of statistical reporting from published research articles, answered factual and interpretive questions,…

  16. Improving alignment in Tract-based spatial statistics: evaluation and optimization of image registration

    NARCIS (Netherlands)

    de Groot, Marius; Vernooij, Meike W.; Klein, Stefan; Ikram, M. Arfan; Vos, Frans M.; Smith, Stephen M.; Niessen, Wiro J.; Andersson, Jesper L. R.

    2013-01-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS

  17. Improving alignment in Tract-based spatial statistics : Evaluation and optimization of image registration

    NARCIS (Netherlands)

    De Groot, M.; Vernooij, M.W.; Klein, S.; Arfan Ikram, M.; Vos, F.M.; Smith, S.M.; Niessen, W.J.; Andersson, J.L.R.

    2013-01-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS

  18. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  19. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
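
    Segmented regression for an interrupted time series reduces to a linear model with level-change and trend-change terms. A minimal statsmodels sketch on simulated monthly data with an intervention at month 24; all values are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    t = np.arange(48)
    post = (t >= 24).astype(float)   # indicator for the post-intervention period
    y = 10 + 0.1 * t - 3 * post - 0.2 * post * (t - 24) + rng.normal(0, 1, 48)

    X = sm.add_constant(np.column_stack([t, post, post * (t - 24)]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)  # baseline level, baseline trend, level change, trend change
    ```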

  20. Introduction of e-learning in dental radiology reveals significantly improved results in final examination.

    Science.gov (United States)

    Meckfessel, Sandra; Stühmer, Constantin; Bormann, Kai-Hendrik; Kupka, Thomas; Behrends, Marianne; Matthies, Herbert; Vaske, Bernhard; Stiesch, Meike; Gellrich, Nils-Claudius; Rücker, Martin

    2011-01-01

    Because a traditionally instructed dental radiology lecture course is very time-consuming and labour-intensive, online courseware, including an interactive-learning module, was implemented to support the lectures. The purpose of this study was to evaluate the perceptions of students who have worked with web-based courseware as well as the effect on their results in final examinations. Users (n3 + n4 = 138) had access to the e-program from any networked computer at any time. Two groups (n3 = 71, n4 = 67) had to pass a final exam after using the e-course. Results were compared with two groups (n1 = 42, n2 = 48) who had studied the same content by attending traditional lectures. In addition, a survey of the students was statistically evaluated. Most of the respondents reported a positive attitude towards e-learning and would have appreciated more access to computer-assisted instruction. Two years after initiating the e-course the failure rate in the final examination dropped significantly, from 40% to less than 2%. The very positive response to the e-program and improved test scores demonstrated the effectiveness of our e-course as a learning aid. Interactive modules in step with clinical practice provided learning that is not achieved by traditional teaching methods alone. To what extent staff savings are possible is part of a further study. Copyright © 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  1. Macro-indicators of citation impacts of six prolific countries: InCites data and the statistical significance of trends.

    Directory of Open Access Journals (Sweden)

    Lutz Bornmann

    Full Text Available Using the InCites tool of Thomson Reuters, this study compares normalized citation impact values calculated for China, Japan, France, Germany, United States, and the UK throughout the time period from 1981 to 2010. InCites offers a unique opportunity to study the normalized citation impacts of countries using (i) a long publication window (1981 to 2010), (ii) a differentiation in (broad or more narrow) subject areas, and (iii) allowing for the use of statistical procedures in order to obtain an insightful investigation of national citation trends across the years. Using four broad categories, our results show significantly increasing trends in citation impact values for France, the UK, and especially Germany across the last thirty years in all areas. The citation impact of papers from China is still at a relatively low level (mostly below the world average), but the country follows an increasing trend line. The USA exhibits a stable pattern of high citation impact values across the years. With small impact differences between the publication years, the US trend is increasing in engineering and technology but decreasing in medical and health sciences as well as in agricultural sciences. Similar to the USA, Japan follows increasing as well as decreasing trends in different subject areas, but the variability across the years is small. In most of the years, papers from Japan perform below or approximately at the world average in each subject area.

  2. Improvement of Statistical Typhoon Rainfall Forecasting with ANN-Based Southwest Monsoon Enhancement

    Directory of Open Access Journals (Sweden)

    Tsung-Yi Pan

    2011-01-01

    Full Text Available Typhoon Morakot (2009), with significant southwest monsoon flow, produced a record-breaking rainfall of 2361 mm in 48 hours. This study aims to improve a statistical typhoon rainfall forecasting method used over the mountain region of Taiwan via an artificial neural network based southwest monsoon enhancement (ANNSME) model. Rainfall data collected at two mountain weather stations, ALiShan and YuShan, are analyzed to establish the relation to the southwest monsoon moisture flux, which is calculated over a designated sea area southwest of Taiwan. The results show that the moisture flux, with southwest monsoon flow, transported water vapor during the landfall periods of Typhoons Mindulle, Bilis, Fungwong, Kalmaegi, Haitang and Morakot. Based on the moisture flux, a linear regression is used to identify an effective value of moisture flux as the threshold flux which can enhance mountain rainfall in southwestern Taiwan. In particular, a feedforward neural network (FNN) is applied to estimate the residuals from the linear model, i.e., the differences between rainfalls simulated by a typhoon rainfall climatology model (TRCM) and observations. Consequently, the ANNSME model integrates the effective moisture flux, the linear rainfall model and the FNN for residuals. Even with very limited training cases, our results indicate that the ANNSME model is robust and suitable for improvement of TRCM rainfall prediction. The improved prediction of the total rainfall and of the multiple rainfall peaks is important for emergency operation.
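
    The residual-correction idea, a linear rainfall model plus a neural network fitted to its errors, can be sketched generically. The single moisture-flux predictor and all data below are hypothetical, not the ANNSME configuration.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)
    flux = rng.uniform(0, 1, (200, 1))     # moisture-flux predictor
    rain = 50 * flux[:, 0] + 30 * np.sin(6 * flux[:, 0]) + rng.normal(0, 2, 200)

    linear = LinearRegression().fit(flux, rain)   # first-guess rainfall model
    residual = rain - linear.predict(flux)

    # Feedforward network learns the structure the linear model missed
    fnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                       random_state=0).fit(flux, residual)

    corrected = linear.predict(flux) + fnn.predict(flux)
    print(f"RMSE linear:    {np.sqrt(np.mean(residual ** 2)):.2f}")
    print(f"RMSE corrected: {np.sqrt(np.mean((rain - corrected) ** 2)):.2f}")
    ```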

  3. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

    The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550mA (450–600) vs. 650mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29mSv (2.84–6.02) vs. 5.84mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.

  4. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT.

    Science.gov (United States)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-06-01

    The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.

  5. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. A process control and quality improvement design was used. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that
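
    The control-chart machinery behind such monitoring is simple to compute. A minimal p-chart sketch, centre line and 3-sigma limits for the proportion of epidural patients in severe pain per period; the counts are invented.

    ```python
    import numpy as np

    # Hypothetical monthly counts: patients audited and patients in severe pain
    n = np.array([30, 28, 32, 25, 31, 29, 27, 33])
    fails = np.array([9, 8, 20, 7, 9, 10, 3, 4])

    p = fails / n
    p_bar = fails.sum() / n.sum()               # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)    # per-period standard error
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)

    for i, (pi, u, l) in enumerate(zip(p, ucl, lcl)):
        flag = "special cause" if (pi > u or pi < l) else "common cause"
        print(f"month {i + 1}: p = {pi:.2f}  limits [{l:.2f}, {u:.2f}]  {flag}")
    ```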

  6. Can bias correction and statistical downscaling methods improve the skill of seasonal precipitation forecasts?

    Science.gov (United States)

    Manzanas, R.; Lucero, A.; Weisheimer, A.; Gutiérrez, J. M.

    2018-02-01

    Statistical downscaling methods are popular post-processing tools which are widely used in many sectors to adapt the coarse-resolution biased outputs from global climate simulations to the regional-to-local scale typically required by users. They range from simple and pragmatic Bias Correction (BC) methods, which directly adjust the model outputs of interest (e.g. precipitation) according to the available local observations, to more complex Perfect Prognosis (PP) ones, which indirectly derive local predictions (e.g. precipitation) from appropriate upper-air large-scale model variables (predictors). Statistical downscaling methods have been extensively used and critically assessed in climate change applications; however, their advantages and limitations in seasonal forecasting are not well understood yet. In particular, a key problem in this context is whether they serve to improve the forecast quality/skill of raw model outputs beyond the adjustment of their systematic biases. In this paper we analyze this issue by applying two state-of-the-art BC and two PP methods to downscale precipitation from a multimodel seasonal hindcast in a challenging tropical region, the Philippines. To properly assess the potential added value beyond the reduction of model biases, we consider two validation scores which are not sensitive to changes in the mean (correlation and reliability categories). Our results show that, whereas BC methods maintain or worsen the skill of the raw model forecasts, PP methods can yield significant skill improvement (worsening) in cases for which the large-scale predictor variables considered are better (worse) predicted by the model than precipitation. For instance, PP methods are found to increase (decrease) model reliability in nearly 40% of the stations considered in boreal summer (autumn). Therefore, the choice of a convenient downscaling approach (either BC or PP) depends on the region and the season.
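
    The simplest members of the BC family are easy to show concretely. A minimal empirical quantile-mapping sketch that adjusts a wet-biased model distribution toward observations; this is a generic illustration, not one of the specific methods evaluated in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    obs = rng.gamma(2.0, 5.0, 1000)     # local observed precipitation
    model = rng.gamma(2.0, 8.0, 1000)   # model output with a wet bias

    def quantile_map(x, model_clim, obs_clim):
        """Map raw model values onto observed quantiles (empirical CDF matching)."""
        ranks = np.searchsorted(np.sort(model_clim), x) / len(model_clim)
        return np.quantile(obs_clim, np.clip(ranks, 0, 1))

    raw = model[:5]
    print("raw:      ", np.round(raw, 1))
    print("corrected:", np.round(quantile_map(raw, model, obs), 1))
    ```

    As the paper stresses for BC methods generally, such mapping corrects the distribution but cannot by itself create skill (correlation, reliability) that is absent from the raw forecast.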

  7. Composition-based statistics and translated nucleotide searches: Improving the TBLASTN module of BLAST

    Directory of Open Access Journals (Sweden)

    Schäffer Alejandro A

    2006-12-01

    Full Text Available Background TBLASTN is a mode of operation for BLAST that aligns protein sequences to a nucleotide database translated in all six frames. We present the first description of the modern implementation of TBLASTN, focusing on new techniques that were used to implement composition-based statistics for translated nucleotide searches. Composition-based statistics use the composition of the sequences being aligned to generate more accurate E-values, which allows for a more accurate distinction between true and false matches. Until recently, composition-based statistics were available only for protein-protein searches. They are now available as a command line option for recent versions of TBLASTN and as an option for TBLASTN on the NCBI BLAST web server. Results We evaluate the statistical and retrieval accuracy of the E-values reported by a baseline version of TBLASTN and by two variants that use different types of composition-based statistics. To test the statistical accuracy of TBLASTN, we ran 1000 searches using scrambled proteins from the mouse genome and a database of human chromosomes. To test retrieval accuracy, we modernize and adapt to translated searches a test set previously used to evaluate the retrieval accuracy of protein-protein searches. We show that composition-based statistics greatly improve the statistical accuracy of TBLASTN, at a small cost to the retrieval accuracy. Conclusion TBLASTN is widely used, as it is common to wish to compare proteins to chromosomes or to libraries of mRNAs. Composition-based statistics improve the statistical accuracy, and therefore the reliability, of TBLASTN results. The algorithms used by TBLASTN are not widely known, and some of the most important are reported here. The data used to test TBLASTN are available for download and may be useful in other studies of translated search algorithms.
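
    The E-values in question follow the standard Karlin-Altschul form E = K·m·n·e^(−λS); composition-based statistics effectively re-estimate the statistical parameters for the actual sequence compositions. A toy computation with placeholder parameter values:

    ```python
    import math

    def evalue(score, m, n, K=0.041, lam=0.267):
        """Karlin-Altschul expected number of chance alignments at this score."""
        return K * m * n * math.exp(-lam * score)

    m, n, score = 350, 3_000_000, 55   # query length, database length, raw score
    print(f"default parameters: E = {evalue(score, m, n):.3g}")
    # A composition-adjusted lambda (illustrative value) changes the verdict
    print(f"adjusted lambda:    E = {evalue(score, m, n, lam=0.300):.3g}")
    ```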

  8. QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Yonatan Mengesha Awaj

    2013-03-01

    Full Text Available In order to survive in a competitive market, improving the quality and productivity of a product or process is a must for any company. This study applies statistical process control (SPC) tools in the production processing line and on the final product in order to reduce defects, by identifying where the highest waste occurs and giving suggestions for improvement. The approach used in this study comprises direct observation, thorough examination of the production process lines, brainstorming sessions and fishbone diagrams; information was collected from potential customers and the company's workers through interviews and questionnaires, and a Pareto chart/analysis and a control chart (p-chart) were constructed. It was found that the company has many problems; specifically, there is high rejection or waste in the production processing line. The highest waste occurs in the melting process line, which causes loss due to trickle, and in the forming process line, which causes loss due to defective product rejection. The vital few problems were identified: blisters, double seam, stone, pressure failure and overweight. The principal aim of the study is to create awareness in the quality team of how to use SPC tools in problem analysis, especially to train the quality team in how to hold an effective brainstorming session, and to exploit these data in cause-and-effect diagram construction, Pareto analysis and control chart construction. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company has many constraints to implementing all suggestions for improvement within a short period of time, the company recognized that the suggestions will provide significant productivity improvement in the long run.
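
    The "vital few" emerge from a Pareto cumulative breakdown. A short sketch over hypothetical defect tallies in the spirit of the study:

    ```python
    # Hypothetical defect counts; the vital few dominate the cumulative share
    defects = {"blisters": 410, "double seam": 260, "stone": 180,
               "pressure failure": 90, "overweight": 40, "scratches": 20}

    total = sum(defects.values())
    cum = 0.0
    for cause, count in sorted(defects.items(), key=lambda kv: -kv[1]):
        cum += count / total
        marker = "  <- vital few" if cum <= 0.85 else ""
        print(f"{cause:18s} {count:4d}  cumulative {cum:6.1%}{marker}")
    ```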

  9. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    Energy Technology Data Exchange (ETDEWEB)

    Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-28

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm, and can draw on conventional statistical classifier or machine-learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier’s receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
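
    Reporting the ROC curve and the behaviour at a fixed gamma rejection rate is a few lines with scikit-learn. A minimal sketch on synthetic discrimination scores; the 99.9% GRR operating point is an illustrative choice, not a recommendation from the report.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(9)
    # Synthetic PSD scores: label 1 = neutron, label 0 = gamma
    scores = np.concatenate([rng.normal(0.0, 1.0, 5000), rng.normal(2.0, 1.0, 5000)])
    labels = np.concatenate([np.zeros(5000), np.ones(5000)])

    fpr, tpr, thr = roc_curve(labels, scores)
    print(f"AUC = {auc(fpr, tpr):.3f}")

    grr = 0.999   # gamma rejection rate of interest
    i = np.searchsorted(fpr, 1 - grr, side="right") - 1
    print(f"at GRR {grr:.1%}: neutron acceptance = {tpr[i]:.3f}, "
          f"threshold = {thr[i]:.2f}")
    ```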

  10. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. This was an observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p < 0.001), and the quarterly rate has not exceeded the control limits, that is, been out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. Strain improvement and statistical optimization as a combined strategy for improving fructosyltransferase production by Aureobasidium pullulans NAC8

    Directory of Open Access Journals (Sweden)

    Adedeji Nelson Ademakinwa

    2017-12-01

    A relatively low FTase-producing strain of Aureobasidium pullulans NAC8 was enhanced for optimum production using a two-pronged approach involving mutagenesis and statistical optimization. The improved mutant strain also had remarkable biotechnological properties that make it a suitable alternative to the wild-type.

  12. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    Science.gov (United States)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.

  13. Neuro-Linguistic Programming: Improving Rapport between Track/Cross Country Coaches and Significant Others

    Science.gov (United States)

    Helm, David Jay

    2017-01-01

    This study examines the background information and the components of N.L.P., being eye movements, use of predicates, and posturing, as they apply to improving rapport and empathy between track/cross country coaches and their significant others in the arena of competition to help alleviate the inherent stressors.

  14. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.

  15. Improvements of ENSO-monsoon relationship in CMIP5 models through statistical downscaling over India.

    Science.gov (United States)

    Akhter, J.; Das, L.; Deb, A.

    2017-12-01

    The present study has assessed the skills of global climate models (GCMs) from the Coupled Model Intercomparison Project phase five (CMIP5) in simulating ENSO-monsoon relationships over seven homogeneous zones of India. Observational sea surface temperature (SST) data revealed that there has been a significant negative correlation between zonal precipitation and the Nino 3.4 index over North Mountainous India, North West India, North Central India, West Peninsular India and South Peninsular India. The first and third principal components (PCs) of zonal precipitation, explaining 44.4% and 14.2% of the variance respectively, have also shown significant anti-correlation with Nino 3.4. Analysis with CMIP5 models revealed that the majority of GCMs failed to reproduce both the magnitude and phase of such relationships, mainly due to poor simulation of the Nino 3.4 index. Therefore, an attempt has been made to improve the results through empirical orthogonal function (EOF) based statistical downscaling of CMIP5 GCMs. To downscale the Nino 3.4 index, an optimal predictor combination of PCs extracted from EOF fields of large-scale GCM predictors (geopotential height, u and v wind, specific and relative humidity and air temperature at pressure levels 500, 850 and 1000 hPa, mean sea level pressure and atmospheric vapor content) has been utilized. Results indicated improvements of the downscaled CMIP5 models in simulating the ENSO-monsoon relationship for zone-wise precipitation. The multi-model ensemble (MME) of downscaled GCMs has better skill than individual GCMs. Therefore, the downscaled MME may be used more reliably to investigate future ENSO-monsoon relationships under various warming scenarios.
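
    EOF-based downscaling boils down to principal components of the large-scale predictor fields feeding a regression onto the target index. A compact sketch with one synthetic field standing in for the multi-variable predictor set:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(10)
    mode = rng.normal(size=360)        # hidden large-scale mode (time series)
    pattern = rng.normal(size=500)     # its spatial pattern over the grid
    fields = np.outer(mode, pattern) + rng.normal(0, 1.0, (360, 500))
    nino = mode + rng.normal(0, 0.3, 360)   # index driven by the same mode

    pcs = PCA(n_components=10).fit_transform(fields)   # EOF/PC predictors
    pred = LinearRegression().fit(pcs, nino).predict(pcs)
    print(f"correlation with index: {np.corrcoef(pred, nino)[0, 1]:.2f}")
    ```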

  16. Adaptive statistical iterative reconstruction improves image quality without affecting perfusion CT quantitation in primary colorectal cancer

    Directory of Open Access Journals (Sweden)

    D. Prezzi

    Full Text Available Objectives: To determine the effect of Adaptive Statistical Iterative Reconstruction (ASIR) on perfusion CT (pCT) parameter quantitation and image quality in primary colorectal cancer. Methods: Prospective observational study. Following institutional review board approval and informed consent, 32 patients with colorectal adenocarcinoma underwent pCT (100 kV, 150 mA, 120 s acquisition, axial mode). Tumour regional blood flow (BF), blood volume (BV), mean transit time (MTT) and permeability surface area product (PS) were determined using identical regions of interest for ASIR percentages of 0%, 20%, 40%, 60%, 80% and 100%. Image noise, contrast-to-noise ratio (CNR) and pCT parameters were assessed across ASIR percentages. Coefficients of variation (CV), repeated measures analysis of variance (rANOVA) and Spearman’s rank order correlation were performed with statistical significance at 5%. Results: With increasing ASIR percentages, image noise decreased by 33% while CNR increased by 61%; peak tumour CNR was greater than 1.5 with 60% ASIR and above. Mean BF, BV, MTT and PS differed by less than 1.8%, 2.9%, 2.5% and 2.6% across ASIR percentages. CVs were 4.9%, 4.2%, 3.3% and 7.9%; rANOVA P values: 0.85, 0.62, 0.02 and 0.81 respectively. Conclusions: ASIR improves image noise and CNR without altering pCT parameters substantially. Keywords: Perfusion imaging, Multidetector computed tomography, Colorectal neoplasms, Computer-assisted image processing, Radiation dosage

  17. Rehearsal significantly improves immediate and delayed recall on the Rey Auditory Verbal Learning Test.

    Science.gov (United States)

    Hessen, Erik

    2011-10-01

    A repeated observation during memory assessment with the Rey Auditory Verbal Learning Test (RAVLT) is that patients who spontaneously employ a memory rehearsal strategy, repeating the word list more than once, achieve better scores than patients who repeat the word list only once. This observation raised concern about whether the standard administration procedure of the RAVLT and similar tests elicits the best possible recall scores. The purpose of the present study was to test the hypothesis that a rehearsal recall strategy of repeating the word list more than once would improve recall scores on the RAVLT. We report on differences in outcome after standard administration and after experimental administration of the Immediate and Delayed Recall measures of the RAVLT in 50 patients. The experimental administration resulted in significantly improved scores on all the variables employed. Additionally, patients who failed effort screening showed significantly poorer improvement on Delayed Recall compared with those who passed the effort screening. The clear general improvement, both in raw scores and T-scores, demonstrates that recall performance can be significantly influenced by the strategy of the patient or by small variations in the instructions given by the examiner.

  18. Costello Syndrome with Severe Nodulocystic Acne: Unexpected Significant Improvement of Acanthosis Nigricans after Oral Isotretinoin Treatment

    Directory of Open Access Journals (Sweden)

    Leelawadee Sriboonnark

    2015-01-01

    We report the case of a 17-year-old female diagnosed with Costello syndrome, confirmed by genetic testing showing a G12S mutation in the HRAS gene at 3 years of age, who presented with severe nodulocystic acne on her face. After 2 months of oral isotretinoin treatment, improvement in her acne was observed. Interestingly, an unexpected significant improvement of the acanthosis nigricans on her neck and the dorsum of her hands was found as well. We present this case as a successful treatment option, using oral isotretinoin, for acanthosis nigricans in Costello syndrome patients.

  19. Determination of significance in Ecological Impact Assessment: Past change, current practice and future improvements

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, Sam; Hudson, Malcolm D., E-mail: mdh@soton.ac.uk

    2013-01-15

    Ecological Impact Assessment (EcIA) is an important tool for conservation and achieving sustainable development. 'Significant' impacts are those which disturb or alter the environment to a measurable degree. Significance is a crucial part of EcIA, and our understanding of how the concept operates in practice is vital if EcIA is to be effective as a tool. This study employed three methods to assess how the determination of significance has changed through time, what current practice is, and what would lead to future improvements. Three data streams were collected: interviews with expert stakeholders, a review of 30 Environmental Statements and a broad-scale survey of the United Kingdom Institute of Ecology and Environmental Management (IEEM) members. The approach taken in the determination of significance has become more standardised, and subjectivity has become constrained through a transparent framework. This has largely been driven by a set of guidelines produced by IEEM in 2006. The significance of impacts is now more clearly justified and the accuracy with which it is determined has improved. However, there are limits to the accuracy and effectiveness of the determination of significance: the quality of baseline survey data, our scientific understanding of ecological processes and the lack of monitoring and feedback of results. These in turn are restricted by the limited resources available in consultancies. The most notable recommendations for future practice are the implementation of monitoring and the publication of feedback, the creation of a central database for baseline survey data and the streamlining of guidance. - Highlights: • The assessment of significance has changed markedly through time. • The IEEM guidelines have driven a standardisation of practice. • Currently limited by quality of baseline data and scientific understanding. • Monitoring and the publication of feedback are the most notable recommendations.

  20. Histone deacetylase inhibitor significantly improved the cloning efficiency of porcine somatic cell nuclear transfer embryos.

    Science.gov (United States)

    Huang, Yongye; Tang, Xiaochun; Xie, Wanhua; Zhou, Yan; Li, Dong; Yao, Chaogang; Zhou, Yang; Zhu, Jianguo; Lai, Liangxue; Ouyang, Hongsheng; Pang, Daxin

    2011-12-01

    Valproic acid (VPA), a histone deacetylase inhibitor, has been shown to generate induced pluripotent stem (iPS) cells from mouse and human fibroblasts with significantly higher efficiency. Because successful cloning by somatic cell nuclear transfer (SCNT) undergoes a full reprogramming process in which the epigenetic state of a differentiated donor nucleus is converted into an embryonic totipotent state, we speculated that VPA would be useful in promoting cloning efficiency. Therefore, in the present study, we examined whether VPA can promote the developmental competence of SCNT embryos by improving the reprogramming state of the donor nucleus. Here we report that 1 mM VPA for 14 to 16 h following activation significantly increased the rate of blastocyst formation of porcine SCNT embryos constructed from Landrace fetal fibroblast cells compared to the control (31.8 vs. 11.4%). However, we found that the acetylation levels of histone H3 lysine 14 and histone H4 lysine 5 and the expression levels of Oct4, Sox2, and Klf4 were not significantly changed between VPA-treated and -untreated groups at the blastocyst stage. The SCNT embryos were transferred to 38 surrogates, and the cloning efficiency in the treated group was significantly improved compared with the control group. Taken together, we have demonstrated that VPA can improve both the in vitro and in vivo developmental competence of porcine SCNT embryos.

  1. Spectral and cross-spectral analysis of uneven time series with the smoothed Lomb-Scargle periodogram and Monte Carlo evaluation of statistical significance

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.

    2012-12-01

    Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) it explicitly adjusts the statistical significance for any bias introduced by variance-reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric, computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain and is provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples, with simulated and real data, illustrate the use of both programs.
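
    The permutation logic described above is easy to reproduce with an off-the-shelf Lomb-Scargle routine (here SciPy's, not the authors' Fortran programs): the peak power of the observed series is compared with the distribution of peak powers obtained after repeatedly shuffling the values over the same uneven sampling times. Data, frequency grid and the 95% threshold are illustrative choices.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0, 100, 120))                        # uneven sampling times
      y = np.sin(2 * np.pi * t / 12.5) + rng.standard_normal(120)  # periodic signal + noise
      y -= y.mean()

      freqs = np.linspace(0.01, 2.0, 500) * 2 * np.pi   # angular frequencies
      power = lombscargle(t, y, freqs)

      # Permutation test: shuffling y destroys periodicity but keeps the sampling pattern
      max_null = np.array([lombscargle(t, rng.permutation(y), freqs).max()
                           for _ in range(999)])
      threshold95 = np.quantile(max_null, 0.95)
      print("peak significant at 95% level:", power.max() > threshold95)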

  2. Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.

    Science.gov (United States)

    Chalmers, R Philip

    2018-06-01

    This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.

  3. Communication: Proper treatment of classically forbidden electronic transitions significantly improves detailed balance in surface hopping

    Energy Technology Data Exchange (ETDEWEB)

    Sifain, Andrew E. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Wang, Linjun [Department of Chemistry, Zhejiang University, Hangzhou 310027 (China); Prezhdo, Oleg V. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Department of Chemistry, University of Southern California, Los Angeles, California 90089-1062 (United States)

    2016-06-07

    Surface hopping is the most popular method for nonadiabatic molecular dynamics. Many have reported that it does not rigorously attain detailed balance at thermal equilibrium, but does so approximately. We show that convergence to the Boltzmann populations is significantly improved when the nuclear velocity is reversed after a classically forbidden hop. The proposed prescription significantly reduces the total number of classically forbidden hops encountered along a trajectory, suggesting that some randomization in nuclear velocity is needed when classically forbidden hops constitute a large fraction of attempted hops. Our results are verified computationally using two- and three-level quantum subsystems, coupled to a classical bath undergoing Langevin dynamics.
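
    The prescription is simple to state in code. Below is a toy one-dimensional sketch of the frustrated-hop rule (reverse the nuclear velocity when a hop is classically forbidden); in a real simulation the kinetic-energy test and reversal act along the nonadiabatic coupling vector, and every name and number here is illustrative rather than the authors' implementation.

      import numpy as np

      def attempt_hop(v, mass, e_current, e_target, reverse_on_frustration=True):
          """Try a surface hop and handle the classically forbidden case.

          A hop must conserve total energy by rescaling the nuclear velocity.
          If the kinetic energy cannot supply the electronic energy gap, the
          hop is 'classically forbidden' (frustrated); the prescription above
          then reverses the velocity instead of leaving it unchanged.
          """
          kinetic = 0.5 * mass * v**2
          gap = e_target - e_current
          if kinetic >= gap:                 # allowed: rescale speed, keep direction
              v_new = np.sign(v) * np.sqrt(2.0 * (kinetic - gap) / mass)
              return v_new, True
          return (-v if reverse_on_frustration else v), False

      # Example: gap (0.5) exceeds kinetic energy (0.125), so the hop is frustrated
      v_out, hopped = attempt_hop(v=0.5, mass=1.0, e_current=0.0, e_target=0.5)
      print(hopped, v_out)                   # False -0.5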

  4. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    Science.gov (United States)

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
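
    A rough sketch of the recipe described above, assuming hypothetical A/B phase data: fit a robust (Theil-Sen) trend to the baseline phase only, remove it from the whole series, then take a rank correlation between phase membership and the corrected scores. This follows the general idea of a baseline-corrected tau, not necessarily Tarlow's exact estimator.

      import numpy as np
      from scipy.stats import theilslopes, kendalltau

      baseline = np.array([3, 4, 4, 5, 6])       # phase A (pretreatment), improving trend
      treatment = np.array([7, 8, 8, 9, 10])     # phase B (treatment)
      y = np.concatenate([baseline, treatment])
      t = np.arange(len(y))
      phase = np.concatenate([np.zeros_like(baseline), np.ones_like(treatment)])

      # Robust trend fitted to the baseline only, then removed everywhere
      slope, intercept, _, _ = theilslopes(baseline, t[:len(baseline)])
      y_corrected = y - (intercept + slope * t)

      # Effect size bounded in [-1, +1], unlike Tau-U
      tau, p = kendalltau(phase, y_corrected)
      print(f"baseline-corrected tau = {tau:.2f} (p = {p:.3f})")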

  5. Statistical versus Musical Significance: Commentary on Leigh VanHandel's 'National Metrical Types in Nineteenth Century Art Song'

    Directory of Open Access Journals (Sweden)

    Justin London

    2010-01-01

    In “National Metrical Types in Nineteenth Century Art Song” Leigh Van Handel gives a sympathetic critique of William Rothstein’s claim that in western classical music of the late 18th and 19th centuries there are discernable differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a) examines just what Rothstein means in terms of his proposed metrical typology, (b) questions Van Handel on how she has applied it to a purely melodic framework, (c) amplifies Van Handel’s critique of Rothstein, and then (d) concludes with a rumination on the reach of quantitative (i.e., statistically-driven) versus qualitative claims regarding such things as “national metrical types.”

  6. Improving UWB-Based Localization in IoT Scenarios with Statistical Models of Distance Error.

    Science.gov (United States)

    Monica, Stefania; Ferrari, Gianluigi

    2018-05-17

    Interest in the Internet of Things (IoT) is rapidly increasing, as the number of connected devices is exponentially growing. One of the application scenarios envisaged for IoT technologies involves indoor localization and context awareness. In this paper, we focus on a localization approach that relies on a particular type of communication technology, namely Ultra Wide Band (UWB). UWB technology is an attractive choice for indoor localization, owing to its high accuracy. Since localization algorithms typically rely on estimated inter-node distances, the goal of this paper is to evaluate the improvement brought by a simple (linear) statistical model of the distance error. On the basis of an extensive experimental measurement campaign, we propose a general analytical framework, based on a Least Square (LS) method, to derive a novel statistical model for the range estimation error between a pair of UWB nodes. The proposed statistical model is then applied to improve the performance of a few illustrative localization algorithms in various realistic scenarios. The obtained experimental results show that the use of the proposed statistical model improves the accuracy of the considered localization algorithms with a reduction of the localization error up to 66%.
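
    The linear error model lends itself to a very small sketch: fit a least-squares line mapping true calibration distances to UWB-estimated distances, then invert it to de-bias new range measurements before they are fed to a localization algorithm. The calibration numbers below are invented; the paper derives its model from an extensive measurement campaign.

      import numpy as np

      # Calibration data (invented): true vs. UWB-estimated distances in metres
      d_true = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
      d_est = np.array([1.12, 2.18, 3.25, 4.31, 5.40, 6.44])  # systematic positive bias

      # Least-squares fit of the linear error model: d_est ≈ a * d_true + b
      a, b = np.polyfit(d_true, d_est, deg=1)

      def correct(d_measured):
          """Invert the fitted linear model to de-bias a new range estimate."""
          return (d_measured - b) / a

      print(correct(3.30))   # corrected range for a raw 3.30 m estimate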

  7. The challenges of ESRD care in developing economies: sub-Saharan African opportunities for significant improvement.

    Science.gov (United States)

    Bamgboye, Ebun Ladipo

    Chronic kidney disease (CKD) is a significant cause of morbidity and mortality in sub-Saharan Africa. This, along with other noncommunicable diseases like hypertension, diabetes, and heart diseases, poses a double burden on a region that is still struggling to cope with the scourge of communicable diseases like malaria, tuberculosis, HIV, and more recently Ebola. Causes of CKD in the region are predominantly glomerulonephritis and hypertension, although type 2 diabetes is also becoming a significant cause, as is retroviral disease. Patients are generally younger than in the developed world, and there is a significant male preponderance. Most patients are managed by hemodialysis, with peritoneal dialysis and kidney transplantation being available in only a few countries in the region. Government funding and support for dialysis is often unavailable, and when available, often comes with restrictions. There is a dearth of trained manpower to treat the disease, and many countries have a limited number of units, which are often ill-equipped to deal adequately with the number of patients who require end-stage renal disease (ESRD) care in the region. Although there has been significant improvement compared with the situation even as recently as 10 years ago, there is potential for further improvement, which would significantly improve the outcomes of patients with ESRD in the region. The information in this review was obtained from a combination of renal registry reports (published and unpublished), published articles, responses to a questionnaire sent to nephrologists prior to the World Congress of Nephrology (WCN) in Cape Town, and from nephrologists attending the WCN in Cape Town (March 13-17, 2015).

  8. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum

    Science.gov (United States)

    Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.

    2016-01-01

    Background Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013–14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p < 0.001). Conclusions This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832

  9. Survival prediction algorithms miss significant opportunities for improvement if used for case selection in trauma quality improvement programs.

    Science.gov (United States)

    Heim, Catherine; Cole, Elaine; West, Anita; Tai, Nigel; Brohi, Karim

    2016-09-01

    Quality improvement (QI) programs have been shown to reduce preventable mortality in trauma care. Detailed review of all trauma deaths is a time- and resource-consuming process, and the calculated probability of survival (Ps) has been proposed as an audit filter so that review is limited to deaths that were 'expected to survive'. However, no Ps-based algorithm has been validated, and no study has examined elements of preventability associated with deaths classified as 'expected'. The objective of this study was to examine whether trauma performance review can be streamlined using existing mortality prediction tools without missing important areas for improvement. We conducted a retrospective study of all trauma deaths reviewed by our trauma QI program. Deaths were classified as non-preventable, possibly preventable, probably preventable or preventable. Opportunities for improvement (OPIs) involve failures in the process of care and were classified as clinical or system deviations from standards of care. TRISS and Ps were used for calculation of the probability of survival. Peer-review charts were reviewed by a single investigator. Over 8 years, 626 patients were included. One third showed elements of preventability and 4% were preventable. Preventability occurred across the entire range of the calculated Ps band. Limiting review to unexpected deaths would have missed over 50% of all preventability issues and a third of preventable deaths. 37% of patients showed opportunities for improvement (OPIs). Neither TRISS nor Ps allowed for reliable identification of OPIs, and limiting peer review to patients with unexpected deaths would have missed close to 60% of all issues in care. TRISS and Ps fail to identify a significant proportion of avoidable deaths and miss important opportunities for process and system improvement. Based on this, all trauma deaths should be subjected to expert panel review in order to maximize the output of performance improvement programs. Copyright © 2016 Elsevier

  10. Intraoperative Sensorcaine significantly improves postoperative pain management in outpatient reduction mammaplasty.

    Science.gov (United States)

    Culliford, Alfred T; Spector, Jason A; Flores, Roberto L; Louie, Otway; Choi, Mihye; Karp, Nolan S

    2007-09-15

    Breast reduction is one of the most frequently performed plastic surgical procedures in the United States; more than 160,500 patients underwent the procedure in 2005. Many outpatient reduction mammaplasty patients report the greatest postoperative discomfort in the first 48 hours. The authors investigated the effect of intraoperative topical application of the long-acting local anesthetic agent bupivacaine (Sensorcaine or Marcaine) on postoperative pain, time to postanesthesia care unit discharge, and postoperative use of narcotic medication. In a prospective, randomized, single-blind trial, intraoperative use of Sensorcaine versus placebo (normal saline) was compared. Postoperative pain was quantified using the visual analogue scale, and time to discharge from the postanesthesia care unit was recorded. Patients documented their outpatient pain medication usage. Of the 37 patients enrolled in the study, 20 were treated with intraoperative topical Sensorcaine and 17 received placebo. Patients treated with Sensorcaine were discharged home significantly faster (2.9 hours versus 3.8 hours, p = 0.002). The control arm consistently had higher pain scores in the postanesthesia care unit (although not statistically significant) than the Sensorcaine group using the visual analogue scale system. Furthermore, patients receiving Sensorcaine required significantly less narcotic medication while recovering at home (mean, 3.5 tablets of Vicodin) than the control group (mean, 6.4 tablets; p = 0.001). There were no complications resulting from Sensorcaine usage. This prospective, randomized, single-blind study demonstrates that a single dose of intraoperative Sensorcaine provides a safe, inexpensive, and efficacious way to significantly shorten the length of postanesthesia care unit stay and significantly decrease postoperative opioid analgesic use in patients undergoing ambulatory reduction mammaplasty.

  11. Statistical and molecular analyses of evolutionary significance of red-green color vision and color blindness in vertebrates.

    Science.gov (United States)

    Yokoyama, Shozo; Takenaka, Naomi

    2005-04-01

    Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.

  12. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  13. PXD101 significantly improves nuclear reprogramming and the in vitro developmental competence of porcine SCNT embryos

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Jun-Xue; Kang, Jin-Dan; Li, Suo; Jin, Long; Zhu, Hai-Ying; Guo, Qing; Gao, Qing-Shan; Yan, Chang-Guo; Yin, Xi-Jun, E-mail: yinxj33@msn.com

    2015-01-02

    Highlights: • First exploration of the effects of PXD101 on the development of SCNT embryos in vitro. • Treatment with 0.5 μM PXD101 for 24 h improved the development of porcine SCNT embryos. • The level of AcH3K9 was significantly higher than in the control group at early stages. - Abstract: In this study, we investigated the effects of the histone deacetylase inhibitor PXD101 (belinostat) on the preimplantation development of porcine somatic cell nuclear transfer (SCNT) embryos and their expression of the epigenetic marker histone H3 acetylated at lysine 9 (AcH3K9). We compared the in vitro developmental competence of SCNT embryos treated with various concentrations of PXD101 for 24 h. Treatment with 0.5 μM PXD101 significantly increased the proportion of SCNT embryos that reached the blastocyst stage, in comparison to the control group (23.3% vs. 11.5%, P < 0.05). We tested the in vitro developmental competence of SCNT embryos treated with 0.5 μM PXD101 for various amounts of time following activation. Treatment for 24 h significantly improved the development of porcine SCNT embryos, with a significantly higher proportion of embryos reaching the blastocyst stage in comparison to the control group (25.7% vs. 10.6%, P < 0.05). PXD101-treated SCNT embryos were transferred into two surrogate sows, one of which became pregnant, and four fetuses developed. PXD101 treatment significantly increased the fluorescence intensity of immunostaining for AcH3K9 in embryos at the pseudo-pronuclear and 2-cell stages. At these stages, the fluorescence intensities of immunostaining for AcH3K9 were significantly higher in PXD101-treated embryos than in control untreated embryos. In conclusion, this study demonstrates that PXD101 can significantly improve the in vitro and in vivo developmental competence of porcine SCNT embryos and can enhance their nuclear reprogramming.

  14. Unified Health Gamification can significantly improve well-being in corporate environments.

    Science.gov (United States)

    Shahrestani, Arash; Van Gorp, Pieter; Le Blanc, Pascale; Greidanus, Fabrizio; de Groot, Kristel; Leermakers, Jelle

    2017-07-01

    There is a multitude of mHealth applications that aim to solve societal health problems by stimulating specific types of physical activity via gamification. However, physical health activities cover just one of the three World Health Organization (WHO) dimensions of health. This paper introduces the novel notion of Unified Health Gamification (UHG), which covers not only physical health but also social and cognitive health and well-being. Instead of rewarding activities in the three WHO dimensions through separate mHealth competitions, UHG combines the scores for such activities on unified leaderboards and lets people interact in social circles beyond personal interests. This approach is promising in corporate environments, since UHG can connect the employees with intrinsic motivation for physical health to those who have quite different interests. In order to evaluate this approach, we realized an app prototype and evaluated it in two corporate pilot studies. In total, eighteen pilot users participated voluntarily for six weeks. Half of the participants were recruited from an occupational health setting and the other half from a treatment setting. Our results suggest that the UHG principles are worth further investigation: various positive health effects were found based on a validated survey. Mean mental health improved significantly at one pilot location, and at the level of individual pilot participants multiple other effects were found to be significant: among others, significant mental health improvements were found for 28% of the participants. Most participants intended to use the app beyond the pilot, especially if it were further developed.

  15. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum.

    Science.gov (United States)

    Milic, Natasa M; Trajkovic, Goran Z; Bukumiric, Zoran M; Cirkovic, Andja; Nikolic, Ivan M; Milin, Jelena S; Milic, Nikola V; Savic, Marko D; Corac, Aleksandar M; Marinkovic, Jelena M; Stanisavljevic, Dejana M

    2016-01-01

    Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p < 0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.

  16. Improving esthetic results in benign parotid surgery: statistical evaluation of facelift approach, sternocleidomastoid flap, and superficial musculoaponeurotic system flap application.

    Science.gov (United States)

    Bianchi, Bernardo; Ferri, Andrea; Ferrari, Silvano; Copelli, Chiara; Sesenna, Enrico

    2011-04-01

    The purpose of this article was to analyze the efficacy of the facelift incision, sternocleidomastoid muscle flap, and superficial musculoaponeurotic system flap for improving the esthetic results in patients undergoing partial parotidectomy for benign parotid tumor resection. The usefulness of partial parotidectomy is discussed, and a statistical evaluation of the esthetic results was performed. From January 1, 1996, to January 1, 2007, 274 patients treated for benign parotid tumors were studied. Of these, 172 underwent partial parotidectomy. The 172 patients were divided into 4 groups: partial parotidectomy with classic or modified Blair incision without reconstruction (group 1), partial parotidectomy with facelift incision and without reconstruction (group 2), partial parotidectomy with facelift incision associated with sternocleidomastoid muscle flap (group 3), and partial parotidectomy with facelift incision associated with superficial musculoaponeurotic system flap (group 4). Patients were considered, after a follow-up of at least 18 months, for functional and esthetic evaluation. The functional outcome was assessed considering the facial nerve function, Frey syndrome, and recurrence. The esthetic evaluation was performed by inviting the patients and a blind panel of 1 surgeon and 2 secretaries of the department to give a score of 1 to 10 to assess the final cosmetic outcome. The statistical analysis was finally performed using the Mann-Whitney U test for nonparametric data to compare the different group results. P < .05 was considered significant. No recurrence developed in any of the 4 groups or in any of the 274 patients during the follow-up period. The statistical analysis, comparing group 1 and the other groups, revealed a highly significant statistical difference (P < .05), indicating that these techniques improve the esthetic results in benign parotid surgery. The evaluation of functional complications and the recurrence rate in this series of patients has confirmed that this technique can be safely used.

  17. The European Academy laparoscopic “Suturing Training and Testing” (SUTT) significantly improves surgeons’ performance

    Science.gov (United States)

    Sleiman, Z.; Tanos, V.; Van Belle, Y.; Carvalho, J.L.; Campo, R.

    2015-01-01

    The efficiency of the suturing training and testing (SUTT) model by laparoscopy was evaluated by measuring the suturing skill acquisition of trainee gynecologists at the beginning and at the end of a teaching course. During a workshop organized by the European Academy of Gynecological Surgery (EAGS), 25 participants with three different experience levels in laparoscopy (minor, intermediate and major) performed the 4 exercises of the SUTT model (Ex 1: both hands stitching and continuous suturing; Ex 2: right hand stitching and intracorporeal knotting; Ex 3: left hand stitching and intracorporeal knotting; Ex 4: dominant hand stitching, tissue approximation and intracorporeal knotting). The time needed to perform the exercises was recorded for each trainee and group, and statistical analysis was used to assess the differences. Overall, all trainees achieved significant improvement in suturing time (p < 0.05). Keywords: psychomotor skills, surgery, teaching, training, suturing model. PMID:26977264

  18. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    Science.gov (United States)

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
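
    As a concrete illustration of the chart logic described above, here is a minimal individuals-chart (XmR-style) sketch: control limits are placed at the conventional mean ± 3 sigma, with sigma estimated from the average moving range. The data are invented monthly values of some process measure.

      import numpy as np

      x = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 14.9, 12.1, 11.7])

      # Short-term sigma from the average moving range (d2 = 1.128 for subgroups of 2)
      sigma = np.abs(np.diff(x)).mean() / 1.128
      center = x.mean()
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      # Points outside the limits signal special cause variation
      for i, xi in enumerate(x):
          if xi > ucl or xi < lcl:
              print(f"point {i}: {xi} outside limits ({lcl:.2f}, {ucl:.2f})")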

  19. [More than a decade improving medical and judicial certification in mortality statistics of death causes].

    Science.gov (United States)

    Cirera, Lluís; Salmerón, Diego; Martínez, Consuelo; Bañón, Rafael María; Navarro, Carmen

    2018-06-06

    After the return of Spain to democracy and the transfer of powers to the regional governments, actions were initiated to improve the statistics on causes of death. The objective of this work was to describe the evolution of the quality-improvement activities in the cause-of-death statistics of the Region of Murcia from 1989 to 2011. Descriptive epidemiological study of all death documents processed by the Murcia mortality registry. Indicators were used relating to the quality of completion of the medical and judicial notification of death, the recovery of information on the causes and circumstances of death, and the impact on the statistics of ill-defined, unspecific and less specific causes. During the study period, medical notification without a temporal sequence on the death certificate (DC) decreased from an initial 46% to a final 21% (p < 0.001). Information retrieval from sources was successful in 93% of cases in 2001, compared with 38% at the beginning of the period (p < 0.001). Regional rates of ill-defined and unspecific causes fell more than national rates, with final-year differentials of 10.3 (p < 0.001) and 2.8 points (p = 0.001), respectively. Medical death certification improved in form and suitability. Regulated recovery of the causes and circumstances of death corrected the medical and judicial information. The Region of Murcia presented lower rates of less specified causes and ill-defined entities than the national averages.

  20. E2F5 status significantly improves malignancy diagnosis of epithelial ovarian cancer

    KAUST Repository

    Kothandaraman, Narasimhan

    2010-02-24

    Background: Ovarian epithelial cancer (OEC) usually presents in the later stages of the disease. Factors, especially those associated with cell-cycle genes, affecting the genesis and tumour progression of ovarian cancer are largely unknown. We hypothesized that over-expressed transcription factors (TFs), as well as those driving the expression of the OEC over-expressed genes, could be key to OEC genesis and potentially useful tissue and serum markers for malignancy associated with OEC. Methods: Using a combination of computational (selection of candidate TF markers and malignancy prediction) and experimental approaches (tissue microarray and western blotting on patient samples), we identified and evaluated the E2F5 transcription factor, involved in cell proliferation, as a promising candidate regulatory target in early stage disease. Our hypothesis was supported by our tissue array experiments, which showed E2F5 expression only in OEC samples but not in normal and benign tissues, and by western blotting studies, which showed significantly positively biased expression in patient serum samples. Results: Analysis of clinical cases shows that E2F5 status is characteristic of a different population group than the one covered by CA125, a conventional OEC biomarker. Using E2F5 in different combinations with CA125 for distinguishing malignant from benign cysts shows that the presence of CA125 or E2F5 increases the sensitivity of OEC detection to 97.9% (an increase from 87.5% if only CA125 is used) and, more importantly, the presence of both CA125 and E2F5 increases the specificity of OEC detection to 72.5% (an increase from 55% if only CA125 is used). This significantly improved accuracy suggests the possibility of improved diagnostics of OEC. Furthermore, detection of malignancy status in 86 cases (38 benign, 48 early and late OEC) shows that the use of E2F5 status in combination with other clinical characteristics allows for an improved detection of malignant cases.
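
    The reported gains from combining the two markers follow from the decision rule used. A toy sketch with invented, independently generated marker calls (real markers are correlated, so the numbers will differ from the study's) shows the general pattern: an OR rule raises sensitivity, an AND rule raises specificity.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 2000
      malignant = rng.random(n) < 0.4          # invented ground-truth labels

      # Invented binary marker calls with different error profiles
      ca125 = (malignant & (rng.random(n) < 0.875)) | (~malignant & (rng.random(n) < 0.45))
      e2f5 = (malignant & (rng.random(n) < 0.60)) | (~malignant & (rng.random(n) < 0.10))

      def sens_spec(pred, truth):
          sens = (pred & truth).sum() / truth.sum()
          spec = (~pred & ~truth).sum() / (~truth).sum()
          return round(sens, 3), round(spec, 3)

      print("OR rule :", sens_spec(ca125 | e2f5, malignant))   # sensitivity up
      print("AND rule:", sens_spec(ca125 & e2f5, malignant))   # specificity up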

  1. Significance of MPEG-7 textural features for improved mass detection in mammography.

    Science.gov (United States)

    Eltonsy, Nevine H; Tourassi, Georgia D; Fadeev, Aleksey; Elmaghraby, Adel S

    2006-01-01

    The purpose of the study is to investigate the significance of MPEG-7 textural features for improving the detection of masses in screening mammograms. The detection scheme was originally based on morphological directional neighborhood features extracted from mammographic regions of interest (ROIs). Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of each set of features independently and merged into a back-propagation artificial neural network (BPANN) using the leave-one-out sampling scheme (LOOSS). The study was based on a database of 668 mammographic ROIs (340 depicting cancer regions and 328 depicting normal parenchyma). Overall, the ROC area index of the BPANN using the directional morphological features was Az=0.85±0.01. The MPEG-7 edge histogram descriptor-based BPANN showed an ROC area index of Az=0.71±0.01, while homogeneous textural descriptors using 30 and 120 channels helped the BPANN achieve similar ROC area indexes of Az=0.882±0.02 and Az=0.877±0.01, respectively. After merging the MPEG-7 homogeneous textural features with the directional neighborhood features, the performance of the BPANN increased, providing an ROC area index of Az=0.91±0.01. MPEG-7 homogeneous textural descriptors significantly improved the morphology-based detection scheme.

  2. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    Science.gov (United States)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
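
    The filtering idea can be sketched with an ordinary sparse regression: expand a noisy signal in a fixed basis and let an L1 penalty zero out the coefficients that do not contribute. A cosine basis stands in here for the LB eigenfunctions (which require a surface mesh); the signal, basis size and penalty are illustrative.

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(3)
      t = np.linspace(0, 1, 200)
      signal = 2.0 * np.cos(2 * np.pi * t) + 0.8 * np.cos(5 * 2 * np.pi * t)
      y = signal + 0.3 * rng.standard_normal(t.size)

      # Basis matrix: the first 20 cosine "eigenfunctions"
      B = np.column_stack([np.cos(k * 2 * np.pi * t) for k in range(20)])

      lasso = Lasso(alpha=0.05).fit(B, y)
      kept = np.flatnonzero(lasso.coef_)
      print("basis terms kept by the sparse penalty:", kept)   # expect terms 1 and 5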

  3. Carfilzomib significantly improves the progression-free survival of high-risk patients in multiple myeloma.

    Science.gov (United States)

    Avet-Loiseau, Hervé; Fonseca, Rafael; Siegel, David; Dimopoulos, Meletios A; Špička, Ivan; Masszi, Tamás; Hájek, Roman; Rosiñol, Laura; Goranova-Marinova, Vesselina; Mihaylov, Georgi; Maisnar, Vladimír; Mateos, Maria-Victoria; Wang, Michael; Niesvizky, Ruben; Oriol, Albert; Jakubowiak, Andrzej; Minarik, Jiri; Palumbo, Antonio; Bensinger, William; Kukreti, Vishal; Ben-Yehuda, Dina; Stewart, A Keith; Obreja, Mihaela; Moreau, Philippe

    2016-09-01

    The presence of certain high-risk cytogenetic abnormalities, such as translocations (4;14) and (14;16) and deletion (17p), are known to have a negative impact on survival in multiple myeloma (MM). The phase 3 study ASPIRE (N = 792) demonstrated that progression-free survival (PFS) was significantly improved with carfilzomib, lenalidomide, and dexamethasone (KRd), compared with lenalidomide and dexamethasone (Rd) in relapsed MM. This preplanned subgroup analysis of ASPIRE was conducted to evaluate KRd vs Rd by baseline cytogenetics according to fluorescence in situ hybridization. Of 417 patients with known cytogenetic risk status, 100 patients (24%) were categorized with high-risk cytogenetics (KRd, n = 48; Rd, n = 52) and 317 (76%) were categorized with standard-risk cytogenetics (KRd, n = 147; Rd, n = 170). For patients with high-risk cytogenetics, treatment with KRd resulted in a median PFS of 23.1 months, a 9-month improvement relative to treatment with Rd. For patients with standard-risk cytogenetics, treatment with KRd led to a 10-month improvement in median PFS vs Rd. The overall response rates for KRd vs Rd were 79.2% vs 59.6% (high-risk cytogenetics) and 91.2% vs 73.5% (standard-risk cytogenetics); approximately fivefold as many patients with high- or standard-risk cytogenetics achieved a complete response or better with KRd vs Rd (29.2% vs 5.8% and 38.1% vs 6.5%, respectively). KRd improved but did not abrogate the poor prognosis associated with high-risk cytogenetics. This regimen had a favorable benefit-risk profile in patients with relapsed MM, irrespective of cytogenetic risk status, and should be considered a standard of care in these patients. This trial was registered at www.clinicaltrials.gov as #NCT01080391. © 2016 by The American Society of Hematology.

  4. Bone Mass and Strength are Significantly Improved in Mice Overexpressing Human WNT16 in Osteocytes.

    Science.gov (United States)

    Alam, Imranul; Reilly, Austin M; Alkhouli, Mohammed; Gerard-O'Riley, Rita L; Kasipathi, Charishma; Oakes, Dana K; Wright, Weston B; Acton, Dena; McQueen, Amie K; Patel, Bhavmik; Lim, Kyung-Eun; Robling, Alexander G; Econs, Michael J

    2017-04-01

    Recently, we demonstrated that osteoblast-specific overexpression of human WNT16 increased both cortical and trabecular bone mass and structure in mice. To further identify the cell-specific role of Wnt16 in bone homeostasis, we created transgenic (TG) mice overexpressing human WNT16 in osteocytes using Dmp1 promoter (Dmp1-hWNT16 TG) on C57BL/6 (B6) background. We analyzed bone phenotypes and serum bone biomarkers, performed gene expression analysis and measured dynamic bone histomorphometry in Dmp1-hWNT16 TG and wild-type (WT) mice. Compared to WT mice, Dmp1-hWNT16 TG mice exhibited significantly higher whole-body, spine and femoral aBMD, BMC and trabecular (BV/TV, Tb.N, and Tb.Th) and cortical (bone area and thickness) parameters in both male and female at 12 weeks of age. Femur stiffness and ultimate force were also significantly improved in the Dmp1-hWNT16 TG female mice, compared to sex-matched WT littermates. In addition, female Dmp1-hWNT16 TG mice displayed significantly higher MS/BS, MAR and BFR/BS compared to the WT mice. Gene expression analysis demonstrated significantly higher mRNA level of Alp in both male and female Dmp1-hWNT16 TG mice and significantly higher levels of Osteocalcin, Opg and Rankl in the male Dmp1-hWNT16 TG mice in bone tissue compared to sex-matched WT mice. These results indicate that WNT16 plays a critical role for acquisition of both cortical and trabecular bone mass and strength. Strategies designed to use WNT16 as a target for therapeutic interventions will be valuable to treat osteoporosis and other low bone mass conditions.

  5. Integration of biomimicry and nanotechnology for significantly improved detection of circulating tumor cells (CTCs).

    Science.gov (United States)

    Myung, Ja Hye; Park, Sin-Jung; Wang, Andrew Z; Hong, Seungpyo

    2017-12-13

    Circulating tumor cells (CTCs) have received a great deal of scientific and clinical attention as a biomarker for diagnosis and prognosis of many types of cancer. Given their potential significance in clinics, a variety of detection methods, utilizing recent advances in nanotechnology and microfluidics, have been introduced in an effort to achieve clinically significant detection of CTCs. However, effective detection and isolation of CTCs still remain a tremendous challenge due to their extreme rarity and phenotypic heterogeneity. Among the many approaches currently under development, this review paper focuses on a unique, promising approach that takes advantage of naturally occurring processes, achievable through the application of nanotechnology, to realize significant improvement in the sensitivity and specificity of CTC capture. We provide an overview of the successful outcomes of this biomimetic CTC capture system in the detection of tumor cells from in vitro, in vivo, and clinical pilot studies. We also emphasize the clinical impact of CTCs as biomarkers in cancer diagnosis and predictive prognosis, providing a cost-effective, minimally invasive method that potentially replaces or supplements existing methods such as imaging technologies and solid tissue biopsy. In addition, their potential prognostic value in guiding treatment, ultimately helping to realize personalized therapy, is discussed. Copyright © 2017. Published by Elsevier B.V.

  6. Induction Based Training leads to Highly Significant Improvements of Objective and Subjective Suturing Ability in Junior Doctors

    Directory of Open Access Journals (Sweden)

    Kevin Garry

    2018-03-01

    Background: Simulation-based training has been shown to be of benefit in the education of medical students. However, the impact of induction-based clinical simulation on the surgical ability of qualified doctors remains unclear. The aim of this study was to establish whether a 60-minute teaching session integrated into an Emergency Medicine speciality induction program produces statistically significant improvements in the objective and subjective suturing abilities of junior doctors commencing an Emergency Medicine rotation. Methods: The objective suturing abilities of 16 Foundation Year Two doctors were analysed using a validated OSATS scale prior to a novel teaching intervention. The doctors then undertook an intensive hour-long workshop receiving one-to-one feedback before undergoing repeat OSATS assessment. Subjective ability was measured using a 5-point Likert scale and self-assessed competency reports for interrupted suturing before and after the intervention. Photographs of wound closure before and after the intervention were recorded for further blinded assessment of the impact of the intervention. A survey regarding continued ability was repeated four months after the intervention. The study took place on 7/12/16 during the Belfast Health and Social Care Trust Emergency Medicine induction in the Royal Victoria Hospital Belfast. The hospital is a regional level 1 trauma centre with annual departmental attendances in excess of 200,000. All new junior doctors commencing the Emergency Medicine rotation were invited to partake in the study; all 16 agreed. The group consisted of a mixture of undergraduate and postgraduate medical doctors who all had 16 months of experience working in a variety of medical or surgical jobs. Results: Following the teaching intervention, objective and subjective abilities in interrupted suturing showed statistically significant improvement (P < 0.005). Self-reporting of competency in independently suturing wounds improved from 50

  7. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    The study deals with an analysis of data aimed at improving the quality of statistical tools in the assembly processes of automobile seats. A normal distribution of the variables is one of the unavoidable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are increasingly more approaches to handling non-normal data. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution to the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each characteristic of quality (Safety Regulation, S/R) individually. The study closely examines the measured data from an airbag assembly and aims to obtain normally distributed data and apply statistical process control to them. The results of the contribution conclude in a rejection of the null hypothesis (the measured variables do not follow a normal distribution), so it is necessary to begin work on data transformation, supported by Minitab 15. Since even this approach does not yield normally distributed data, a procedure should be proposed that leads to a quality output of the whole statistical control of the manufacturing processes.
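
    A common version of the transformation step mentioned above, sketched with SciPy rather than the StatGraphics/Minitab workflow used in the study: test normality, and if it is rejected, apply a Box-Cox transformation and re-test. The simulated data and the 5% level are illustrative.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      x = rng.lognormal(mean=0.0, sigma=0.5, size=200)   # skewed, non-normal process data

      stat, p = stats.shapiro(x)
      print(f"raw data: Shapiro-Wilk p = {p:.4f}")        # normality typically rejected

      if p < 0.05:
          x_bc, lam = stats.boxcox(x)                     # Box-Cox requires positive data
          stat2, p2 = stats.shapiro(x_bc)
          print(f"after Box-Cox (lambda = {lam:.2f}): p = {p2:.4f}")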

  8. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum.

    Directory of Open Access Journals (Sweden)

    Natasa M Milic

    Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p < 0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.

  9. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to small p-values than to large ones, so that a single small p-value may overrule the other p-values.
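
    For reference, Fisher's combined test is available directly in SciPy, and a short calculation makes the sensitivity issue visible: a single tiny p-value dominates the statistic -2 * sum(ln p_i). The ordered-p-value joint tail statistic proposed by the authors is not implemented here.

      import numpy as np
      from scipy.stats import combine_pvalues

      pvals = np.array([0.40, 0.55, 0.62, 0.71, 1e-6])   # one extreme value among mild ones

      stat, p_combined = combine_pvalues(pvals, method='fisher')
      print(f"Fisher: chi2 = {stat:.1f}, combined p = {p_combined:.2e}")

      # Share of the statistic contributed by each p-value: the 1e-6 term dominates
      contrib = -2 * np.log(pvals)
      print(contrib / contrib.sum())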

  10. Significant Improvement of Organic Thin-Film Transistor Mobility Utilizing an Organic Heterojunction Buffer Layer

    International Nuclear Information System (INIS)

    Pan Feng; Qian Xian-Rui; Huang Li-Zhen; Wang Hai-Bo; Yan Dong-Hang

    2011-01-01

    High-mobility vanadyl phthalocyanine (VOPc)/5,5‴-bis(4-fluorophenyl)-2,2':5',2″:5″,2‴-quaterthiophene (F2-P4T) thin-film transistors are demonstrated by employing a copper hexadecafluorophthalocyanine (F16CuPc)/copper phthalocyanine (CuPc) heterojunction unit, fabricated at different substrate temperatures, as a buffer layer. The highest mobility of 4.08 cm2/Vs is achieved using an F16CuPc/CuPc organic heterojunction buffer layer fabricated at high substrate temperature. Compared with the random small grain-like morphology of the room-temperature buffer layer, the high-temperature organic heterojunction presents a large-sized fiber-like film morphology, resulting in an enhanced conductivity. Thus the contact resistance of the transistor is significantly reduced and an obvious improvement in device mobility is obtained. (cross-disciplinary physics and related areas of science and technology)

  11. Significant improvement of optical traps by tuning standard water immersion objectives

    International Nuclear Information System (INIS)

    Reihani, S Nader S; Mir, Shahid A; Richardson, Andrew C; Oddershede, Lene B

    2011-01-01

    Focused infrared lasers are widely used for micromanipulation and visualization of biological specimens. An inherent practical problem is that off-the-shelf commercial microscope objectives are designed for use with visible rather than infrared wavelengths. Less aberration is introduced by water immersion objectives than by oil immersion ones; however, even water immersion objectives induce significant aberration. We present a simple method to reduce the spherical aberration induced by water immersion objectives, namely by tuning the correction collar of the objective to a value ∼10% lower than the physical thickness of the coverslip. This results in marked improvements in the optical trapping strength of a standard microscope objective designed for use in the visible range, of up to 100% laterally and 600% axially. The results are generally valid for any water immersion objective with any numerical aperture.

  12. Significant improvement in the electrical characteristics of Schottky barrier diodes on molecularly modified Gallium Nitride surfaces

    Science.gov (United States)

    Garg, Manjari; Naik, Tejas R.; Pathak, C. S.; Nagarajan, S.; Rao, V. Ramgopal; Singh, R.

    2018-04-01

    III-nitride semiconductors face the issue of localized surface states, which cause Fermi-level pinning and large leakage currents at the metal-semiconductor interface, thereby degrading device performance. In this work, we have demonstrated the use of a Self-Assembled Monolayer (SAM) of organic molecules to improve the electrical characteristics of Schottky barrier diodes (SBDs) on n-type Gallium Nitride (n-GaN) epitaxial films. The electrical characteristics of the diodes were improved by adsorption of a SAM of hydroxyl-phenyl metallated porphyrin organic molecules (Zn-TPPOH) onto the surface of n-GaN. SAM-semiconductor bonding via the native oxide on the n-GaN surface was confirmed using X-ray photoelectron spectroscopy measurements. Surface morphology and surface electronic properties were characterized using atomic force microscopy and Kelvin probe force microscopy. Current-voltage characteristics of different metal (Cu, Ni) SBDs on bare n-GaN were compared with those of Cu/Zn-TPPOH/n-GaN and Ni/Zn-TPPOH/n-GaN SBDs. It was found that due to the molecular monolayer, the surface potential of n-GaN was decreased by ∼350 mV. This caused an increase in the Schottky barrier height of the Cu and Ni SBDs from 1.13 eV to 1.38 eV and from 1.07 eV to 1.22 eV, respectively. In addition, the reverse bias leakage current was reduced by 3-4 orders of magnitude for both Cu and Ni SBDs. Such a significant improvement in the electrical performance of the diodes can be very useful for better device functioning.

  13. pEPito: a significantly improved non-viral episomal expression vector for mammalian cells

    Directory of Open Access Journals (Sweden)

    Ogris Manfred

    2010-03-01

    Full Text Available Abstract Background The episomal replication of the prototype vector pEPI-1 depends on a transcription unit starting from the constitutively expressed Cytomegalovirus immediate early promoter (CMV-IEP) and directed into a 2000 bp long matrix attachment region sequence (MARS) derived from the human β-interferon gene. The original pEPI-1 vector contains two mammalian transcription units and a total of 305 CpG islands, which are located predominantly within the vector elements necessary for bacterial propagation and known to be counterproductive for persistent long-term transgene expression. Results Here, we report the development of a novel vector, pEPito, which is derived from the pEPI-1 plasmid replicon but has considerably improved efficacy both in vitro and in vivo. The pEPito vector is significantly reduced in size, contains only one transcription unit, and carries 60% fewer CpG motifs than pEPI-1. It exhibits major advantages over the original pEPI-1 plasmid, including higher transgene expression levels and increased colony-forming efficiencies in vitro, as well as more persistent transgene expression profiles in vivo. The performance of pEPito-based vectors was further improved by replacing the CMV-IEP with the human CMV enhancer/human elongation factor 1 alpha promoter (hCMV/EF1P) element, which is known to be less affected by epigenetic silencing events. Conclusions The novel vector pEPito can be considered an improved vector for biotechnological applications in vitro and for non-viral gene delivery in vivo.

  14. Electronic monitoring in combination with direct observation as a means to significantly improve hand hygiene compliance.

    Science.gov (United States)

    Boyce, John M

    2017-05-01

    Monitoring hand hygiene compliance among health care personnel (HCP) is an essential element of hand hygiene promotion programs. Observation by trained auditors is considered the gold standard method for establishing hand hygiene compliance rates. Advantages of observational surveys include the unique ability to establish compliance with all of the Moments of the World Health Organization's "My 5 Moments for Hand Hygiene" initiative and to provide just-in-time coaching. Disadvantages include the resources required for observational surveys, insufficient sample sizes, and nonstandardized methods of conducting observations. Electronic and camera-based systems can monitor hand hygiene performance on all work shifts without a Hawthorne effect and provide significantly more data regarding hand hygiene performance. Disadvantages include the cost of installation, variable accuracy in estimating compliance rates, issues related to acceptance by HCP, insufficient data regarding their cost-effectiveness and influence on health care-related infection rates, and the ability of most systems to monitor only surrogates for Moments 1, 4, and 5. Increasing evidence suggests that monitoring only Moments 1, 4, and 5 provides reasonable estimates of compliance with all 5 Moments. With continued improvement of electronic monitoring systems, combining electronic monitoring with observational methods may provide the best information as part of a multimodal strategy to improve and sustain hand hygiene compliance rates among HCP. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  15. Fabrication of CoZn alloy nanowire arrays: Significant improvement in magnetic properties by annealing process

    International Nuclear Information System (INIS)

    Koohbor, M.; Soltanian, S.; Najafi, M.; Servati, P.

    2012-01-01

    Highlights: ► Increasing the Zn concentration changes the structure of NWs from hcp to amorphous. ► Increasing the Zn concentration significantly reduces the Hc value of NWs. ► Magnetic properties of CoZn NWs can be significantly enhanced by appropriate annealing. ► The pH of the electrolyte has no significant effect on the properties of the NW arrays. ► Deposition frequency has considerable effects on the magnetic properties of NWs. - Abstract: Highly ordered arrays of Co1−xZnx (0 ≤ x ≤ 0.74) nanowires (NWs) with diameters of ∼35 nm and high length-to-diameter ratios (up to 150) were fabricated by co-electrodeposition of Co and Zn into the pores of anodized aluminum oxide (AAO) templates. The Co and Zn contents of the NWs were adjusted by varying the ratio of Zn and Co ion concentrations in the electrolyte. The effects of the Zn content, electrodeposition conditions (frequency and pH) and annealing on the structural and magnetic properties (e.g., coercivity (Hc) and squareness (Sq)) of the NW arrays were investigated using X-ray diffraction (XRD), scanning electron microscopy, electron diffraction, and an alternating gradient force magnetometer (AGFM). XRD patterns reveal that an increase in the concentration of Zn ions in the electrolyte forces the hcp crystal structure of the Co NWs to change into an amorphous phase, resulting in a significant reduction in Hc. It was found that the magnetic properties of the NWs can be significantly improved by an appropriate annealing process. The highest values for Hc (2050 Oe) and Sq (0.98) were obtained for NWs electrodeposited using 0.95/0.05 Co:Zn concentrations at 200 Hz and annealed at 575 °C. While the pH of the electrolyte is found to have no significant effect on the structural and magnetic properties of the NW arrays, the electrodeposition frequency has considerable effects on the magnetic properties. The changes in the magnetic properties of the NWs are rooted in a competition between shape anisotropy and magnetocrystalline anisotropy.

  16. Fabrication of CoZn alloy nanowire arrays: Significant improvement in magnetic properties by annealing process

    Energy Technology Data Exchange (ETDEWEB)

    Koohbor, M. [Department of Physics, University of Kurdistan, Sanandaj (Iran, Islamic Republic of); Soltanian, S., E-mail: s.soltanian@gmail.com [Department of Physics, University of Kurdistan, Sanandaj (Iran, Islamic Republic of); Department of Electrical and Computer Engineering, University of British Columbia, Vancouver (Canada); Najafi, M. [Department of Physics, University of Kurdistan, Sanandaj (Iran, Islamic Republic of); Department of Physics, Hamadan University of Technology, Hamadan (Iran, Islamic Republic of); Servati, P. [Department of Electrical and Computer Engineering, University of British Columbia, Vancouver (Canada)

    2012-01-05

    Highlights: ► Increasing the Zn concentration changes the structure of NWs from hcp to amorphous. ► Increasing the Zn concentration significantly reduces the Hc value of NWs. ► Magnetic properties of CoZn NWs can be significantly enhanced by appropriate annealing. ► The pH of the electrolyte has no significant effect on the properties of the NW arrays. ► Deposition frequency has considerable effects on the magnetic properties of NWs. - Abstract: Highly ordered arrays of Co1−xZnx (0 ≤ x ≤ 0.74) nanowires (NWs) with diameters of ∼35 nm and high length-to-diameter ratios (up to 150) were fabricated by co-electrodeposition of Co and Zn into the pores of anodized aluminum oxide (AAO) templates. The Co and Zn contents of the NWs were adjusted by varying the ratio of Zn and Co ion concentrations in the electrolyte. The effect of the Zn content, electrodeposition conditions (frequency and pH) and annealing on the structural and magnetic properties (e.g., coercivity (Hc) and squareness (Sq)) of NW arrays were investigated using X-ray diffraction (XRD), scanning electron microscopy, electron diffraction, and alternating gradient force magnetometer (AGFM). XRD patterns reveal that an increase in the concentration of Zn ions of the electrolyte forces the hcp crystal structure of Co NWs to change into an amorphous phase, resulting in a significant reduction in Hc. It was found that the magnetic properties of NWs can be significantly improved by appropriate annealing process. The highest values for Hc (2050 Oe) and Sq (0.98) were obtained for NWs electrodeposited using 0.95/0.05 Co:Zn concentrations at 200 Hz and annealed at 575 °C. While the pH of electrolyte is found to have no significant effect on the structural and magnetic properties of the NW arrays, the electrodeposition frequency has considerable effects on the magnetic properties of the NW arrays.

  17. Compression stockings significantly improve hemodynamic performance in post-thrombotic syndrome irrespective of class or length.

    Science.gov (United States)

    Lattimer, Christopher R; Azzam, Mustapha; Kalodiki, Evi; Makris, Gregory C; Geroulakos, George

    2013-07-01

    Graduated elastic compression (GEC) stockings have been demonstrated to reduce the morbidity associated with post-thrombotic syndrome. The ideal length or compression strength required to achieve this is speculative and related to physician preference and patient compliance. The aim of this study was to evaluate the hemodynamic performance of four different stockings and determine the patients' preference. Thirty-four consecutive patients (40 legs, 34 male) with post-thrombotic syndrome were tested with four different stockings (Mediven plus open toe, Bayreuth, Germany) of their size in random order: class I (18-21 mm Hg) and class II (23-32 mm Hg), below-knee (BK) and above-knee thigh-length (AK). The median age, Venous Clinical Severity Score, Venous Segmental Disease Score, and Villalta scale were 62 years (range, 31-81 years), 8 (range, 1-21), 5 (range, 2-10), and 10 (range, 2-22), respectively. The clinical class (C) of the CEAP classification (C0-6, Es, As,d,p, Pr,o) was C0 = 2, C2 = 1, C3 = 3, C4a = 12, C4b = 7, C5 = 12, C6 = 3. Obstruction and reflux were observed on duplex in 47.5% of legs, with deep venous reflux alone in 45%. Air plethysmography was used to measure the venous filling index (VFI), venous volume, and time to fill 90% of the venous volume. Direct pressure measurements were obtained while lying and standing using the PicoPress device (Microlab Elettronica, Nicolò, Italy). The pressure sensor was placed underneath the test stocking 5 cm above and 2 cm posterior to the medial malleolus. At the end of the study session, patients stated their preferred stocking based on comfort. The VFI, venous volume, and time to fill 90% of the venous volume improved significantly with all types of stocking versus no compression. In class I, the VFI (mL/s) improved from a median of 4.9 (range, 1.7-16.3) without compression to 3.7 (range, 0-14) BK (24.5%) and 3.6 (range, 0.6-14.5) AK (26.5%). With class II, the corresponding improvement was to 4.0 (range, 0.3-16.2) BK (18.8%) and 3.7 (range, 0.5-14.2) AK (24

  18. Significant improvement of intestinal microbiota of gibel carp (Carassius auratus gibelio) after traditional Chinese medicine feeding.

    Science.gov (United States)

    Wu, Z B; Gatesoupe, F-J; Li, T T; Wang, X H; Zhang, Q Q; Feng, D Y; Feng, Y Q; Chen, H; Li, A H

    2018-03-01

    Intestinal microbiota have attracted increasing attention because of their interactions with the nutrition, metabolism and immune defence of the host. Traditional Chinese medicine (TCM) feed additives have been applied in aquaculture to improve fish health, but their interaction with the fish gut microbiota is still poorly understood. This study aimed to explore the effect of adding TCM to feed on the intestinal microbiota of gibel carp (Carassius auratus gibelio). The bacterial communities of 16 fish intestinal contents and one water sample were characterized by high-throughput sequencing and analysis of the V4-V5 region of the 16S rRNA gene. The results showed that the composition and structure of the bacterial community were significantly altered by TCM feeding. Some phyla increased markedly (Proteobacteria, Actinobacteria, Acidobacteria, etc.), while Fusobacteria were significantly reduced. Concurrently, the richness and diversity of the taxonomic units increased, and the microbiota composition of TCM-treated fish was more homogeneous among individuals. At the genus level, the addition of TCM tended to reduce the incidence of potential pathogens (Aeromonas, Acinetobacter and Shewanella), while stimulating the emergence of some potential probiotics (Lactobacillus, Lactococcus, Bacillus and Pseudomonas). These data suggest that the feed additive can regulate the fish intestinal microbiota by reinforcing the microbial balance. This study may provide useful information for the further application of TCM for disease prevention and stress management in aquaculture. © 2017 The Society for Applied Microbiology.
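
    A minimal sketch of the kind of diversity summary behind statements such as "the richness and diversity of the taxonomic units increased": the Shannon index computed from per-sample taxon counts. The count table is hypothetical, not the study's sequencing data.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon index H = -sum(p * ln p) over taxa with nonzero counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

control_sample = [900, 50, 30, 15, 5]    # community dominated by one taxon
tcm_sample = [300, 250, 200, 150, 100]   # more even community after treatment

print(f"control H = {shannon_diversity(control_sample):.2f}")
print(f"TCM-fed H = {shannon_diversity(tcm_sample):.2f}")
```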

  19. Significant improvement in one-dimensional cursor control using Laplacian electroencephalography over electroencephalography

    Science.gov (United States)

    Boudria, Yacine; Feltane, Amal; Besio, Walter

    2014-06-01

    Objective. Brain-computer interfaces (BCIs) based on electroencephalography (EEG) have been shown to accurately detect mental activities, but the acquisition of high levels of control requires extensive user training. Furthermore, EEG has a low signal-to-noise ratio and low spatial resolution. The objective of the present study was to compare the accuracy between two types of BCIs during the first recording session. EEG and tripolar concentric ring electrode (TCRE) EEG (tEEG) brain signals were recorded and used to control one-dimensional cursor movements. Approach. Eight human subjects were asked to imagine either ‘left’ or ‘right’ hand movement during one recording session to control the computer cursor using TCRE and disc electrodes. Main results. The obtained results show a significant improvement in accuracy using TCREs (44%-100%) compared to disc electrodes (30%-86%). Significance. This study developed the first tEEG-based BCI system for real-time one-dimensional cursor movements and showed high accuracies with little training.

  20. Induction-heating MOCVD reactor with significantly improved heating efficiency and reduced harmful magnetic coupling

    KAUST Repository

    Li, Kuang-Hui; Alotaibi, Hamad S.; Sun, Haiding; Lin, Ronghui; Guo, Wenzhe; Torres-Castanedo, Carlos G.; Liu, Kaikai; Galan, Sergio V.; Li, Xiaohang

    2018-01-01

    In a conventional induction-heating III-nitride metalorganic chemical vapor deposition (MOCVD) reactor, the induction coil is outside the chamber. Therefore, the magnetic field does not couple well with the susceptor, leading to compromised heating efficiency and harmful coupling with the gas inlet and thus possible overheating. Hence, the gas inlet has to be kept at a minimum distance from the susceptor. Because of the elongated flow path, premature reactions can be more severe, particularly between Al- and B-containing precursors and NH3. Here, we propose a structure that can significantly improve the heating efficiency and allow the gas inlet to be closer to the susceptor. Specifically, the induction coil is designed to surround the vertical cylinder of a T-shaped susceptor comprising the cylinder and a top horizontal plate holding the wafer substrate within the reactor. The cylinder therefore couples most of the magnetic field and serves as the thermal source for the plate. Furthermore, the plate can block, and thus significantly reduce, the uncoupled magnetic field above the susceptor, thereby allowing the gas inlet to be closer. The results show an approximately 140% increase in heating efficiency, a 2.6-times increase in susceptor coupling efficiency, and a 90% reduction in the harmful magnetic flux at the gas inlet.

  2. Nicotine Significantly Improves Chronic Stress-Induced Impairments of Cognition and Synaptic Plasticity in Mice.

    Science.gov (United States)

    Shang, Xueliang; Shang, Yingchun; Fu, Jingxuan; Zhang, Tao

    2017-08-01

    The aim of this study was to examine whether nicotine was able to improve cognitive deficits in a mouse model of chronic mild stress. Twenty-four male C57BL/6 mice were divided into three groups: control, stress, and stress with nicotine treatment. The animal model was established by combining chronic unpredictable mild stress (CUMS) and isolated feeding. Mice were exposed to CUMS for 28 days, while nicotine (0.2 mg/kg) was also administered for 28 days. Weight and sucrose consumption were measured during the model-establishment period. Anxiety and behavioral despair were analyzed using the forced swim test (FST) and open-field test (OFT). Spatial cognition was evaluated using the Morris water maze (MWM) test. Following behavioral assessment, both long-term potentiation (LTP) and depotentiation (DEP) were recorded in the hippocampal dentate gyrus (DG) region. Both synaptic and Notch1 proteins were measured by Western blot. Nicotine increased the stressed mice's sucrose consumption. The MWM test showed that spatial learning and reversal learning in stressed animals were markedly impaired relative to controls, whereas nicotine partially rescued cognitive functions. Additionally, nicotine considerably alleviated the level of anxiety and the degree of behavioral despair in stressed mice. It effectively mitigated the depression-induced impairment of hippocampal synaptic plasticity, in which both LTP and DEP were significantly inhibited in stressed mice. Moreover, nicotine enhanced the expression of synaptic and Notch1 proteins in stressed animals. The results suggest that nicotine ameliorates the depression-like symptoms and improves hippocampal synaptic plasticity in close association with the activation of transmembrane ion channel receptors and Notch signaling components.

  3. Significant improvements in stability and reproducibility of atomic-scale atomic force microscopy in liquid

    International Nuclear Information System (INIS)

    Akrami, S M R; Nakayachi, H; Fukuma, T; Watanabe-Nakayama, T; Asakawa, H

    2014-01-01

    Recent advances in dynamic-mode atomic force microscopy (AFM) for liquid-environment applications have enabled atomic-scale studies of various interfacial phenomena. However, instabilities and poor reproducibility of the measurements often prevent systematic studies. To solve this problem, we have investigated the effect of various tip treatment methods on atomic-scale imaging and force measurements in liquid. The tested methods include Si coating, Ar plasma, Ar sputtering and UV/O3 cleaning. We found that all the methods provide significant improvements in both imaging and force measurements in spite of the tip transfer through air. Among the methods, we found that the Si coating provides the best stability and reproducibility in the measurements. To understand the origin of the fouling resistance of the cleaned tip surface and the differences between the cleaning methods, we investigated the tip surface properties by X-ray photoelectron spectroscopy and contact angle measurements. The results show that the contaminants adsorbed on the tip during transfer through air should desorb from the surface when it is immersed in aqueous solution, owing to the enhanced hydrophilicity produced by the tip treatments. The tip surface prepared by Si coating is oxidized when it is immersed in aqueous solution. This creates local spots where stable hydration structures are formed. For the other methods, there is no active mechanism to create such local hydration sites. Thus, the hydration structure formed under the tip apex is not necessarily stable. These results reveal the desirable tip properties for atomic-scale AFM measurements in liquid, which should serve as a guideline for further improvements of the tip treatment methods. (paper)

  4. Significant improvement of accuracy and precision in the determination of trace rare earths by fluorescence analysis

    International Nuclear Information System (INIS)

    Ozawa, L.; Hersh, H.N.

    1976-01-01

    Most of the rare earths in yttrium, gadolinium and lanthanum oxides emit characteristic fluorescent line spectra under irradiation with photons, electrons and X rays. The sensitivity and selectivity of the rare earth fluorescences are high enough to determine trace amounts (0.01 to 100 ppm) of rare earths. The absolute fluorescent intensities of solids, however, are markedly affected by the synthesis procedure, level of contamination and crystal perfection, resulting in poor accuracy and low precision for the method (errors larger than 50 percent). Special care in the preparation of the samples is required to obtain good accuracy and precision. It is found that the accuracy and precision of the determination of trace (less than 10 ppm) rare earths by fluorescence analysis improve significantly, while the sensitivity is maintained, when the determination is made by comparing the ratio of the fluorescent intensities of the trace rare earths to that of a deliberately added rare earth serving as a reference. The variation in the absolute fluorescent intensity remains, but is compensated for by measuring the fluorescent line intensity ratio. Consequently, the determination of trace rare earths (with less than 3 percent error) is easily made by a photoluminescence technique in which the rare earths are excited directly by photons. Accuracy is still maintained when the absolute fluorescent intensity is reduced by 50 percent through contamination by Ni, Fe, Mn or Pb (about 100 ppm). Determination accuracy is also improved for fluorescence analysis by electron excitation and X-ray excitation. For some rare earths, however, accuracy with these techniques is reduced because indirect excitation mechanisms are involved. The excitation mechanisms and the interferences between rare earths are also reported.
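
    A minimal sketch of the intensity-ratio (internal standard) idea described above: calibrating the ratio of analyte to reference fluorescence intensity cancels the sample-to-sample variation in absolute intensity. The calibration values are hypothetical.

```python
import numpy as np

# Calibration standards: known analyte concentrations (ppm) measured against a
# fixed, deliberately added reference rare earth; absolute intensities vary
# with synthesis and contamination, but their ratio tracks concentration.
conc_std = np.array([0.1, 0.5, 1.0, 5.0, 10.0])       # ppm
ratio_std = np.array([0.02, 0.11, 0.21, 1.04, 2.05])  # I_analyte / I_reference

slope, intercept = np.polyfit(ratio_std, conc_std, 1)

ratio_sample = 0.55  # measured ratio for an unknown sample
print(f"estimated concentration: {slope * ratio_sample + intercept:.2f} ppm")
```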

  5. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling of off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement.

  6. An Improvement of the Hotelling T2 Statistic in Monitoring Multivariate Quality Characteristics

    Directory of Open Access Journals (Sweden)

    Ashkan Shabbak

    2012-01-01

    Full Text Available The Hotelling T2 statistic is the most popular statistic used in multivariate control charts to monitor multiple quality characteristics. However, this statistic is easily affected by the existence of more than one outlier in the data set. To rectify this problem, robust control charts based on the minimum volume ellipsoid and the minimum covariance determinant have been proposed. Most researchers assess the performance of multivariate control charts based on the number of signals, without paying much attention to whether those signals are really outliers. We propose instead to evaluate control charts not only on the number of detected outliers but also on whether their positions are correctly identified. In this paper, an Upper Control Limit based on the median and the median absolute deviation is also proposed. The results of this study show that the proposed Upper Control Limit improves the detection of correct outliers but suffers from a swamping effect when the positions of outliers are not taken into consideration. Finally, a robust control chart based on the diagnostic robust generalised potential procedure is introduced to remedy this drawback.
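
    A minimal sketch of the Hotelling T2 statistic for individual multivariate observations, together with a simple median/MAD-based upper control limit in the spirit of the proposal above; the authors' exact robust construction is not reproduced here, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=[0, 0], cov=[[1, 0.5], [0.5, 2]], size=100)
X[:3] += 6.0  # plant a few outliers

# Classical T^2_i = (x_i - xbar)' S^-1 (x_i - xbar) for each observation
xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - xbar
t2 = np.einsum("ij,jk,ik->i", diff, S_inv, diff)

# Robust upper control limit from the median and the median absolute deviation
mad = np.median(np.abs(t2 - np.median(t2)))
ucl = np.median(t2) + 3.0 * 1.4826 * mad  # 1.4826 rescales MAD to a sigma
print("flagged observations:", np.where(t2 > ucl)[0])
```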

  7. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and aim to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors of the initiation of basal insulin supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, differing follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for the improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses involve a trade-off between data availability, clinical plausibility, and statistical feasibility. Cox's proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
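
    A minimal sketch of the time-to-event approach advocated here, using the lifelines package; the register extract and its column names (time_to_insulin, insulin_started, age, baseline_hba1c) are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_to_insulin": [12.0, 30.5, 7.2, 24.0, 18.3, 40.1, 15.6, 9.8],  # months
    "insulin_started": [1, 0, 1, 0, 1, 0, 0, 1],        # event indicator
    "age":             [55, 63, 49, 71, 58, 66, 60, 52],
    "baseline_hba1c":  [8.1, 7.2, 9.0, 6.9, 8.4, 7.5, 7.8, 8.8],
})

# Cox's proportional hazards model respects censoring and differing follow-up
# times, which a simple case-control comparison would ignore.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_insulin", event_col="insulin_started")
cph.print_summary()
```

    Delayed study entries can be handled with the entry_col argument of fit, and competing events call for cause-specific hazard models rather than a single Cox fit.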

  8. Linnorm: improved statistical analysis for single cell RNA-seq expression data.

    Science.gov (United States)

    Yip, Shun H; Wang, Panwen; Kocher, Jean-Pierre A; Sham, Pak Chung; Wang, Junwen

    2017-12-15

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noise and simultaneously preserve biological variation in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Flavonol-rich dark cocoa significantly decreases plasma endothelin-1 and improves cognition in urban children.

    Science.gov (United States)

    Calderón-Garcidueñas, Lilian; Mora-Tiscareño, Antonieta; Franco-Lira, Maricela; Cross, Janet V; Engle, Randall; Aragón-Flores, Mariana; Gómez-Garza, Gilberto; Jewells, Valerie; Medina-Cortina, Humberto; Solorio, Edelmira; Chao, Chih-Kai; Zhu, Hongtu; Mukherjee, Partha S; Ferreira-Azevedo, Lara; Torres-Jardón, Ricardo; D'Angiulli, Amedeo

    2013-01-01

    Air pollution exposures are linked to systemic inflammation, cardiovascular and respiratory morbidity and mortality, neuroinflammation and neuropathology in young urbanites. In particular, most Mexico City Metropolitan Area (MCMA) children exhibit subtle cognitive deficits, and neuropathology studies show 40% of them exhibiting frontal tau hyperphosphorylation and 51% amyloid-β diffuse plaques (compared to 0% in low-pollution control children). We assessed whether a short cocoa intervention can be effective in decreasing plasma endothelin-1 (ET-1) and/or inflammatory mediators in MCMA children. Thirty grams of dark cocoa with 680 mg of total flavonols were given daily for 10.11 ± 3.4 days (range 9-24 days) to 18 children (10.55 years, SD = 1.45; 11F/7M). Key metabolite ratios in frontal white matter and in the hippocampus before and during the cocoa intervention were quantified by magnetic resonance spectroscopy. ET-1 significantly decreased after cocoa treatment (p = 0.0002). Fifteen children (83%) showed a marginally significant individual improvement in one or both of the applied simple short memory tasks. Endothelial dysfunction is a key feature of exposure to particulate matter (PM), and decreased endothelin-1 bioavailability is likely beneficial for brain function in the context of air pollution. Our findings suggest that cocoa interventions may be critical for the early implementation of neuroprotection in highly exposed urban children. Multi-domain nutraceutical interventions could limit the risk for endothelial dysfunction, cerebral hypoperfusion, neuroinflammation, cognitive deficits, structural volumetric detrimental brain effects, and the early development of the neuropathological hallmarks of Alzheimer's and Parkinson's diseases.

  10. Model training across multiple breeding cycles significantly improves genomic prediction accuracy in rye (Secale cereale L.).

    Science.gov (United States)

    Auinger, Hans-Jürgen; Schönleben, Manfred; Lehermeier, Christina; Schmidt, Malthe; Korzun, Viktor; Geiger, Hartwig H; Piepho, Hans-Peter; Gordillo, Andres; Wilde, Peer; Bauer, Eva; Schön, Chris-Carolin

    2016-11-01

    Genomic prediction accuracy can be significantly increased by model calibration across multiple breeding cycles as long as selection cycles are connected by common ancestors. In hybrid rye breeding, application of genome-based prediction is expected to increase selection gain because of long selection cycles in population improvement and development of hybrid components. Essentially two prediction scenarios arise: (1) prediction of the genetic value of lines from the same breeding cycle in which model training is performed and (2) prediction of lines from subsequent cycles. It is the latter from which a reduction in cycle length, and consequently the strongest impact on selection gain, is expected. We empirically investigated genome-based prediction of grain yield, plant height and thousand kernel weight within and across four selection cycles of a hybrid rye breeding program. Prediction performance was assessed using genomic and pedigree-based best linear unbiased prediction (GBLUP and PBLUP). A total of 1040 S2 lines were genotyped with 16k SNPs and each year testcrosses of 260 S2 lines were phenotyped in seven or eight locations. The performance gap between GBLUP and PBLUP increased significantly for all traits when model calibration was performed on aggregated data from several cycles. Prediction accuracies obtained from cross-validation were in the order of 0.70 for all traits when data from all cycles (NCS = 832) were used for model training and exceeded within-cycle accuracies in all cases. As long as selection cycles are connected by a sufficient number of common ancestors and prediction accuracy has not reached a plateau when increasing sample size, aggregating data from several preceding cycles is recommended for predicting genetic values in subsequent cycles despite decreasing relatedness over time.
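
    A minimal GBLUP sketch under simplifying assumptions: every line is both genotyped and phenotyped, and the variance ratio lambda = sigma_e^2/sigma_u^2 is treated as known instead of being estimated by REML. Marker data are simulated; this illustrates the method class, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_lines, n_snps = 200, 1000
M = rng.integers(0, 3, size=(n_lines, n_snps)).astype(float)  # 0/1/2 genotypes

# VanRaden genomic relationship matrix from centered genotypes
p = M.mean(axis=0) / 2.0
W = M - 2.0 * p
G = W @ W.T / (2.0 * np.sum(p * (1.0 - p)))

# Simulate additive breeding values from the markers, then noisy phenotypes
beta = rng.normal(scale=1.0 / np.sqrt(n_snps), size=n_snps)
true_u = W @ beta
y = 10.0 + true_u + rng.normal(scale=true_u.std(), size=n_lines)

# BLUP of breeding values: u_hat = G (G + lambda I)^-1 (y - mean(y))
lam = 1.0  # assumed sigma_e^2 / sigma_u^2
u_hat = G @ np.linalg.solve(G + lam * np.eye(n_lines), y - y.mean())
print("prediction accuracy:", round(float(np.corrcoef(u_hat, true_u)[0, 1]), 2))
```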

  11. Significantly improving trace thallium removal from surface waters during coagulation enhanced by nanosized manganese dioxide.

    Science.gov (United States)

    Huangfu, Xiaoliu; Ma, Chengxue; Ma, Jun; He, Qiang; Yang, Chun; Jiang, Jin; Wang, Yaan; Wu, Zhengsong

    2017-02-01

    Thallium (Tl) is an element of high toxicity that accumulates significantly in the human body. There is an urgent need for the development of appropriate strategies for trace Tl removal in drinking water treatment plants. In this study, the efficiency and mechanism of trace Tl (0.5 μg/L) removal by conventional coagulation enhanced by nanosized manganese dioxide (nMnO2) were explored in simulated water and two representative surface waters (a river water and a reservoir water obtained from Northeast China). Experimental results showed that nMnO2 significantly improved Tl(I) removal from the selected waters. The removal efficiency was dramatically higher in the simulated water, as demonstrated by a residual Tl concentration of less than 0.1 μg/L. The enhancement of trace Tl removal in the surface waters decreased to a certain extent. Both adjusting the water pH to alkaline conditions and preoxidation of Tl(I) to Tl(III) benefit trace Tl removal from surface waters. The data also indicated that the competing cation Ca2+ decreased the efficiency of trace Tl removal, resulting from the reduction of Tl adsorption on nMnO2. Humic acid could largely lower the Tl removal efficiency during nMnO2-enhanced coagulation processes. Trace Tl first adsorbed onto nMnO2 and was then removed along with the settling nMnO2. The information obtained in the present study may provide a potential strategy for drinking water treatment plants threatened by trace Tl. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A tale of two audits: statistical process control for improving diabetes care in primary care settings.

    Science.gov (United States)

    Al-Hussein, Fahad Abdullah

    2008-01-01

    Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving the management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs - those measuring clinical outcomes, process of care, or both. King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer, and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts (p charts) displayed for all employees. Data extraction, archiving, entry, analysis, plotting, and the design and preparation of p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. Audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year of repeated process audits and feedback. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.
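
    A minimal sketch of the p chart used in these audits: each fortnightly sample reviews n = 30 files and records the proportion judged non-conforming, with standard 3-sigma binomial control limits. The audit proportions below are hypothetical.

```python
import numpy as np

n = 30  # files sampled per audit
p = np.array([0.20, 0.17, 0.23, 0.13, 0.17, 0.10, 0.13, 0.10, 0.07, 0.10])

p_bar = p.mean()                            # center line
sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)  # binomial standard error
ucl = p_bar + 3.0 * sigma
lcl = max(p_bar - 3.0 * sigma, 0.0)         # proportions cannot be negative

print(f"center = {p_bar:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
print("out-of-control audits:", np.where((p > ucl) | (p < lcl))[0])
```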

  13. Significant effect of Ca2+ on improving the heat resistance of lactic acid bacteria.

    Science.gov (United States)

    Huang, Song; Chen, Xiao Dong

    2013-07-01

    The heat resistance of lactic acid bacteria (LAB) has been extensively investigated owing to its high practical significance. Reconstituted skim milk (RSM) has been found to be one of the most effective protective wall materials for microencapsulating microorganisms during convective drying, such as spray drying. In addition to proteins and carbohydrate, RSM is rich in calcium. It is not clear which component is critical in the RSM protection mechanism. This study investigated the independent effect of calcium. Ca(2+) was added to lactose solution to examine its influence on the heat resistance of Lactobacillus rhamnosus ZY, Lactobacillus casei Zhang, Lactobacillus plantarum P8 and Streptococcus thermophilus ND03. The results showed that certain Ca(2+) concentrations enhanced the heat resistance of the LAB strains to different extents, that is, they produced higher survival and shorter regrowth lag times of the bacterial cells. In some cases, the improvements were dramatic. More scientifically insightful and more intensive instrumental study of the behavior of Ca(2+) around and in the cells should be carried out in the near future. In the meantime, this work may lead to the development of more cost-effective wall materials with Ca(2+) added as a prime factor. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  14. Targeting Heparin to Collagen within Extracellular Matrix Significantly Reduces Thrombogenicity and Improves Endothelialization of Decellularized Tissues.

    Science.gov (United States)

    Jiang, Bin; Suen, Rachel; Wertheim, Jason A; Ameer, Guillermo A

    2016-12-12

    Thrombosis within small-diameter vascular grafts limits the development of bioartificial, engineered vascular conduits, especially those derived from extracellular matrix (ECM). Here we describe an easy-to-implement strategy to chemically modify vascular ECM by covalently linking a collagen binding peptide (CBP) to heparin to form a heparin derivative (CBP-heparin) that selectively binds a subset of collagens. Modification of ECM with CBP-heparin leads to increased deposition of functional heparin (by ∼7.2-fold measured by glycosaminoglycan composition) and a corresponding reduction in platelet binding (>70%) and whole blood clotting (>80%) onto the ECM. Furthermore, addition of CBP-heparin to the ECM stabilizes long-term endothelial cell attachment to the lumen of ECM-derived vascular conduits, potentially through recruitment of heparin-binding growth factors that ultimately improve the durability of endothelialization in vitro. Overall, our findings provide a simple yet effective method to increase deposition of functional heparin on the surface of ECM-based vascular grafts and thereby minimize thrombogenicity of decellularized tissue, overcoming a significant challenge in tissue engineering of bioartificial vessels and vascularized organs.

  15. Significant Improvement in Chronic Persistent Headaches Caused by Small Rathke Cleft Cysts After Transsphenoidal Surgery.

    Science.gov (United States)

    Fukui, Issei; Hayashi, Yasuhiko; Kita, Daisuke; Sasagawa, Yasuo; Oishi, Masahiro; Tachibana, Osamu; Nakada, Mitsutoshi

    2017-03-01

    Rathke cleft cysts (RCCs) are usually asymptomatic and can be managed conservatively. Some patients with RCCs, however, have severe headaches even when the cysts are small enough to be confined to the sella, and these small RCCs have seldom been discussed. This study presents an investigation into the clinical characteristics of small RCCs associated with severe headaches, demonstrating the efficacy and safety of endoscopic transsphenoidal surgery (ETSS) in relieving headaches. In this study, 13 patients with small RCCs underwent surgery, and the Headache Impact Test-6 (HIT-6) score was calculated both pre- and postoperatively to evaluate headache severity. All patients complained of severe headaches, which disturbed their daily life. Most headaches were nonpulsating and localized in the frontal area. Characteristically, 6 patients (46%) experienced severe headaches with sudden onset that continued chronically. The HIT-6 score was 64 on average, meaning the headaches affected daily life severely. After surgical decompression of the cyst, headache in all of the patients improved dramatically and the HIT-6 score decreased significantly to 37, suggesting that the headaches were diminished. No newly developed deficiencies of anterior pituitary lobe function were detected. Postoperative diabetes insipidus occurred in 2 patients, both cases of which were transient. No recurring cysts were found. Severe headaches can develop from small RCCs. In the present study, ETSS was performed on such patients effectively and safely to relieve their headaches. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a

  17. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Directory of Open Access Journals (Sweden)

    Patrick Wessa

    Full Text Available BACKGROUND: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses, which required us to develop a specific-purpose Statistical Learning Environment (SLE based on Reproducible Computing and newly developed Peer Review (PR technology. OBJECTIVES: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. METHODS: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. RESULTS: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student

  18. Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific–purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under

  19. Combining super-ensembles and statistical emulation to improve a regional climate and vegetation model

    Science.gov (United States)

    Hawkins, L. R.; Rupp, D. E.; Li, S.; Sarah, S.; McNeall, D. J.; Mote, P.; Betts, R. A.; Wallom, D.

    2017-12-01

    Changing regional patterns of surface temperature, precipitation, and humidity may cause ecosystem-scale changes in vegetation, altering the distribution of trees, shrubs, and grasses. A changing vegetation distribution, in turn, alters the albedo, latent heat flux, and carbon exchanged with the atmosphere, with resulting feedbacks onto the regional climate. However, a wide range of earth-system processes that affect the carbon, energy, and hydrologic cycles occur at sub-grid scales in climate models and must be parameterized. The appropriate parameter values in such parameterizations are often poorly constrained, leading to uncertainty in predictions of how the ecosystem will respond to changes in forcing. To better understand the sensitivity of regional climate to parameter selection and to improve regional climate and vegetation simulations, we used a large perturbed-physics ensemble and a suite of statistical emulators. We dynamically downscaled a super-ensemble (multiple parameter sets and multiple initial conditions) of global climate simulations using the 25-km resolution regional climate model HadRM3p with the land-surface scheme MOSES2 and dynamic vegetation module TRIFFID. We simultaneously perturbed land surface parameters relating to the exchange of carbon, water, and energy between the land surface and atmosphere in a large super-ensemble of regional climate simulations over the western US. Statistical emulation was used as a computationally cost-effective tool to explore uncertainties in interactions. Regions of parameter space that did not satisfy observational constraints were eliminated, and an ensemble of parameter sets that reduce regional biases and span a range of plausible interactions among earth system processes was selected. This study demonstrated that by combining super-ensemble simulations with statistical emulation, simulations of regional climate could be improved while simultaneously accounting for a range of plausible land
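
    A minimal sketch of statistical emulation as used here: fit a Gaussian process to a small ensemble of (parameter set, model output) pairs, then screen a dense sample of parameter space against an observational constraint. The parameter dimensions, toy response, and target value are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(7)
# Ensemble design: 40 model runs over two land-surface parameters in [0, 1]
theta = rng.uniform(size=(40, 2))
# Stand-in for an expensive regional-climate output (e.g., a summer bias)
y = np.sin(3.0 * theta[:, 0]) + 0.5 * theta[:, 1] ** 2 + rng.normal(0, 0.05, 40)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.3, 0.3]),
                              normalize_y=True).fit(theta, y)

# Emulate the response over many candidate parameter sets (cheap) and keep
# those consistent with an "observed" target within the emulator uncertainty
candidates = rng.uniform(size=(100_000, 2))
mean, std = gp.predict(candidates, return_std=True)
plausible = candidates[np.abs(mean - 0.8) < 2.0 * std]
print(f"{len(plausible)} of {len(candidates)} candidates remain plausible")
```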

  20. Successful percutaneous coronary intervention significantly improves coronary sinus blood flow as assessed by transthoracic echocardiography.

    Science.gov (United States)

    Lyubarova, Radmila; Boden, William E; Fein, Steven A; Schulman-Marcus, Joshua; Torosoff, Mikhail

    2018-06-01

    Transthoracic echocardiography (TTE) has been used to assess coronary sinus blood flow (CSBF), which reflects total coronary arterial blood flow. Successful angioplasty is expected to improve coronary arterial blood flow. Changes in CSBF after percutaneous coronary intervention (PCI), as assessed by TTE, have not been systematically evaluated. TTE can be utilized to reflect increased CSBF after a successful, clinically indicated PCI. The study cohort included 31 patients (18 females, 62 ± 11 years old) referred for diagnostic cardiac catheterization for suspected coronary artery disease and possible PCI, when clinically indicated. All performed PCIs were successful, with good angiographic outcomes. CSBF per cardiac cycle (mL/beat) was measured using transthoracic two-dimensional and Doppler flow imaging as the product of coronary sinus (CS) area and the CS flow time-velocity integral. CSBF per minute (mL/min) was calculated as the product of heart rate and CSBF per cardiac cycle. In each patient, CSBF was assessed prospectively, before and after cardiac catheterization, with and without clinically indicated PCI. Within- and between-group differences in CSBF before and after PCI were assessed using repeated-measures analysis of variance. Technically adequate CSBF measurements were obtained in 24 patients (77%). In patients who did not undergo PCI, there was no significant change in CSBF (278.1 ± 344.1 versus 342.7 ± 248.5 mL/min, p = 0.36). By contrast, among patients who underwent PCI, CSBF increased significantly (254.3 ± 194.7 versus 618.3 ± 358.5 mL/min, p < 0.01, p-interaction = 0.03). Other hemodynamic and echocardiographic parameters did not change significantly before and after cardiac catheterization in either treatment group. Transthoracic echocardiographic assessment can be employed to document CSBF changes after angioplasty. Future studies are needed to explore the clinical utility of this noninvasive metric.
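
    A minimal sketch of the CSBF computation described above: per-beat flow is the product of the coronary sinus cross-sectional area and the Doppler time-velocity integral, and per-minute flow multiplies by heart rate. The example measurements are hypothetical.

```python
import math

def csbf_per_minute(cs_diameter_cm: float, tvi_cm: float, heart_rate_bpm: float) -> float:
    """Return coronary sinus blood flow in mL/min from echo measurements."""
    area_cm2 = math.pi * (cs_diameter_cm / 2.0) ** 2  # CS cross-sectional area
    per_beat_ml = area_cm2 * tvi_cm                   # mL per cardiac cycle
    return per_beat_ml * heart_rate_bpm

# Example: 0.9 cm sinus diameter, 12 cm time-velocity integral, 70 bpm
print(f"{csbf_per_minute(0.9, 12.0, 70.0):.0f} mL/min")  # ~534 mL/min
```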

  1. High-intensity interval training (swimming) significantly improves the adverse metabolism and comorbidities in diet-induced obese mice.

    Science.gov (United States)

    Motta, Victor F; Aguila, Marcia B; Mandarim-de-Lacerda, Carlos A

    2016-05-01

    Controlling obesity and other comorbidities in the population is a challenge in modern society. High-intensity interval training (HIIT) combines short periods of high-intensity exercise with long recovery periods or low-intensity exercise. The aim was to assess the impact of HIIT in the context of diet-induced obesity in an animal model. C57BL/6 mice were fed one of two diets: standard chow (lean group [LE]) or a high-fat diet (obese group [OB]). After twelve weeks, the animals were divided into non-trained groups (LE-NT and OB-NT) and trained groups (LE-T and OB-T), and began an exercise protocol. For biochemical analysis of the inflammatory and lipid profiles, we used a colorimetric enzymatic method and an automatic spectrophotometer. One-way ANOVA with the Holm-Sidak post hoc test was used for statistical analysis of the experimental groups. Two-way ANOVA was used to analyze interactions between diet and the HIIT protocol. HIIT led to significant reductions in body mass and blood glucose, and improved glucose tolerance and the hepatic lipid profile, in T-groups compared with NT-groups. HIIT reduced plasma levels of inflammatory cytokines. Additionally, HIIT improved insulin immunodensity in the islets and reduced adiposity and hepatic steatosis in the T-groups. HIIT increased beta-oxidation and peroxisome proliferator-activated receptor (PPAR)-alpha and reduced lipogenesis and PPAR-gamma levels in the liver. In skeletal muscle, HIIT increased PPAR-alpha and glucose transporter-4 and reduced PPAR-gamma levels. Overall, HIIT attenuated the adverse effects caused by chronic ingestion of a high-fat diet.
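
    A minimal sketch of the two-way ANOVA step named above, assuming hypothetical body-mass values for the four diet-by-training groups (the numbers are invented, not the study's data); the library is statsmodels.

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    df = pd.DataFrame({
        "diet":  ["LE"] * 4 + ["OB"] * 4,
        "train": ["NT", "NT", "T", "T"] * 2,
        "mass":  [27.1, 26.8, 25.9, 25.5, 42.3, 41.7, 35.2, 34.8],  # grams, invented
    })

    # Two-way ANOVA with a diet x training interaction term.
    model = ols("mass ~ C(diet) * C(train)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))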

  2. Heat storage in forest biomass significantly improves energy balance closure particularly during stable conditions

    Science.gov (United States)

    Lindroth, A.; Mölder, M.; Lagergren, F.

    2009-08-01

    Temperature measurements in trunks and branches in a mature, ca. 100-year-old mixed pine and spruce forest in central Sweden were used to estimate heat storage in the tree biomass. The estimated heat flux in the sample trees and data on biomass distributions were used to scale up to stand-level biomass heat fluxes. The rate of change of sensible and latent heat storage in the air layer below the level of the flux measurements was estimated from air temperature and humidity profile measurements, and soil heat flux was estimated from heat flux plates and soil temperature measurements. The fluxes of sensible and latent heat from the forest were measured with an eddy covariance system in a tower. The analysis was made for a two-month period in the summer of 1995. The tree biomass heat flux was the largest of the estimated storage components and varied between 40 and -35 W m-2 on fair-weather summer days. Averaged over the two months, the diurnal maximum of total heat storage was 45 W m-2 and the minimum was -35 W m-2. The soil heat flux and the sensible heat storage in air were out of phase with the biomass flux, and they reached maximum values that were about 75% of the maximum of the tree biomass heat storage. The energy balance closure improved significantly when the total heat storage was added to the turbulent fluxes. The slope of a regression line, with the sum of fluxes and storage as the independent variable and net radiation as the dependent variable, increased from 0.86 to 0.95 for half-hourly data, and the scatter was also reduced. The most significant finding was, however, that during nights with strongly stable conditions, when the sensible heat flux dropped to nearly zero, the total storage matched the net radiation nearly perfectly. Another interesting result was that the mean energy imbalance started to increase when the Richardson number became more negative than ca. -0.1. In fact, the largest energy deficit occurred at maximum instability. Our conclusion is that eddy

  3. Significant improvement of eczema with skin care and food elimination in small children.

    Science.gov (United States)

    Norrman, Gunilla; Tomicić, Sara; Böttcher, Malin Fagerås; Oldaeus, Göran; Strömberg, Leif; Fälth-Magnusson, Karin

    2005-10-01

    To evaluate common methods of investigation and treatment in children younger than 2 y of age with eczema, with or without sensitization to food allergens. One hundred and twenty-three children younger than 2 y of age with eczema and suspected food allergy were included in this prospective study. The children underwent skin-prick tests with cow's milk, fresh hen's egg white and wheat. Specific IgE to milk and egg white was analysed. Eczema extent and severity were estimated with SCORAD before and after treatment. Children with a positive skin-prick test were instructed to exclude that food item from their diet. All children were treated with emollients and topical steroids when needed. Sixty-two of the children were skin-prick positive to at least one of the allergens; 62% had mild, 30% moderate and 8% severe eczema at their first visit. After treatment, 90% had mild, 10% moderate and 0% severe eczema. Forty-six per cent of the children had circulating IgE antibodies to milk or egg white. Ten per cent had specific IgE but a negative skin-prick test to the same allergen. This subgroup improved their eczema significantly without an elimination diet. The conventional treatments for children with eczema, i.e., skin care and food elimination, are effective. The beneficial effect of skin care as the first step should not be neglected, and it may not be necessary to eliminate food allergens to relieve skin symptoms in all food-sensitized children with eczema.

  4. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions.

    Directory of Open Access Journals (Sweden)

    Tomislav Hengl

    80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008-2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to agricultural management: organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15-75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time, and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy), especially Alfisols and Mollisols, help improve continental-scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring
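
    A hedged sketch of the model comparison described above, run on synthetic stand-ins for the soil covariates rather than the AfSIS data: both learners are scored with 5-fold cross-validated RMSE using scikit-learn.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 10))                 # hypothetical covariates
    y = X[:, 0] ** 2 + X[:, 1] * X[:, 2] + 0.1 * rng.standard_normal(500)  # nonlinear target

    for name, model in [("linear regression", LinearRegression()),
                        ("random forest", RandomForestRegressor(n_estimators=200,
                                                                random_state=0))]:
        rmse = -cross_val_score(model, X, y, cv=5,
                                scoring="neg_root_mean_squared_error")
        print(f"{name}: mean 5-fold RMSE = {rmse.mean():.3f}")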

  5. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    Science.gov (United States)

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine methods become increasingly inefficient when processing large numbers of files due to excessive computation time and an Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrative examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, benchmarked against traditional single-machine/parallel multiway-merge methods, a message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
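
    As a single-machine analogue of the sorted-merge operation (not the paper's distributed schemas), Python's heapq.merge streams k position-sorted record lists into one globally sorted stream; the distributed designs apply the same divide-and-conquer idea across workers. The (chromosome, position, sample) records here are invented.

    import heapq

    # Hypothetical per-sample variant records, each pre-sorted by (chrom, pos).
    sample_a = [(1, 101, "A"), (1, 250, "A"), (2, 17, "A")]
    sample_b = [(1, 101, "B"), (1, 300, "B"), (2, 20, "B")]
    sample_c = [(1, 99, "C"), (2, 17, "C"), (2, 500, "C")]

    # k-way heap merge: records stream out globally sorted by (chrom, pos).
    for record in heapq.merge(sample_a, sample_b, sample_c):
        print(record)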

  6. Cyclosporin A significantly improves preeclampsia signs and suppresses inflammation in a rat model.

    Science.gov (United States)

    Hu, Bihui; Yang, Jinying; Huang, Qian; Bao, Junjie; Brennecke, Shaun Patrick; Liu, Huishu

    2016-05-01

    Preeclampsia is associated with an increased inflammatory response. Immune suppression might be an effective treatment. The aim of this study was to examine whether Cyclosporin A (CsA), an immunosuppressant, improves clinical characteristics of preeclampsia and suppresses inflammation in a lipopolysaccharide (LPS) induced preeclampsia rat model. Pregnant rats were randomly divided into 4 groups: group 1 (PE) rats each received LPS via tail vein on gestational day (GD) 14; group 2 (PE+CsA5) rats were pretreated with LPS (1.0 μg/kg) on GD 14 and were then treated with CsA (5 mg/kg, i.p.) on GDs 16, 17 and 18; group 3 (PE+CsA10) rats were pretreated with LPS (1.0 μg/kg) on GD 14 and were then treated with CsA (10 mg/kg, i.p.) on GDs 16, 17 and 18; group 4 (pregnant control, PC) rats were treated with the vehicle (saline) used for groups 1, 2 and 3. Systolic blood pressure, urinary albumin, biometric parameters and the levels of serum cytokines were measured on day 20. CsA treatment significantly reduced LPS-induced systolic blood pressure and the mean 24-h urinary albumin excretion. Pro-inflammatory cytokines IL-6, IL-17, IFN-γ and TNF-α were increased in the LPS treatment group but were reduced in the (LPS+CsA) group (P < 0.05). CsA treatment improved preeclampsia signs and attenuated inflammatory responses in the LPS-induced preeclampsia rat model, which suggests that immunosuppressants might be an alternative management option for preeclampsia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The understanding and mitigation of downhole vibration has been a heavily researched subject in the oil industry because it results in more expensive drilling operations: vibrations significantly diminish the amount of effective drilling energy available to the bit and generate forces that can push the bit or the Bottom Hole Assembly (BHA) off its concentric axis of rotation, producing high-magnitude impacts with the borehole wall. In order to drill ahead, a sufficient amount of energy must be supplied by the rig to overcome the resistance of the drilling system, including the reactive torque of the system, drag forces, fluid pressure losses and energy dissipated by downhole vibrations, thereby providing the bit with the energy required to fail the rock. If the drill string enters resonant modes of vibration, not only does it decrease the amount of energy available to drill, but it also increases the potential for catastrophic downhole equipment and drill bit failures. In this sense, the mitigation of downhole vibrations results in faster, smoother, and cheaper drilling operations. A software tool using Finite Element Analysis (FEA) has been developed to provide a better understanding of downhole vibration phenomena in drilling environments. The software tool calculates the response of the drilling system at various input conditions, based on the design of the wellbore along with the geometry of the Bottom Hole Assembly (BHA) and the drill string. It identifies where undesired levels of resonant vibration will be driven by certain combinations of specific drilling parameters, and also which combinations of drilling parameters will result in lower levels of vibration, so that the fewest shocks, the highest penetration rate and the lowest cost per foot can be achieved. With the growing performance of personal computers, complex software systems modeling drilling vibrations using FEA have become accessible to a wider audience of field users, further complementing with real time

  8. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    Science.gov (United States)

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) for different window lengths. However, most real systems are nonlinear, which makes the linear PCA method unable to tackle nonlinearity to a great extent. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this can further improve fault detection performance by reducing the FAR through an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
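
    A simplified stand-in for the exponential-weighting idea (assuming zero-mean, unit-variance residuals under normal operation; this illustrates EWMA-based detection on residuals, not the authors' KPCA-based EWMA-GLRT itself).

    import numpy as np

    rng = np.random.default_rng(2)
    residuals = rng.standard_normal(300)
    residuals[200:] += 1.5             # injected fault: mean shift at t = 200

    lam = 0.2                          # EWMA forgetting factor
    z = np.zeros_like(residuals)
    for t in range(1, len(residuals)):
        z[t] = lam * residuals[t] + (1 - lam) * z[t - 1]

    # For a mean-shift alternative, the detection statistic is proportional to
    # the squared EWMA, normalized by its steady-state variance lam/(2 - lam).
    stat = z ** 2 * (2 - lam) / lam
    alarms = np.where(stat > 9.0)[0]   # ~3-sigma-equivalent threshold
    print("alarm time points:", alarms[:5])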

  9. Correcting the Count: Improving Vital Statistics Data Regarding Deaths Related to Obesity.

    Science.gov (United States)

    McCleskey, Brandi C; Davis, Gregory G; Dye, Daniel W

    2017-11-15

    Obesity can involve any organ system and compromise the overall health of an individual, including through premature death. Despite the increased risk of death associated with being obese, obesity itself is infrequently indicated on the death certificate. We performed an audit of our records to identify how often "obesity" was listed on the death certificate and to determine how our practices affected national mortality data collection regarding obesity-related mortality. During a span of nearly 25 years, 0.2% of deaths were attributed to or contributed to by obesity. Over the course of 5 years, 96% of selected natural deaths were likely underreported as being associated with obesity. We present an algorithm for certifiers to use to determine whether obesity should be listed on the death certificate and guidelines for certifying cases in which this is appropriate. Use of this algorithm will improve vital statistics concerning the role of obesity in causing or contributing to death. © 2017 American Academy of Forensic Sciences.

  10. Contemporary Management of Acute Aortic Occlusion Has Evolved but Outcomes Have Not Significantly Improved.

    Science.gov (United States)

    Robinson, William P; Patel, Rupal K; Columbo, Jesse A; Flahive, Julie; Aiello, Francesco A; Baril, Donald T; Schanzer, Andres; Messina, Louis M

    2016-07-01

    hospitalization. AAO is now more commonly caused by in situ thrombosis rather than embolism. A high index of suspicion for AAO is required for prompt diagnosis and treatment, particularly when patients present with a profound lower extremity neurologic deficit. In comparison with previous reports, the contemporary management of AAO includes increased use of axillobifemoral bypass and now involves endovascular revascularization, although a variety of open surgical procedures are utilized. However, the in-hospital mortality and morbidity of AAO have not decreased significantly over the last 2 decades, and mid-term survival remains limited. Further study is required to identify strategies that improve outcomes after AAO. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Efficiency improvement of nuclear power plant operation: the significant role of advanced nuclear fuel technologies

    International Nuclear Information System (INIS)

    Van Velde, AA. de; Burtak, F.

    2000-01-01

    In this paper the authors deal with the nuclear fuel cycle and its economic aspects. At Siemens, developments focusing on the reduction of fuel cycle costs are currently directed at further batch-average burnup increase, improvement of fuel reliability, enlargement of fuel operation margins, and improvement of methods for fuel design and core analysis. These items are presented in detail in the full paper and illustrated by the global operating experience of Siemens fuel for both PWRs and BWRs. (authors)

  12. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    Science.gov (United States)

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the intention-to-treat (ITT) principle to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because the human body reacts to drugs through complex biological networks, leading to data that are missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  13. Initiating statistical process control to improve quality outcomes in colorectal surgery.

    Science.gov (United States)

    Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P

    2015-12-01

    Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility for improving outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed: 83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in the elective open (57.1 vs. 20.0%) and elective laparoscopic groups (77.6 vs. 26.1%). Outliers also had higher readmission rates for emergent open (11.4 vs. 5.4%), emergent laparoscopic (14.3 vs. 9.2%), and elective laparoscopic (32.8 vs. 6.9%) cases. Elective open outliers did not follow trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification during quality improvement efforts and reevaluation of outcomes after introducing process change. SPC has clinical implications for improving patient outcomes and resource utilization.
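
    A minimal sketch of the SPC idea applied to LOS: an individuals-type (Shewhart) control chart with the moving-range estimate of sigma flags outlier stays. The LOS values below are invented for illustration.

    import numpy as np

    los_days = np.array([3, 4, 2, 5, 3, 4, 3, 4, 3, 5, 2, 4, 3, 12, 4])
    center = los_days.mean()
    moving_range = np.abs(np.diff(los_days))
    sigma_hat = moving_range.mean() / 1.128      # d2 constant for subgroups of 2
    ucl = center + 3 * sigma_hat                 # upper control limit

    outliers = np.where(los_days > ucl)[0]
    print(f"mean LOS {center:.1f} d, UCL {ucl:.1f} d, outlier case indices: {outliers}")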

  14. Improving Quality in Teaching Statistics Concepts Using Modern Visualization: The Design and Use of the Flash Application on Pocket PCs

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Pei-Yu

    2009-01-01

    The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning, which have improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher-level cognitive abilities such as reasoning, interpretation, and…

  15. Millisecond photo-thermal process on significant improvement of supercapacitor’s performance

    International Nuclear Information System (INIS)

    Wang, Kui; Wang, Jixiao; Wu, Ying; Zhao, Song; Wang, Zhi; Wang, Shichang

    2016-01-01

    Graphical abstract: A highway for charge transfer is created by a millisecond photo-thermal process, which decreases contact resistance among nanomaterials and improves the electrochemical performance. Highlights: • Conductivity among nanomaterials is improved with a millisecond photo-thermal process. • The specific capacitance increases by about 25% with the photo-thermal process. • Cycle stability and rate capability improve by more than 10% with the photo-thermal process. • A new way of creating electron paths to improve electrochemical performance is provided. Abstract: Supercapacitors fabricated with nanomaterials usually have high specific capacitance and excellent performance. However, the small size of nanomaterials considerably limits the contact area among them, which is harmful to charge carrier transfer. This fact may hinder the development and application of nanomaterials in electrochemical storage systems. Here, a millisecond photo-thermal process was introduced to create a charge-carrier transfer path that decreases the contact resistance among nanomaterials and enhances the electrochemical performance of supercapacitors. Polyaniline (PANI) nanowire, as a model nanomaterial, was used to modify electrodes under different photo-thermal process conditions. The modified electrodes were characterized by scanning electron microscopy (SEM), cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS), and the results were analysed by equivalent-circuit simulation. These results demonstrate that the photo-thermal process can alter the morphology of PANI nanowires, lower the charge transfer resistances and thus improve the performance of electrodes. The specific capacitance increase of the modified electrodes is about 25%. The improvements in cycle stability and rate capability are above 10%. To the best of our knowledge, this is the first attempt to investigate the effect of a photo-thermal process on the conductivity

  16. A Review of New and Developing Technology to Significantly Improve Mars Sample-Return Missions

    Science.gov (United States)

    Carsey, F.; Brophy, J.; Gilmore, M.; Rodgers, D.; Wilcox, B.

    2000-07-01

    A JPL development activity was initiated in FY 1999 for the purpose of examining and evaluating technologies that could materially improve future (i.e., beyond the 2005 launch) Mars sample-return missions. The scope of the technology review was comprehensive and end-to-end; the goal was to improve mass, cost, risk, and scientific return. A specific objective was to assess approaches to sample return with only one Earth launch. While the objective of the study was specifically sample return, in situ missions can also benefit from many of the technologies examined.

  17. Implementation of novel statistical procedures and other advanced approaches to improve analysis of CASA data.

    Science.gov (United States)

    Ramón, M; Martínez-Pastor, F

    2018-04-23

    Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the internal heterogeneity of sperm samples and its relevance. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and the most advanced supervised methods based on machine learning. The former has allowed exploration of subpopulation patterns in many species, whereas the latter offers further possibilities, especially for functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although the data provided by CASA systems yield valuable information on sperm samples when clustering analyses are applied, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.
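
    A minimal sketch of the unsupervised (non-hierarchical) clustering route described above: k-means on standardized CASA-style motility descriptors. The descriptor values are synthetic and the two subpopulations are assumed purely for illustration; the library is scikit-learn.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    # Hypothetical per-spermatozoon descriptors: VCL (um/s), ALH (um), LIN (%).
    fast = np.column_stack([rng.normal(150, 15, 300), rng.normal(4.0, 0.5, 300),
                            rng.normal(45, 8, 300)])
    slow = np.column_stack([rng.normal(60, 10, 300), rng.normal(2.0, 0.4, 300),
                            rng.normal(75, 6, 300)])
    X = StandardScaler().fit_transform(np.vstack([fast, slow]))

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print("subpopulation sizes:", np.bincount(labels))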

  18. Improvement of vertical velocity statistics measured by a Doppler lidar through comparison with sonic anemometer observations

    Science.gov (United States)

    Bonin, Timothy A.; Newman, Jennifer F.; Klein, Petra M.; Chilson, Phillip B.; Wharton, Sonia

    2016-12-01

    Since turbulence measurements from Doppler lidars are being increasingly used within wind energy and boundary-layer meteorology, it is important to assess and improve the accuracy of these observations. While turbulent quantities are measured by Doppler lidars in several different ways, the simplest and most frequently used statistic is vertical velocity variance (w'2) from zenith stares. However, the competing effects of signal noise and resolution volume limitations, which respectively increase and decrease w'2, reduce the accuracy of these measurements. Herein, an established method that utilises the autocovariance of the signal to remove noise is evaluated and its skill in correcting for volume-averaging effects in the calculation of w'2 is also assessed. Additionally, this autocovariance technique is further refined by defining the amount of lag time to use for the most accurate estimates of w'2. Through comparison of observations from two Doppler lidars and sonic anemometers on a 300 m tower, the autocovariance technique is shown to generally improve estimates of w'2. After the autocovariance technique is applied, values of w'2 from the Doppler lidars are generally in close agreement (R2 ≈ 0.95 - 0.98) with those calculated from sonic anemometer measurements.
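
    A hedged sketch of the autocovariance idea: uncorrelated instrument noise inflates only the lag-0 autocovariance, so extrapolating the autocovariance from the first few nonzero lags back to lag 0 recovers an estimate of the atmospheric variance. The linear extrapolation and the AR(1) stand-in for vertical velocity are simplifications for illustration, not the paper's exact procedure.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 20000
    # Hypothetical correlated "true" vertical velocity (AR(1)) plus white noise.
    w = np.zeros(n)
    for t in range(1, n):
        w[t] = 0.95 * w[t - 1] + 0.2 * rng.standard_normal()
    noisy = w + 0.3 * rng.standard_normal(n)

    def autocov(x, lag):
        x = x - x.mean()
        return np.mean(x[: len(x) - lag] * x[lag:])

    # Extrapolate the autocovariance at lags 1..3 back to lag 0.
    lags = np.array([1, 2, 3])
    acov = np.array([autocov(noisy, k) for k in lags])
    slope, intercept = np.polyfit(lags, acov, 1)
    print(f"raw variance {noisy.var():.3f}, noise-corrected {intercept:.3f}, "
          f"true {w.var():.3f}")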

  19. Improving alignment in Tract-based spatial statistics: evaluation and optimization of image registration.

    Science.gov (United States)

    de Groot, Marius; Vernooij, Meike W; Klein, Stefan; Ikram, M Arfan; Vos, Frans M; Smith, Stephen M; Niessen, Wiro J; Andersson, Jesper L R

    2013-08-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS establishes spatial correspondence using a combination of nonlinear registration and a "skeleton projection" that may break the topological consistency of the transformed brain images. We therefore investigated the feasibility of replacing the two-stage registration-projection procedure in TBSS with a single, regularized, high-dimensional registration. To optimize registration parameters and to evaluate registration performance in diffusion MRI, we designed an evaluation framework that uses native-space probabilistic tractography for 23 white matter tracts, and quantifies tract similarity across subjects in standard space. We optimized parameters for two registration algorithms on two diffusion datasets of different quality. We investigated the reproducibility of the evaluation framework, and of the optimized registration algorithms. Next, we compared the registration performance of the regularized registration methods and TBSS. Finally, the feasibility and effect of incorporating the improved registration in TBSS were evaluated in an example study. The evaluation framework was highly reproducible for both algorithms (R2 = 0.993 and 0.931). The optimal registration parameters depended on the quality of the dataset in a graded and predictable manner. At optimal parameters, both algorithms outperformed the registration of TBSS, showing the feasibility of adopting such approaches in TBSS. This was further confirmed in the example experiment. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Performance studies of GooFit on GPUs vs RooFit on CPUs while estimating the statistical significance of a new physical signal

    Science.gov (United States)

    Di Florio, Adriano

    2017-10-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on NVIDIA GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performance with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood-ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.

  1. The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body statistics distortions

    OpenAIRE

    Mirosław Mrozkowiak; Hanna Żukowska

    2015-01-01

    Mrozkowiak Mirosław, Żukowska Hanna. Znaczenie Dobrego Krzesła, jako elementu szkolnego i domowego środowiska ucznia, w profilaktyce zaburzeń statyki postawy ciała = The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body statistics distortions. Journal of Education, Health and Sport. 2015;5(7):179-215. ISSN 2391-8306. DOI 10.5281/zenodo.19832 http://ojs.ukw.edu.pl/index.php/johs/article/view/2015%3B5%287%29%3A179-215 https:...

  2. Statistical control chart and neural network classification for improving human fall detection

    KAUST Repository

    Harrou, Fouzi; Zerrouki, Nabil; Sun, Ying; Houacine, Amrane

    2017-01-01

    This paper proposes a statistical approach to detect and classify human falls based on both visual data from a camera and accelerometric data captured by an accelerometer. Specifically, we first use a Shewhart control chart to detect the presence of potential falls from the accelerometric data. Unfortunately, this chart cannot distinguish real falls from fall-like actions, such as lying down. To bypass this difficulty, a neural network classifier is then applied, only on the detected cases, to the visual data. To assess the performance of the proposed method, experiments are conducted on a publicly available fall detection database: the University of Rzeszow's fall detection (URFD) dataset. Results demonstrate that the detection phase plays a key role in reducing the number of sequences used as input into the neural network classifier for classification, significantly reducing the computational burden and achieving better accuracy.

  4. Cranial CT with adaptive statistical iterative reconstruction: improved image quality with concomitant radiation dose reduction.

    Science.gov (United States)

    Rapalino, O; Kamalian, Shervin; Kamalian, Shahmir; Payabvash, S; Souza, L C S; Zhang, D; Mukta, J; Sahani, D V; Lev, M H; Pomerantz, S R

    2012-04-01

    To safeguard patient health, there is great interest in CT radiation-dose reduction. The purpose of this study was to evaluate the impact of an iterative-reconstruction algorithm, ASIR, on image-quality measures in reduced-dose head CT scans for adult patients. Using a 64-section scanner, we analyzed 100 reduced-dose adult head CT scans at 6 predefined levels of ASIR blended with FBP reconstruction. These scans were compared with 50 CT scans previously obtained at a higher routine dose without ASIR reconstruction. SNR and CNR were computed from Hounsfield unit measurements of normal GM and WM of brain parenchyma. A blinded qualitative analysis was performed in 10 lower-dose CT datasets compared with higher-dose ones without ASIR. Phantom data analysis was also performed. Lower-dose scans without ASIR had significantly lower mean GM and WM SNR (P = .003) and similar GM-WM CNR values compared with higher routine-dose scans. However, at ASIR levels of 20%-40%, there was no statistically significant difference in SNR, and at ASIR levels of ≥60%, the SNR values of the reduced-dose scans were significantly higher (P < .05). CNR values were also significantly higher at ASIR levels of ≥40% (P < .05), and qualitative image quality was similar or better at ASIR levels ≥60% (P < .05). The use of ASIR in adult head CT scans reduces image noise and increases low-contrast resolution, while allowing lower radiation doses without affecting spatial resolution.

  5. Improved statistical models for limited datasets in uncertainty quantification using stochastic collocation

    Energy Technology Data Exchange (ETDEWEB)

    Alwan, Aravind; Aluru, N.R.

    2013-12-15

    This paper presents a data-driven framework for performing uncertainty quantification (UQ) by choosing a stochastic model that accurately describes the sources of uncertainty in a system. This model is propagated through an appropriate response surface function that approximates the behavior of this system using stochastic collocation. Given a sample of data describing the uncertainty in the inputs, our goal is to estimate a probability density function (PDF) using the kernel moment matching (KMM) method so that this PDF can be used to accurately reproduce statistics like mean and variance of the response surface function. Instead of constraining the PDF to be optimal for a particular response function, we show that we can use the properties of stochastic collocation to make the estimated PDF optimal for a wide variety of response functions. We contrast this method with other traditional procedures that rely on the Maximum Likelihood approach, like kernel density estimation (KDE) and its adaptive modification (AKDE). We argue that this modified KMM method tries to preserve what is known from the given data and is the better approach when the available data is limited in quantity. We test the performance of these methods for both univariate and multivariate density estimation by sampling random datasets from known PDFs and then measuring the accuracy of the estimated PDFs, using the known PDF as a reference. Comparing the output mean and variance estimated with the empirical moments using the raw data sample as well as the actual moments using the known PDF, we show that the KMM method performs better than KDE and AKDE in predicting these moments with greater accuracy. This improvement in accuracy is also demonstrated for the case of UQ in electrostatic and electrothermomechanical microactuators. We show how our framework results in the accurate computation of statistics in micromechanical systems.

  6. Improving the Statistical Modeling of the TRMM Extreme Precipitation Monitoring System

    Science.gov (United States)

    Demirdjian, L.; Zhou, Y.; Huffman, G. J.

    2016-12-01

    This project improves upon an existing extreme precipitation monitoring system based on the Tropical Rainfall Measuring Mission (TRMM) daily product (3B42) using new statistical models. The proposed system utilizes a regional modeling approach, where data from similar grid locations are pooled to increase the quality and stability of the resulting model parameter estimates, to compensate for the short data record. The regional frequency analysis is divided into two stages. In the first stage, the region defined by the TRMM measurements is partitioned into approximately 27,000 non-overlapping clusters using a recursive k-means clustering scheme. In the second stage, a statistical model is used to characterize the extreme precipitation events occurring in each cluster. Instead of utilizing the block-maxima approach used in the existing system, where annual maxima are fit to the Generalized Extreme Value (GEV) probability distribution at each cluster separately, the present work adopts the peaks-over-threshold (POT) method of classifying points as extreme if they exceed a pre-specified threshold. Theoretical considerations motivate the use of the Generalized Pareto (GP) distribution for fitting threshold exceedances. The fitted parameters can be used to construct simple and intuitive average recurrence interval (ARI) maps which reveal how rare a particular precipitation event is given its spatial location. The new methodology eliminates much of the random noise that was produced by the existing models due to a short data record, producing more reasonable ARI maps when compared with NOAA's long-term Climate Prediction Center (CPC) ground-based observations. The resulting ARI maps can be useful for disaster preparation, warning, and management, as well as increased public awareness of the severity of precipitation events. Furthermore, the proposed methodology can be applied to various other extreme climate records.
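
    A minimal sketch of the POT step for one cluster: exceedances over a high threshold are fit with a Generalized Pareto distribution (scipy.stats.genpareto), and the fit is inverted into an average recurrence interval. The synthetic daily record and the 98th-percentile threshold are illustrative choices, not the monitoring system's settings.

    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(4)
    daily_precip = rng.gamma(shape=0.4, scale=6.0, size=17 * 365)  # toy 17-year record (mm/day)

    threshold = np.quantile(daily_precip, 0.98)
    exceedances = daily_precip[daily_precip > threshold] - threshold
    shape, loc, scale = genpareto.fit(exceedances, floc=0.0)

    # ARI (in years) of a 100 mm/day event under the fitted tail model.
    event = 100.0
    p_exceed_day = (1 - 0.98) * genpareto.sf(event - threshold, shape,
                                             loc=0.0, scale=scale)
    print(f"ARI of {event} mm/day: {1.0 / (p_exceed_day * 365.25):.0f} years")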

  8. Reducing dysfunctional beliefs about sleep does not significantly improve insomnia in cognitive behavioral therapy.

    Science.gov (United States)

    Okajima, Isa; Nakajima, Shun; Ochi, Moeko; Inoue, Yuichi

    2014-01-01

    The present study examined whether improvement of insomnia is mediated by a reduction in sleep-related dysfunctional beliefs through cognitive behavioral therapy for insomnia. In total, 64 patients with chronic insomnia received cognitive behavioral therapy for insomnia consisting of 6 biweekly individual treatment sessions of 50 minutes in length. Participants were asked to complete the Athens Insomnia Scale and the Dysfunctional Beliefs and Attitudes about Sleep scale both at baseline and at the end of treatment. The results showed that although cognitive behavioral therapy for insomnia greatly reduced individuals' scores on both scales, the decrease in dysfunctional beliefs and attitudes about sleep with treatment did not seem to mediate the improvement in insomnia. The findings suggest that sleep-related dysfunctional beliefs endorsed by patients with chronic insomnia may be attenuated by cognitive behavioral therapy for insomnia, but changes in such beliefs are not likely to play a crucial role in reducing the severity of insomnia.

  9. Reducing Dysfunctional Beliefs about Sleep Does Not Significantly Improve Insomnia in Cognitive Behavioral Therapy

    OpenAIRE

    Okajima, Isa; Nakajima, Shun; Ochi, Moeko; Inoue, Yuichi

    2014-01-01

    The present study examined whether improvement of insomnia is mediated by a reduction in sleep-related dysfunctional beliefs through cognitive behavioral therapy for insomnia. In total, 64 patients with chronic insomnia received cognitive behavioral therapy for insomnia consisting of 6 biweekly individual treatment sessions of 50 minutes in length. Participants were asked to complete the Athens Insomnia Scale and the Dysfunctional Beliefs and Attitudes about Sleep scale both at the...

  10. Significant performance improvement obtained in a wireless mesh network using a beamswitching antenna

    CSIR Research Space (South Africa)

    Lysko, AA

    2012-09-01

    …mesh network operated in a fixed 11 Mbps mode. The throughput improvement in multi-hop communication obtained in the presence of an interferer is tenfold, from 0.2 Mbps to 2 Mbps. Index terms: antenna, smart antenna, wireless mesh network, WMN… efficiency in the communications, and active research and development of new methods and technologies enabling this at the physical layer, including multiple antenna techniques, such as multiple input multiple output (MIMO) and smart antennas…

  11. Application Exercises Improve Transfer of Statistical Knowledge in Real-World Situations

    Science.gov (United States)

    Daniel, Frances; Braasch, Jason L. G.

    2013-01-01

    The present research investigated whether real-world application exercises promoted students' abilities to spontaneously transfer statistical knowledge and to recognize the use of statistics in real-world contexts. Over the course of a semester of psychological statistics, two classes completed multiple application exercises designed to mimic…

  12. Significant Improvements in Pyranometer Nighttime Offsets Using High-Flow DC Ventilation

    Energy Technology Data Exchange (ETDEWEB)

    Kutchenreiter, Mark; Michalski, J.J.; Long, C.N.; Habte, Aron

    2017-05-22

    Accurate solar radiation measurements using pyranometers are required to understand radiative impacts on the Earth's energy budget and solar energy production, and to validate radiative transfer models. Ventilators on pyranometers, which are used to keep the domes clean and dry, also affect instrument thermal-offset accuracy. This poster presents a high-level overview of ventilators for single-black-detector pyranometers and black-and-white pyranometers. For single-black-detector pyranometers with ventilators, high-flow-rate (50-CFM and higher), 12-V DC fans lower the offsets, lower the scatter, and improve the predictability of nighttime offsets compared to lower-flow-rate (35-CFM), 120-V AC fans operated in the same type of environmental setup. Black-and-white pyranometers, which are used to measure diffuse horizontal irradiance, sometimes show minor improvement with DC-fan ventilation, but their offsets are always small, usually no more than 1 W/m2, whether AC- or DC-ventilated.

  13. Nitrite addition to acidified sludge significantly improves digestibility, toxic metal removal, dewaterability and pathogen reduction

    Science.gov (United States)

    Du, Fangzhou; Keller, Jürg; Yuan, Zhiguo; Batstone, Damien J.; Freguia, Stefano; Pikaar, Ilje

    2016-12-01

    Sludge management is a major issue for water utilities globally. Poor digestibility and dewaterability are the main factors determining the cost of sludge management, whereas pathogen and toxic metal concentrations limit beneficial reuse. In this study, the effects of low-level nitrite addition to acidified sludge to simultaneously enhance digestibility, toxic metal removal, dewaterability and pathogen reduction were investigated. Waste activated sludge (WAS) from a full-scale wastewater treatment plant was treated at pH 2 with 10 mg NO2(-)-N/L for 5 h. Biochemical methane potential tests showed an increase in methane production of 28%, corresponding to an improvement from 247 ± 8 L CH4/kg VS to 317 ± 1 L CH4/kg VS. The enhanced removal of toxic metals further increased the methane production by another 18%, to 360 ± 6 L CH4/kg VS (a total increase of 46%). The solids content of dewatered sludge increased from 14.6 ± 1.4% in the control to 18.2 ± 0.8%. A 4-log reduction in both total coliforms and E. coli was achieved. Overall, this study highlights the potential of acidification with low-level nitrite addition as an effective and simple method for achieving multiple improvements in sludge management.

  14. Efficiency improvement of nuclear power plant operation: the significant role of advanced nuclear fuel technologies

    International Nuclear Information System (INIS)

    Velde Van de, A.; Burtak, F.

    2001-01-01

    Due to the increased liberalisation of the power markets, nuclear power generation is exposed to high cost-reduction pressure. In this paper we highlight the role of advanced nuclear fuel technologies in reducing fuel cycle costs and therefore increasing the efficiency of nuclear power plant operation. The key factor is a more efficient utilisation of the fuel, and present developments at Siemens are consequently directed at (i) further increase of batch-average burnup, (ii) improvement of fuel reliability, (iii) enlargement of fuel operation margins and (iv) improvement of methods for fuel design and core analysis. As a result, the nuclear fuel cycle costs for a typical LWR have been reduced during the past decades by about US$ 35 million per year. The estimated impact of further burnup increases on the fuel cycle costs is expected to be an additional saving of US$ 10-15 million per year. Because the fuel will operate closer to design limits, a careful approach is required when introducing advanced fuel features in reload quantities. Trust and co-operation between the fuel vendors and the utilities are a prerequisite for common success. (authors)

  15. Significant improvement of mouse cloning technique by treatment with trichostatin A after somatic nuclear transfer

    International Nuclear Information System (INIS)

    Kishigami, Satoshi; Mizutani, Eiji; Ohta, Hiroshi; Hikichi, Takafusa; Thuan, Nguyen Van; Wakayama, Sayaka; Bui, Hong-Thuy; Wakayama, Teruhiko

    2006-01-01

    The low success rate of animal cloning by somatic cell nuclear transfer (SCNT) is believed to be associated with epigenetic errors, including abnormal DNA hypermethylation. Recently, we elucidated, by using round spermatids, that after nuclear transfer, treatment of zygotes with trichostatin A (TSA), an inhibitor of histone deacetylase, can remarkably reduce abnormal DNA hypermethylation depending on the origins of the transferred nuclei and their genomic regions [S. Kishigami, N. Van Thuan, T. Hikichi, H. Ohta, S. Wakayama, E. Mizutani, T. Wakayama, Epigenetic abnormalities of the mouse paternal zygotic genome associated with microinsemination of round spermatids, Dev. Biol. (2005) in press]. Here, we found that 5-50 nM TSA treatment for 10 h following oocyte activation improved the in vitro development of somatic cloned embryos to the blastocyst stage 2- to 5-fold, depending on the donor cells, which included tail tip cells, spleen cells, neural stem cells, and cumulus cells. This TSA treatment also led to a more than 5-fold increase in the success rate of mouse cloning from cumulus cells without obvious abnormality, but failed to improve ES cell cloning success. Further, we succeeded in establishing nuclear transfer-embryonic stem (NT-ES) cells from TSA-treated cloned blastocysts at a rate three times higher than that from untreated cloned blastocysts. Thus, our data indicate that TSA treatment after SCNT in mice can dramatically improve the practical application of current cloning techniques.

  16. The Improvement of Screening the Significant Factors of Oil Blends as Bio lubricant Base Stock

    International Nuclear Information System (INIS)

    Noor Hajarul Ashikin Shamsuddin; Rozaini Abdullah; Zainab Hamzah; Siti Jamilah Hanim Mohd Yusof

    2015-01-01

    A new bio-lubricant base stock formulation was developed by blending waste cooking oil (WCO) with Jatropha curcas oil (JCO). The objective of this research was to evaluate the significant factors contributing to the production of oil blends for bio-lubricant applications. The factors studied were oil ratio (WCO:JCO), agitation time (min) and agitation speed (rpm). The blended bio-based lubricant oil was characterized by its saponification, acid, peroxide and iodine values. The experimental design used in this study was a 2-level factorial design. It was found that the oil ratio, and the interaction of oil ratio and agitation speed, had the most significant effects on the oil blends as bio-lubricant base stock. The oil blend ratio of 80%:20% WCO:JCO, with a low agitation speed of 300 rpm and a low agitation time of 30 minutes, gave the optimum results. The acid, saponification, peroxide and iodine values obtained were 0.517 ± 0.08 mg KOH/g, 126.23 ± 1.62 mg/g, 7.5 ± 2.0 meq/kg and 50.42 ± 2.85 mg/g, respectively. A higher ratio of waste cooking oil in the blends was found to be favourable for a bio-lubricant base stock. (author)

  17. EASE-Grid 2.0: Incremental but Significant Improvements for Earth-Gridded Data Sets

    Directory of Open Access Journals (Sweden)

    Matthew H. Savoie

    2012-03-01

    Defined in the early 1990s for use with gridded satellite passive microwave data, the Equal-Area Scalable Earth Grid (EASE-Grid) was quickly adopted and used for distribution of a variety of satellite and in situ data sets. Conceptually easy to understand, EASE-Grid suffers from limitations that make it impossible to format in the widely popular GeoTIFF convention without reprojection. Importing EASE-Grid data into standard mapping software packages is nontrivial and error-prone. This article defines a standard for an improved EASE-Grid 2.0 definition, addressing how the changes rectify issues with the original grid definition. Data distributed using the EASE-Grid 2.0 standard will be easier for users to import into standard software packages and will minimize the common reprojection errors that users had encountered with the original EASE-Grid definition.

  18. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    Science.gov (United States)

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. Copyright © 2013 Wiley Periodicals, Inc.
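    As a rough illustration of the censoring step described above, the sketch below computes a framewise-displacement-like score from realignment parameters, drops high-motion volumes, and fits the GLM on the remainder; all arrays are synthetic stand-ins, and the FD formula and threshold are simplified assumptions rather than the paper's exact pipeline.

```python
# Sketch of motion censoring before GLM estimation: volumes whose framewise
# displacement (FD) exceeds a threshold are withheld from the least-squares
# fit. Arrays here are synthetic stand-ins, not the paper's data or code.
import numpy as np

rng = np.random.default_rng(0)
n_vols = 200
motion = rng.normal(0, 0.1, size=(n_vols, 6))      # 6 realignment params
X = rng.normal(size=(n_vols, 2))                    # task regressors
X = np.column_stack([X, np.ones(n_vols)])           # add intercept
y = X @ np.array([1.0, -0.5, 0.2]) + rng.normal(0, 1, n_vols)

# FD: sum of absolute backward differences of the six parameters
# (rotations assumed already converted to mm).
fd = np.abs(np.diff(motion, axis=0)).sum(axis=1)
fd = np.concatenate([[0.0], fd])                    # first volume: FD = 0

keep = fd < 0.5                                     # censoring threshold (mm)
beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
print(f"kept {keep.sum()}/{n_vols} volumes; betas = {beta.round(2)}")
```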

  19. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures for evaluating the statistical significance of local trend scores have limited its application to high-throughput time series data, e.g., data from next-generation sequencing-based studies. By extending the theory for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach that integrates the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data, finding interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
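    To make the computational burden concrete, here is a deliberately simplified sketch of a permutation-based p-value for a local trend score, the slow step that the paper's Markov-chain approximation replaces; the score definition (Kadane's maximum-subarray sum over products of trend signs) is a simplification of the eLSA statistic, not its exact implementation.

```python
# Simplified permutation scheme for a local trend score, sketched after the
# approach the abstract describes (the real eLSA implementation differs in
# details such as delays and normalization).
import numpy as np

def trend_signs(series):
    """Map a time series to +1/0/-1 according to step-to-step change."""
    return np.sign(np.diff(series))

def local_trend_score(x, y):
    """Best contiguous run of co-trending (Kadane's maximum subarray)."""
    prod = trend_signs(x) * trend_signs(y)
    best = cur = 0.0
    for v in prod:
        cur = max(0.0, cur + v)
        best = max(best, cur)
    return best

def perm_pvalue(x, y, n_perm=10_000, seed=1):
    """Empirical p-value by shuffling one series; O(n_perm) score evals."""
    rng = np.random.default_rng(seed)
    obs = local_trend_score(x, y)
    hits = sum(local_trend_score(rng.permutation(x), y) >= obs
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)
```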

  20. A patient/family-centered strategic plan can drive significant improvement.

    Science.gov (United States)

    Brilli, Richard J; Crandall, Wallace V; Berry, Janet C; Stoverock, Linda; Rosen, Kerry; Budin, Lee; Kelleher, Kelly J; Gleeson, Sean P; Davis, J Terrance

    2014-08-01

    The use of a patient/family-centered strategic plan (PFCSP), as a road map to operationalize the hospital's vision, has been a compelling paradigm for achieving significant QI results. The framework is simple yet aligns directly with the IOM domains of quality. It has inspired and helped actively engage hospital personnel in the work required to achieve the goals and vision of the hospital system. Five years after initiating this type of plan, activity is flourishing in each of the domains and midterm results are substantial. We think that the nature of this strategic plan has been an important aspect of our success to date.

  1. An integrated PRA module for fast determination of risk significance and improvement effectiveness

    International Nuclear Information System (INIS)

    Chao, Chun-Chang; Lin, Jyh-Der

    2004-01-01

    With the widespread use of PRA technology in risk-informed applications, predicting changes in CDF and LERF has become a standard process for risk-informed applications. This paper describes an integrated PRA module prepared for risk-informed applications. The module contains a super risk engine, a super fault tree engine, an advanced PRA model and a tool for database maintenance. The individual elements of the module also work well for purposes other than risk-informed applications. The module has been verified and validated through a series of scrupulous benchmark tests against similar software. The results of the benchmark tests showed that the module has remarkable accuracy and speed, even for an extremely large top-logic fault tree and for cases in which a large number of minimal cut sets (MCSs) may be generated. The risk monitor for nuclear power plants in Taiwan is the first application to adopt the module. The results predicted by the risk monitor are now accepted by the regulatory agency. A tool to determine risk significance according to inspection findings will be the next application to adopt the module. This tool classifies risk significance into four color codes according to the level of increase in CDF. Application experience showed that the flexibility, accuracy and speed of the module make it useful in any risk-informed application in which risk indexes must be determined by solving a PRA model. (author)
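    The abstract does not state the tool's CDF cut-offs, so the sketch below borrows the four-color thresholds conventional in the US NRC significance determination process as an assumed stand-in for the classification step it describes.

```python
# Hedged sketch of the color-coding step the abstract mentions: mapping an
# increase in core damage frequency (delta-CDF) to four significance colors.
# The thresholds below follow the NRC significance determination process
# convention and are an assumption; the paper does not state its cut-offs.
def risk_color(delta_cdf_per_year: float) -> str:
    if delta_cdf_per_year < 1e-6:
        return "GREEN"    # very low safety significance
    if delta_cdf_per_year < 1e-5:
        return "WHITE"    # low to moderate significance
    if delta_cdf_per_year < 1e-4:
        return "YELLOW"   # substantial significance
    return "RED"          # high safety significance

print(risk_color(3e-6))   # -> WHITE
```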

  2. Inulin significantly improves serum magnesium levels in proton pump inhibitor-induced hypomagnesaemia.

    Science.gov (United States)

    Hess, M W; de Baaij, J H F; Broekman, M; Bisseling, T M; Haarhuis, B; Tan, A; Te Morsche, R; Hoenderop, J G J; Bindels, R J M; Drenth, J P H

    2016-06-01

    Proton pump inhibitors (PPI) are among the most widely prescribed drugs to treat gastric acid-related disorders. PPI-induced hypomagnesaemia, a defect in intestinal absorption of Mg²⁺, can be a severe side effect of chronic PPI use. To restore serum Mg²⁺ concentrations in PPI-induced hypomagnesaemia patients by dietary supplementation with inulin fibres. Eleven patients with PPI-induced hypomagnesaemia and 10 controls were treated with inulin (20 g/day). Each trial consisted of two cycles of 14-day inulin treatment followed by a washout period of 14 days. Patients continued to use their PPI. Serum Mg²⁺ levels served as the primary endpoint. Inulin significantly enhanced serum Mg²⁺ levels from 0.60 to 0.68 mmol/L in PPI-induced hypomagnesaemia patients, and from 0.84 to 0.93 mmol/L in controls. As a consequence, 24 h urinary Mg²⁺ excretion was significantly increased in patients with PPI-induced hypomagnesaemia (0.3-2.2 mmol/day). Symptoms related to hypomagnesaemia, including muscle cramps and paraesthesia, were reduced during intervention with inulin. Inulin increases serum Mg²⁺ concentrations under PPI maintenance in patients with PPI-induced hypomagnesaemia. © 2016 John Wiley & Sons Ltd.

  3. Progressive degradation of alloy 690 and the development of a significant improvement in alloy 800CR

    International Nuclear Information System (INIS)

    Staehle, Roger W.; Arioka, Koji; Tapping, Robert

    2015-01-01

    The most widely used alloys at present for tubing in steam generators and for structural materials in water-cooled reactors are Alloy 690 and Alloy 800. However, both alloys, while improved over Alloy 600, may not meet the needs of longer-range applications in the range of 80-100 years. Alloy 690 sustains damage resulting from the formation of cavities at grain boundaries, which eventually cover about 50% of the grain boundary area, with the remainder covered by carbides. The cavities seem to nucleate on the carbides, leaving the grain boundaries a structure of cavities and carbides. Such a structure will eventually lead Alloy 690 to fail completely. Normal Alloy 800 does not develop such cavities and probably retains much of its corrosion resistance, but it does sustain progressive SCC at a low rate. A new alloy, 800CR, has been developed in a collaboration among Arioka, Tapping, and Staehle. This alloy is based on a Cr content of 23.5-27%, with the remainder retaining the previous Alloy 800 composition. 800CR sustains a crack velocity about 100 times lower than Alloy 690 and a negligible rate of initiation. A patent is now being sought for the 800CR alloy. (authors)

  4. Significant improvements of electrical discharge machining performance by step-by-step updated adaptive control laws

    Science.gov (United States)

    Zhou, Ming; Wu, Jianyang; Xu, Xiaoyi; Mu, Xin; Dou, Yunping

    2018-02-01

    In order to obtain improved electrical discharge machining (EDM) performance, we have dedicated more than a decade to correcting one essential EDM defect, the weak stability of machining, by developing adaptive control systems. The instabilities of machining are mainly caused by complicated disturbances in discharging. To counteract the effects of these disturbances on machining, we theoretically developed three control laws: a minimum variance (MV) control law, then a coupled minimum variance and pole placement (MVPPC) control law, and finally a two-step-ahead prediction (TP) control law. Based on real-time estimation of the EDM process model parameters and the measured ratio of arcing pulses (also called the gap state), the electrode discharging cycle was directly and adaptively tuned so that stable machining could be achieved. We thus not only theoretically provide three proven control laws for the developed EDM adaptive control system, but also show in practice that the TP control law deals best with machining instability and machining efficiency, even though the MVPPC control law provided much better EDM performance than the MV control law. The TP control law also provided burn-free machining.

  5. Analysis of data collected from right and left limbs: Accounting for dependence and improving statistical efficiency in musculoskeletal research.

    Science.gov (United States)

    Stewart, Sarah; Pearson, Janet; Rome, Keith; Dalbeth, Nicola; Vandal, Alain C

    2018-01-01

    Statistical techniques currently used in musculoskeletal research often inefficiently account for paired-limb measurements or for the relationship between measurements taken from multiple regions within limbs. This study compared three commonly used analysis methods with a mixed-models approach that appropriately accounted for the association between limbs, regions, and trials and that utilised all information available from repeated trials. Four analysis methods were applied to an existing data set containing plantar pressure data, which was collected for seven masked regions on right and left feet, over three trials, across three participant groups. Methods 1-3 averaged data over trials and analysed right foot data (Method 1), data from a randomly selected foot (Method 2), and averaged right and left foot data (Method 3). Method 4 used all available data in a mixed-effects regression that accounted for the repeated measures taken for each foot, foot region and trial. Confidence interval widths for the mean differences between groups for each foot region were used as the criterion for comparing statistical efficiency. Mean differences in pressure between groups were similar across methods for each foot region, while the confidence interval widths were consistently smaller for Method 4. Method 4 also revealed significant between-group differences that were not detected by Methods 1-3. A mixed-effects linear model approach generates improved efficiency and power by producing more precise estimates compared to alternative approaches that discard information in the process of accounting for paired-limb measurements. This approach is recommended for generating more clinically sound and statistically efficient research outputs. Copyright © 2017 Elsevier B.V. All rights reserved.
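    A minimal sketch of the Method 4 idea, assuming long-format data and hypothetical file and column names, could use statsmodels' MixedLM with a random intercept per subject and a foot-within-subject variance component, so that both limbs and all trials contribute to the fit.

```python
# Minimal sketch of the Method-4 idea using statsmodels' MixedLM: random
# intercepts for subject, with foot-within-subject as a variance component,
# so both limbs and all trials contribute. File and column names are
# hypothetical, not from the study.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per subject x foot x region x trial.
df = pd.read_csv("plantar_pressure_long.csv")   # hypothetical file

model = smf.mixedlm(
    "pressure ~ C(group) * C(region)",    # fixed effects
    data=df,
    groups=df["subject"],                 # random intercept per subject
    vc_formula={"foot": "0 + C(foot)"},   # foot-within-subject component
)
result = model.fit()
print(result.summary())
```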

  6. Significant improvement of pig cloning efficiency by treatment with LBH589 after somatic cell nuclear transfer.

    Science.gov (United States)

    Jin, Jun-Xue; Li, Suo; Gao, Qing-Shan; Hong, Yu; Jin, Long; Zhu, Hai-Ying; Yan, Chang-Guo; Kang, Jin-Dan; Yin, Xi-Jun

    2013-10-01

    The low success rate of animal cloning by somatic cell nuclear transfer (SCNT) is associated with epigenetic aberrancy, including abnormal acetylation of histones. Altering the epigenetic status with histone deacetylase inhibitors (HDACi) enhances the developmental potential of SCNT embryos. In the current study, we examined the effects of LBH589 (panobinostat), a novel broad-spectrum HDACi, on the nuclear reprogramming and development of pig SCNT embryos in vitro. In experiment 1, we compared the in vitro developmental competence of nuclear transfer embryos treated with different concentrations of LBH589. Embryos treated with 50 nM LBH589 for 24 hours showed a significant increase in the rate of blastocyst formation compared with the control or embryos treated with 5 or 500 nM LBH589 (32.4% vs. 11.8%, 12.1%, and 10.0%, respectively, P < 0.05). In experiment 2, we examined the in vitro developmental competence of nuclear transfer embryos treated with 50 nM LBH589 for various intervals after activation and 6-dimethylaminopurine treatment. Embryos treated for 24 hours had higher rates of blastocyst formation than the other groups. In experiment 3, when the acetylation of H4K12 was examined by immunohistochemistry in SCNT embryos treated for 6 hours with 50 nM LBH589, the staining intensities in LBH589-treated SCNT embryos were significantly higher than in the control. In experiment 4, LBH589-treated nuclear transfer and control embryos were transferred into surrogate mothers, resulting in three (100%) and two (66.7%) pregnancies, respectively. In conclusion, LBH589 enhances the nuclear reprogramming and developmental potential of SCNT embryos by altering the epigenetic status and expression and by increasing blastocyst quality. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.

  7. Statistical Redundancy Testing for Improved Gene Selection in Cancer Classification Using Microarray Data

    Directory of Open Access Journals (Sweden)

    J. Sunil Rao

    2007-01-01

    Full Text Available In gene selection for cancer classification using microarray data, we define an eigenvalue-ratio statistic to measure a gene's contribution to the joint discriminability when this gene is included in a set of genes. Based on this eigenvalue-ratio statistic, we define a novel hypothesis test for gene statistical redundancy and propose two gene selection methods. Simulation studies illustrate the agreement between the statistical redundancy test and the gene selection methods. Real data examples show that the proposed gene selection methods can select a compact gene subset which can not only be used to build high-quality cancer classifiers but also shows biological relevance.

  8. Silver chlorobromide nanocubes with significantly improved uniformity: synthesis and assembly into photonic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Li, Zheng; Okasinski, John S.; Gosztola, David J.; Ren, Yang; Sun, Yugang

    2015-01-01

    Silver chlorobromide (AgClxBr1-x, 0 < x < 1) nanocubes with a highly uniform size, morphology, and crystallinity have been successfully synthesized through co-precipitation of Ag+ ions with both Cl- and Br- ions in ethylene glycol containing polyvinyl pyrrolidone at mild temperatures. The composition of the synthesized nanocubes can be easily tuned by controlling the molar ratio of Cl- to Br- ions in the reaction solution. The size of the nanocubes is determined by varying a number of parameters, including the molar ratio of Cl- to Br- ions, the injection rate of Ag+ ions, and the reaction temperature. The real-time formation of colloidal AgClxBr1-x nanocubes has been monitored, for the first time, by in situ high-energy synchrotron X-ray diffraction. The time-resolved results reveal that a fast injection rate of Ag+ ions is critical for the formation of AgClxBr1-x nanocubes with a highly pure face-centered cubic crystalline phase. The improved uniformity of the AgClxBr1-x nanocubes is beneficial for assembling them into ordered superlattices (e.g., photonic crystals), even by simply applying centrifugal forces. The stop band of the resulting photonic crystals can be easily tuned from the ultraviolet to the infrared region by using AgClxBr1-x nanocubes of different sizes. The variation of the dielectric constant of AgClxBr1-x with the relative concentration of halide ions provides an additional knob for tuning the optical properties of the photonic crystals.

  9. Arthroscopic Debridement for Primary Degenerative Osteoarthritis of the Elbow Leads to Significant Improvement in Range of Motion and Clinical Outcomes: A Systematic Review.

    Science.gov (United States)

    Sochacki, Kyle R; Jack, Robert A; Hirase, Takashi; McCulloch, Patrick C; Lintner, David M; Liberman, Shari R; Harris, Joshua D

    2017-12-01

    The purpose of this investigation was to determine whether arthroscopic debridement of primary elbow osteoarthritis results in statistically significant and clinically relevant improvement in (1) elbow range of motion and (2) clinical outcomes with (3) low complication and reoperation rates. A systematic review was registered with PROSPERO and performed using PRISMA guidelines. Databases were searched for studies that investigated the outcomes of arthroscopic debridement for the treatment of primary osteoarthritis of the elbow in adult human patients. Study methodological quality was analyzed. Studies that included post-traumatic arthritis were excluded. Elbow motion and all elbow-specific patient-reported outcome scores were eligible for analysis. Comparisons between preoperative and postoperative values from each study were made using 2-sample Z-tests (http://in-silico.net/tools/statistics/ztest), with a P value < .05 considered significant. Arthroscopic debridement of primary elbow osteoarthritis results in statistically significant and clinically relevant improvement in elbow range of motion and clinical outcomes with low complication and reoperation rates. Level of evidence: systematic review of Level IV studies. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
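    For readers unfamiliar with the test the review relied on, a minimal two-sample Z-test on pooled means looks like the sketch below; the range-of-motion numbers are placeholders, not values extracted from the review.

```python
# Sketch of the two-sample Z-test the review applied to pooled pre- versus
# post-operative means; the numbers below are placeholders, not study data.
import math
from scipy.stats import norm

def two_sample_z(mean1, sd1, n1, mean2, sd2, n2):
    """Z statistic and two-sided p-value for two independent means."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    z = (mean1 - mean2) / se
    p = 2 * norm.sf(abs(z))
    return z, p

# e.g., flexion arc after vs. before debridement (hypothetical values)
z, p = two_sample_z(mean1=118.0, sd1=12.0, n1=80,
                    mean2=104.0, sd2=15.0, n2=80)
print(f"z = {z:.2f}, p = {p:.4f}")
```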

  10. The Strasbourg Large Refractor and Dome: Significant Improvements and Failed Attempts

    Science.gov (United States)

    Heck, Andre

    2009-01-01

    Founded by the German Empire in the late 19th century, Strasbourg Astronomical Observatory featured several novelties from the start. According to Mueller (1978), the separation of observing buildings from the study area and from the astronomers' residence was a revolution in observatory construction. The instruments were, as much as possible, isolated from the vibrations of the buildings themselves. "Gas flames" and water were used to reduce temperature effects. Thus the Large Dome (ca 11 m diameter), housing the Large Refractor (ca 49 cm, then the largest in Germany) and covered by zinc over wood, could be cooled by water running from the top. Reports (including by the French, who took over the observatory after World War I) are, however, essentially nonexistent on the effective usage and actual efficiency of such a system (which must have generated a significant amount of humidity locally). The paper details these technical attempts as well as the specificities of the instruments installed in this new observatory, intended as a showcase of German astronomy.

  11. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    International Nuclear Information System (INIS)

    Dordevic, N.; Wehrens, R.; Postma, G.J.; Buydens, L.M.C.; Camin, F.

    2012-01-01

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000–2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to the food legislation, wines can be without geographical origin (table wine) and wines with origin. Wines with origin must have characteristics which are essential due to its region of production and must be produced, processed and prepared, exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000–2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to get better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.

  12. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    Energy Technology Data Exchange (ETDEWEB)

    Dordevic, N.; Wehrens, R. [IASMA Research and Innovation Centre, Fondazione Edmund Mach, via Mach 1, 38010 San Michele all' Adige (Italy); Postma, G.J.; Buydens, L.M.C. [Radboud University Nijmegen, Institute for Molecules and Materials, Analytical Chemistry, P.O. Box 9010, 6500 GL Nijmegen (Netherlands); Camin, F., E-mail: federica.camin@fmach.it [IASMA Research and Innovation Centre, Fondazione Edmund Mach, via Mach 1, 38010 San Michele all' Adige (Italy)

    2012-12-13

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000-2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to the food legislation, wines can be without geographical origin (table wine) and wines with origin. Wines with origin must have characteristics which are essential due to its region of production and must be produced, processed and prepared, exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000-2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to get better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.
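    A toy sketch of the univariate-versus-multivariate contrast both records describe follows, using synthetic stand-ins for the three isotope ratios (the 5220-sample data set is not reproduced here) and scikit-learn's linear discriminant analysis in the role of the multivariate method.

```python
# Sketch of why joint (multivariate) use of the three isotope ratios can
# discriminate origins better than any single ratio. Data are synthetic
# stand-ins, not the Italian wine data set.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
# Two "regions", three isotope ratios (D/H, 13C/12C, 18O/16O) per sample.
n = 300
region_a = rng.multivariate_normal([0.0, 0.0, 0.0], np.eye(3), n)
region_b = rng.multivariate_normal([0.4, 0.4, 0.4], np.eye(3), n)
X = np.vstack([region_a, region_b])
y = np.array([0] * n + [1] * n)

lda = LinearDiscriminantAnalysis()
# Multivariate: all three ratios jointly.
print("3-variable LDA accuracy:",
      round(float(cross_val_score(lda, X, y, cv=5).mean()), 3))
# Univariate baseline: the best single ratio on its own.
print("best single-ratio accuracy:",
      round(max(float(cross_val_score(lda, X[:, [j]], y, cv=5).mean())
                for j in range(3)), 3))
```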

  13. Single-atom catalysts for CO2 electroreduction with significant activity and selectivity improvements.

    Science.gov (United States)

    Back, Seoin; Lim, Juhyung; Kim, Na-Young; Kim, Yong-Hyun; Jung, Yousung

    2017-02-01

    A single-atom catalyst (SAC) has an electronic structure that is very different from its bulk counterparts, and has shown an unexpectedly high specific activity with a significant reduction in noble metal usage for CO oxidation, fuel cell and hydrogen evolution applications, although the physical origins of such performance enhancements are still poorly understood. Herein, by means of density functional theory (DFT) calculations, we for the first time investigate the great potential of single-atom catalysts for CO2 electroreduction applications. In particular, we study a single transition metal atom anchored on defective graphene with single or double vacancies, denoted M@sv-Gr or M@dv-Gr, where M = Ag, Au, Co, Cu, Fe, Ir, Ni, Os, Pd, Pt, Rh or Ru, as a CO2 reduction catalyst. Many SACs are indeed shown to be highly selective for the CO2 reduction reaction over the competitive H2 evolution reaction, due to favorable adsorption of carboxyl (*COOH) or formate (*OCHO) over hydrogen (*H) on the catalysts. On the basis of free energy profiles, we identified several promising candidate materials for different products: Ni@dv-Gr (limiting potential UL = -0.41 V) and Pt@dv-Gr (-0.27 V) for CH3OH production, and Os@dv-Gr (-0.52 V) and Ru@dv-Gr (-0.52 V) for CH4 production. In particular, the Pt@dv-Gr catalyst shows a remarkable reduction in the limiting potential for CH3OH production compared to any existing catalyst, synthesized or predicted. To understand the origin of the activity enhancement of SACs, we find that the lack of an atomic ensemble for adsorbate binding, the unique electronic structure of the single-atom catalysts, and orbital interactions play an important role, contributing to binding energies of SACs that deviate considerably from the conventional scaling relation of bulk transition metals.
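    The quoted limiting potentials follow computational-hydrogen-electrode logic, in which the most uphill electrochemical step sets UL; the sketch below shows that arithmetic with invented step free energies, not the paper's DFT values.

```python
# Sketch of how a limiting potential like the quoted UL values follows from
# a computed free-energy profile (computational-hydrogen-electrode style):
# UL = -max(step free-energy change) / e. The step energies below are
# illustrative placeholders, not the paper's DFT numbers.
def limiting_potential(step_dG_eV):
    """Most positive free-energy step sets the limiting potential (V)."""
    return -max(step_dG_eV)

# Hypothetical CO2 -> CH3OH profile on a single-atom site (eV per step).
steps = [0.35, 0.27, -0.10, 0.15, -0.40, 0.05]
print(f"UL = {limiting_potential(steps):.2f} V")   # -> -0.35 V
```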

  14. Factors Related to Significant Improvement of Estimated Glomerular Filtration Rates in Chronic Hepatitis B Patients Receiving Telbivudine Therapy

    Directory of Open Access Journals (Sweden)

    Te-Fu Lin

    2017-01-01

    Full Text Available Background and Aim. The improvement of estimated glomerular filtration rates (eGFRs in chronic hepatitis B (CHB patients receiving telbivudine therapy is well known. The aim of this study was to clarify the kinetics of eGFRs and to identify the significant factors related to the improvement of eGFRs in telbivudine-treated CHB patients in a real-world setting. Methods. Serial eGFRs were calculated every 3 months using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI equation. The patients were classified as CKD-1, -2, or -3 according to a baseline eGFR of ≥90, 60–89, or <60 mL/min/1.73 m2, respectively. A significant improvement of eGFR was defined as a more than 10% increase from the baseline. Results. A total of 129 patients were enrolled, of whom 36% had significantly improved eGFRs. According to a multivariate analysis, diabetes mellitus (DM (p=0.028 and CKD-3 (p=0.043 were both significantly related to such improvement. The rates of significant improvement of eGFR were about 73% and 77% in patients with DM and CKD-3, respectively. Conclusions. Telbivudine is an alternative drug of choice for the treatment of hepatitis B patients for whom renal safety is a concern, especially patients with DM and CKD-3.

  15. "Dear Fresher …"--How Online Questionnaires Can Improve Learning and Teaching Statistics

    Science.gov (United States)

    Bebermeier, Sarah; Nussbeck, Fridtjof W.; Ontrup, Greta

    2015-01-01

    Lecturers teaching statistics are faced with several challenges supporting students' learning in appropriate ways. A variety of methods and tools exist to facilitate students' learning on statistics courses. The online questionnaires presented in this report are a new, slightly different computer-based tool: the central aim was to support students…

  16. Do Introductory Statistics Courses in the United States Improve Students' Attitudes?

    Science.gov (United States)

    Schau, Candace; Emmioglu, Esma

    2012-01-01

    We examined the attitudes of about 2200 students enrolled in 101 sections of post-secondary introductory statistics service courses located across the United States. Using the "Survey of Attitudes Toward Statistics-36," we assessed students' attitudes when they entered and left their courses, as well as changes in attitudes across their courses.…

  17. Analysis of Norwegian bio energy statistics. Quality improvement proposals; Analyse av norsk bioenergistatistikk. Forslag til kvalitetsheving

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This report assesses the current model and presentation form of the bioenergy statistics. It proposes revisions and enhancements to both data collection and data presentation. In the context of market developments, both for energy in general and for bioenergy in particular, and of government targets, good bioenergy statistics form the basis for following up the objectives and measures. (eb)

  18. Improving Transparency and Replication in Bayesian Statistics : The WAMBS-Checklist

    NARCIS (Netherlands)

    Depaoli, Sarah; van de Schoot, Rens

    2017-01-01

    Bayesian statistical methods are slowly creeping into all fields of science and are becoming ever more popular in applied research. Although it is very attractive to use Bayesian statistics, our personal experience has led us to believe that naively applying Bayesian methods can be dangerous for at

  19. Improving Statistics Education through Simulations: The Case of the Sampling Distribution.

    Science.gov (United States)

    Earley, Mark A.

    This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…

  20. Omega-3 fatty acid therapy dose-dependently and significantly decreased triglycerides and improved flow-mediated dilation, however, did not significantly improve insulin sensitivity in patients with hypertriglyceridemia.

    Science.gov (United States)

    Oh, Pyung Chun; Koh, Kwang Kon; Sakuma, Ichiro; Lim, Soo; Lee, Yonghee; Lee, Seungik; Lee, Kyounghoon; Han, Seung Hwan; Shin, Eak Kyun

    2014-10-20

    Experimental studies demonstrate that higher intake of omega-3 fatty acids (n-3 FA) improves insulin sensitivity; however, we previously reported that n-3 FA 2 g therapy, the most commonly used dosage, did not significantly improve insulin sensitivity despite reducing triglycerides by 21% in patients. Therefore, we investigated the effects of different dosages of n-3 FA in patients with hypertriglyceridemia. This was a randomized, single-blind, placebo-controlled, parallel study. Age, sex, and body mass index were matched among groups. All patients were recommended to maintain a low-fat diet. Forty-four patients (about 18 of whom had metabolic syndrome/type 2 diabetes mellitus) in each group were given placebo or n-3 FA 1 g (O1), 2 g (O2), or 4 g (O4) daily for 2 months. n-3 FA therapy dose-dependently and significantly decreased triglycerides and triglycerides/HDL cholesterol and improved flow-mediated dilation, compared with placebo (by ANOVA). However, n-3 FA therapy did not significantly decrease high-sensitivity C-reactive protein and fibrinogen, compared with placebo. O1 significantly increased insulin levels and decreased insulin sensitivity (determined by QUICKI), and O2 significantly decreased plasma adiponectin levels relative to baseline measurements. Of note, when compared with placebo, n-3 FA therapy did not significantly change insulin, glucose, adiponectin, or glycated hemoglobin levels, or insulin sensitivity (by ANOVA). We observed similar results in a subgroup of patients with the metabolic syndrome. n-3 FA therapy dose-dependently and significantly decreased triglycerides and improved flow-mediated dilation. Nonetheless, n-3 FA therapy did not significantly improve acute-phase reactants or insulin sensitivity in patients with hypertriglyceridemia, regardless of dosage. Copyright © 2014. Published by Elsevier Ireland Ltd.

  1. Improved parameterization of managed grassland in a global process-based vegetation model using Bayesian statistics

    Science.gov (United States)

    Rolinski, S.; Müller, C.; Lotze-Campen, H.; Bondeau, A.

    2010-12-01

    More than a quarter of the Earth's land surface is covered by grassland, which is also the major part (~70%) of the agricultural area. Most of this area is used for livestock production at different degrees of intensity. The dynamic global vegetation model LPJmL (Sitch et al., Global Change Biology, 2003; Bondeau et al., Global Change Biology, 2007) is one of the few process-based models that simulate biomass production on managed grasslands at the global scale. The implementation of managed grasslands and its evaluation have received little attention so far, as reference data on grassland productivity are scarce and the definitions of grassland extent and usage are highly uncertain. However, grassland productivity relates to large areas and strongly influences global estimates of carbon and water budgets, and should thus be improved. Plants are implemented in LPJmL in an aggregated form as plant functional types, assuming that processes concerning carbon and water fluxes are quite similar between species of the same type. Therefore, the parameterization of a functional type is possible with parameters in a physiologically meaningful range of values. The actual choice of parameter values from the possible and reasonable phase space should satisfy the condition of the best fit between model results and measured data. In order to improve the parameterization of managed grass, we follow a combined procedure using model output and measured data of carbon and water fluxes. By comparing carbon and water fluxes simultaneously, we expect well-balanced refinements and avoid over-tuning the model in only one direction. The comparison of annual grassland biomass to data from the Food and Agriculture Organization of the United Nations (FAO) per country provides an overview of the order of magnitude and identifies deviations. The comparison of daily net primary productivity, soil respiration and water fluxes at specific sites (FluxNet Data) provides
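    As a toy illustration of the Bayesian parameterization named in the title, the sketch below calibrates a single parameter of an invented stand-in "vegetation model" against hypothetical productivity observations with a random-walk Metropolis sampler; LPJmL itself is vastly more complex, and none of the numbers are from the study.

```python
# Toy random-walk Metropolis calibration of one model parameter against
# observations. Model, data, prior, and noise level are invented stand-ins.
import numpy as np

rng = np.random.default_rng(3)
obs = np.array([4.1, 5.0, 4.6, 5.4])           # hypothetical NPP observations

def simulate(theta):
    """Stand-in 'vegetation model': linear response at four sites."""
    return theta * np.array([0.8, 1.0, 0.9, 1.1])

def log_post(theta, sigma=0.3):
    if not 0.0 < theta < 20.0:                  # flat prior on (0, 20)
        return -np.inf
    resid = obs - simulate(theta)
    return -0.5 * np.sum((resid / sigma) ** 2)  # Gaussian likelihood

theta, samples = 1.0, []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.2)           # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                            # accept
    samples.append(theta)

post = np.array(samples[5000:])                 # discard burn-in
print(f"posterior mean = {post.mean():.2f} +/- {post.std():.2f}")
```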

  2. Improving statistical accounting as an element of quality management development in healthcare

    Directory of Open Access Journals (Sweden)

    Gulnara R. Khamidullina

    2014-01-01

    Full Text Available Objective: to offer a solution to the problem of statistical accounting in health care, on the basis of stationary medical institutions, which will serve as the basis for solving many of the challenges facing the management of health institutions. Method: systematic logical analysis was used. Scientific novelty: a solution is proposed for the issues of collecting, recording and using the statistical indicators necessary for the functioning of a medical institution, and the possibility of using this approach for solving problems of healthcare management is demonstrated. Practical value: using the research results to process statistical information will reduce the risk of losing information; it will allow health care managers to use both statistical information and data on the availability and consumption of materials, thus ensuring timely and correct managerial decisions. Results: the problem of collecting, processing and recording statistical information in a stationary medical institution is discussed. On the basis of the conducted research, a solution is proposed to the problem of errors in collecting, processing, transferring and keeping statistical information in health care; the necessity of these implementations is substantiated, and the economic efficiency associated with timely managerial decisions based on the proposed variant of statistical accounting is demonstrated.

  3. Direct integration of intensity-level data from Affymetrix and Illumina microarrays improves statistical power for robust reanalysis

    Directory of Open Access Journals (Sweden)

    Turnbull Arran K

    2012-08-01

    Full Text Available Abstract Background Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single-channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately many of these studies are underpowered, and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets, we evaluated several approaches to directly integrate intensity-level expression data from the two platforms. After mapping probe sequences to Ensembl genes, we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular, we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially reducing legitimate biological differences from integrated datasets. Conclusion Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.
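    For orientation, the simplest of the compared integration baselines, per-gene mean-centering within each platform, can be sketched in a few lines; ComBat and XPN, which the study found superior, additionally model batch variances and cross-platform structure. The matrices below are synthetic stand-ins with genes in rows.

```python
# Sketch of the simplest integration baseline the study compares:
# per-platform, per-gene mean-centering before concatenating samples.
import numpy as np

rng = np.random.default_rng(7)
genes = 1000
affy = rng.normal(5.0, 1.0, size=(genes, 40)) + 1.5    # platform offset
illumina = rng.normal(5.0, 1.0, size=(genes, 60))

def center(mat):
    """Per-gene (row-wise) mean-centering within one platform."""
    return mat - mat.mean(axis=1, keepdims=True)

gap_before = np.abs(affy.mean(axis=1) - illumina.mean(axis=1)).mean()
combined = np.hstack([center(affy), center(illumina)])   # genes x 100
gap_after = np.abs(center(affy).mean(axis=1)
                   - center(illumina).mean(axis=1)).mean()
print(f"mean per-gene platform gap: {gap_before:.2f} -> {gap_after:.2e}")
```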

  4. Quantifying the statistical importance of utilizing regression over classic energy intensity calculations for tracking efficiency improvements in industry

    Energy Technology Data Exchange (ETDEWEB)

    Nimbalkar, Sachin U. [ORNL; Wenning, Thomas J. [ORNL; Guo, Wei [ORNL

    2017-08-01

    In the United States, manufacturing facilities accounted for about 32% of total domestic energy consumption in 2014. Robust energy tracking methodologies are critical to understanding energy performance in manufacturing facilities. Due to its simplicity and intuitiveness, the classic energy intensity method (i.e., the ratio of total energy use to total production) is the most widely adopted. However, the classic energy intensity method does not take into account the variation of other relevant parameters (product type, feedstock type, weather, etc.). Furthermore, the energy intensity method assumes that a facility's base energy consumption (energy use at zero production) is zero, which rarely holds true. Therefore, it is commonly recommended to utilize regression models rather than the energy intensity approach for tracking improvements at the facility level. Unfortunately, many energy managers have difficulty understanding why regression models are statistically better than the classic energy intensity method. While anecdotes and qualitative information may convince some, many have major reservations about the accuracy of regression models and whether it is worth the time and effort to gather data and build quality regression models. This paper explains why regression models are theoretically and quantitatively more accurate for tracking energy performance improvements. Based on the analysis of data from 114 manufacturing plants over 12 years, this paper presents quantitative results on the importance of utilizing regression models over the energy intensity methodology. This paper also documents scenarios where regression models do not have a significant advantage over the energy intensity method.
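    The core contrast is easy to demonstrate: the intensity metric forces the energy-versus-production fit through the origin, while a regression with an intercept recovers the base load. The sketch below uses invented monthly data, not the 114-plant data set.

```python
# Sketch: classic energy intensity (ratio through the origin) versus a
# regression with an intercept that recovers the facility's base load.
# The monthly numbers are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(11)
production = rng.uniform(50, 150, size=36)              # monthly units
base_load = 400.0                                       # energy at zero output
energy = base_load + 3.0 * production + rng.normal(0, 20, 36)

intensity = energy.sum() / production.sum()             # classic metric
slope, intercept = np.polyfit(production, energy, 1)    # regression model

print(f"energy intensity: {intensity:.2f} (implicitly assumes 0 base load)")
print(f"regression: base load ~ {intercept:.0f}, marginal rate ~ {slope:.2f}")
```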

  5. Improved radiograph measurement inter-observer reliability by use of statistical shape models

    Energy Technology Data Exchange (ETDEWEB)

    Pegg, E.C., E-mail: elise.pegg@ndorms.ox.ac.uk [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom); Mellon, S.J., E-mail: stephen.mellon@ndorms.ox.ac.uk [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom); Salmon, G. [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom); Alvand, A., E-mail: abtin.alvand@ndorms.ox.ac.uk [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom); Pandit, H., E-mail: hemant.pandit@ndorms.ox.ac.uk [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom); Murray, D.W., E-mail: david.murray@ndorms.ox.ac.uk [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom); Gill, H.S., E-mail: richie.gill@ndorms.ox.ac.uk [University of Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, Nuffield Orthopaedic Centre, Windmill Road, Oxford OX3 7LD (United Kingdom)

    2012-10-15

    Pre- and post-operative radiographs of patients undergoing joint arthroplasty are often examined for a variety of purposes, including preoperative planning and patient assessment. This work examines the feasibility of using active shape models (ASM) to semi-automate measurements from post-operative radiographs for the specific case of the Oxford™ Unicompartmental Knee. Measurements of the proximal tibia and the position of the tibial tray were made using the ASM model and manually. Data were obtained by four observers, and one observer took four sets of measurements, to allow assessment of the inter- and intra-observer reliability, respectively. The parameters measured were the tibial tray angle, the tray overhang, the tray size, the sagittal cut position, the resection level and the tibial width. Results demonstrated improved reliability (average increases of 27% and 11.2% for intra- and inter-observer reliability, respectively) and equivalent accuracy (p > 0.05 for compared data values) for all of the measurements using the ASM model, with the exception of the tray overhang (p = 0.0001). Significantly less time (15 s) was required to take measurements using the ASM model than manually. These encouraging results indicate that semi-automated measurement techniques could improve the reliability of radiographic measurements.

  6. Improved radiograph measurement inter-observer reliability by use of statistical shape models

    International Nuclear Information System (INIS)

    Pegg, E.C.; Mellon, S.J.; Salmon, G.; Alvand, A.; Pandit, H.; Murray, D.W.; Gill, H.S.

    2012-01-01

    Pre- and post-operative radiographs of patients undergoing joint arthroplasty are often examined for a variety of purposes, including preoperative planning and patient assessment. This work examines the feasibility of using active shape models (ASM) to semi-automate measurements from post-operative radiographs for the specific case of the Oxford™ Unicompartmental Knee. Measurements of the proximal tibia and the position of the tibial tray were made using the ASM model and manually. Data were obtained by four observers, and one observer took four sets of measurements, to allow assessment of the inter- and intra-observer reliability, respectively. The parameters measured were the tibial tray angle, the tray overhang, the tray size, the sagittal cut position, the resection level and the tibial width. Results demonstrated improved reliability (average increases of 27% and 11.2% for intra- and inter-observer reliability, respectively) and equivalent accuracy (p > 0.05 for compared data values) for all of the measurements using the ASM model, with the exception of the tray overhang (p = 0.0001). Significantly less time (15 s) was required to take measurements using the ASM model than manually. These encouraging results indicate that semi-automated measurement techniques could improve the reliability of radiographic measurements.

  7. Detection by voxel-wise statistical analysis of significant changes in regional cerebral glucose uptake in an APP/PS1 transgenic mouse model of Alzheimer's disease.

    Science.gov (United States)

    Dubois, Albertine; Hérard, Anne-Sophie; Delatour, Benoît; Hantraye, Philippe; Bonvento, Gilles; Dhenain, Marc; Delzescaux, Thierry

    2010-06-01

    Biomarkers and technologies similar to those used in humans are essential for the follow-up of Alzheimer's disease (AD) animal models, particularly for the clarification of mechanisms and the screening and validation of new candidate treatments. In humans, changes in brain metabolism can be detected by 2-deoxy-2-[18F]fluoro-D-glucose PET (FDG-PET) and assessed in a user-independent manner with dedicated software, such as Statistical Parametric Mapping (SPM). FDG-PET can be carried out in small animals, but its resolution is low compared to the size of rodent brain structures. In mouse models of AD, changes in cerebral glucose utilization are usually detected by [14C]-2-deoxyglucose (2DG) autoradiography, but this requires prior manual outlining of regions of interest (ROI) on selected sections. Here, we evaluate the feasibility of applying the SPM method to 3D autoradiographic data sets mapping brain metabolic activity in a transgenic mouse model of AD. We report the preliminary results obtained with 4 APP/PS1 (64+/-1 weeks) and 3 PS1 (65+/-2 weeks) mice. We also describe new procedures for the acquisition and use of "blockface" photographs and provide the first demonstration of their value for the 3D reconstruction and spatial normalization of post mortem mouse brain volumes. Despite the limited sample size, our results appear to be meaningful, consistent, and more comprehensive than findings from previously published studies based on conventional ROI-based methods. The establishment of statistical significance at the voxel level, rather than within a user-defined ROI, makes it possible to detect subtle differences more reliably in geometrically complex regions, such as the hippocampus. Our approach is generic and could easily be applied to other biomarkers and extended to other species and applications. Copyright 2010 Elsevier Inc. All rights reserved.
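    The voxel-level idea reduces, in its simplest form, to running a two-sample test at every voxel of the spatially normalized volumes instead of one test per hand-drawn ROI; the sketch below does this on random stand-in volumes matching the quoted group sizes, leaving out SPM's smoothing and multiple-comparison machinery.

```python
# Sketch of voxel-wise statistics: a two-sample t-test at every voxel of
# spatially normalized 3D volumes, instead of one test per manual ROI.
# Volumes are random stand-ins (4 vs. 3 animals, as quoted above).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
shape = (40, 50, 30)                          # voxel grid after normalization
app_ps1 = rng.normal(1.0, 0.1, size=(4, *shape))
ps1 = rng.normal(1.0, 0.1, size=(3, *shape))

t_map, p_map = stats.ttest_ind(app_ps1, ps1, axis=0)
sig = p_map < 0.001                           # uncorrected voxel threshold
print(f"{sig.sum()} voxels below p < 0.001 (SPM-style family-wise "
      "error correction would follow in a real analysis)")
```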

  8. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  9. Strategies for improving utilization of computerized statistical data by the social science community.

    OpenAIRE

    Robbin, Alice

    1981-01-01

    In recent decades there has been a notable expansion of statistical data produced by the public and private sectors for administrative, research, policy and evaluation programs. This is due to advances in relatively inexpensive and efficient data collection and management of computer-readable statistical data. Corresponding changes have not occurred in the management of data collection, preservation, description and dissemination. As a result, the process by which data become accessible to so...

  10. An improved method for statistical analysis of raw accelerator mass spectrometry data

    International Nuclear Information System (INIS)

    Gutjahr, A.; Phillips, F.; Kubik, P.W.; Elmore, D.

    1987-01-01

    Hierarchical statistical analysis is an appropriate method for statistical treatment of raw accelerator mass spectrometry (AMS) data. Using Monte Carlo simulations we show that this method yields more accurate estimates of isotope ratios and analytical uncertainty than the generally used propagation of errors approach. The hierarchical analysis is also useful in design of experiments because it can be used to identify sources of variability. 8 refs., 2 figs
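    A small Monte Carlo sketch makes the abstract's point concrete: when repeated AMS runs share a batch-level offset, naive pooling of all runs understates the uncertainty of the mean, while a two-level (hierarchical) estimate does not; the variance values below are invented for illustration.

```python
# Monte Carlo sketch: two-level (hierarchical) uncertainty estimate versus
# naive pooling when AMS runs share a batch-level offset. All variances
# are invented placeholders.
import numpy as np

rng = np.random.default_rng(9)
n_batches, runs_per_batch = 8, 5
batch_sd, run_sd, true_ratio = 0.03, 0.05, 1.00

batch_offsets = rng.normal(0, batch_sd, n_batches)
data = (true_ratio + batch_offsets[:, None]
        + rng.normal(0, run_sd, (n_batches, runs_per_batch)))

# Naive pooling: treat all runs as independent measurements.
pooled_se = data.std(ddof=1) / np.sqrt(data.size)
# Hierarchical: average within batch, then use batch-to-batch scatter.
batch_means = data.mean(axis=1)
hier_se = batch_means.std(ddof=1) / np.sqrt(n_batches)

print(f"pooled SE = {pooled_se:.4f}  (too small when batches vary)")
print(f"hierarchical SE = {hier_se:.4f}")
```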

  11. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process-performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  12. An improved mixing model providing joint statistics of scalar and scalar dissipation

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Daniel W. [Department of Energy Resources Engineering, Stanford University, Stanford, CA (United States); Jenny, Patrick [Institute of Fluid Dynamics, ETH Zurich (Switzerland)

    2008-11-15

    For the calculation of nonpremixed turbulent flames with thin reaction zones the joint probability density function (PDF) of the mixture fraction and its dissipation rate plays an important role. The corresponding PDF transport equation involves a mixing model for the closure of the molecular mixing term. Here, the parameterized scalar profile (PSP) mixing model is extended to provide the required joint statistics. Model predictions are validated using direct numerical simulation (DNS) data of a passive scalar mixing in a statistically homogeneous turbulent flow. Comparisons between the DNS and the model predictions are provided, which involve different initial scalar-field lengthscales. (author)

  13. Research Pearls: The Significance of Statistics and Perils of Pooling. Part 3: Pearls and Pitfalls of Meta-analyses and Systematic Reviews.

    Science.gov (United States)

    Harris, Joshua D; Brand, Jefferson C; Cote, Mark P; Dhawan, Aman

    2017-08-01

    Within the health care environment, there has been a recent and appropriate trend towards emphasizing the value of care provision. Reduced cost and higher quality improve the value of care. Quality is a challenging, heterogeneous, variably defined concept. At the core of quality is the patient's outcome, quantified by a vast assortment of subjective and objective outcome measures. There has been a recent evolution towards evidence-based medicine in health care, clearly elucidating the role of high-quality evidence across groups of patients and studies. Synthetic studies, such as systematic reviews and meta-analyses, are at the top of the evidence-based medicine hierarchy. Thus, these investigations may be the best potential source of guidance for diagnostic, therapeutic, prognostic, and economic medical decision making. Systematic reviews critically appraise and synthesize the best available evidence to provide a conclusion statement (a "take-home point") in response to a specific answerable clinical question. A meta-analysis uses statistical methods to quantitatively combine data from single studies. Meta-analyses should be performed on homogeneous studies of high methodological quality (Level I or II evidence, i.e., randomized studies) to minimize bias from confounding variables. When it is known that the literature is inadequate, or when a recent systematic review has already been performed and demonstrated insufficient data, a new systematic review does not add anything meaningful to the literature. PROSPERO registration and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines assist authors in the design and conduct of systematic reviews and should always be used. Complete transparency of the conduct of the review permits reproducibility and improves the fidelity of the conclusions. Pooling of data from overly dissimilar investigations should be avoided. This particularly applies to Level IV evidence, that is, noncomparative investigations

  14. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  15. Improving Usage Statistics Processing for a Library Consortium: The Virtual Library of Virginia's Experience

    Science.gov (United States)

    Matthews, Tansy E.

    2009-01-01

    This article describes the development of the Virtual Library of Virginia (VIVA). The VIVA statistics-processing system remains a work in progress. Member libraries will benefit from the ability to obtain the actual data from the VIVA site, rather than just the summaries, so a project to make these data available is currently being planned. The…

  16. Development of LED Light Sources for Improved Visualization of Veins: a statistical approach

    DEFF Research Database (Denmark)

    Argyraki, Aikaterini; Clemmensen, Line Katrine Harder; Petersen, Paul Michael

    The present statistical study investigates the difference in diffuse reflectance between skin and vein (defined as the contrast indicator) under different visible wavelengths in a population of 39 adult participants. The purpose of the study is to examine whether there is a group of wavelengths-color com

  17. Localization Improvement in Wireless Sensor Networks Using a New Statistical Channel Model

    DEFF Research Database (Denmark)

    Karimi Alavijeh, Amir; Ramezani, Hossein; Karimi Alavijeh, Ali

    2018-01-01

    …of this statistical relationship, we have investigated the localization problem of a hidden node using extended Kalman filter (EKF). Compared to the conventional EKF in which the covariance matrix of measurement noise is fixed, this matrix can be updated online using the proposed model. The experimental...

  18. Natalizumab Significantly Improves Cognitive Impairment over Three Years in MS: Pattern of Disability Progression and Preliminary MRI Findings.

    Directory of Open Access Journals (Sweden)

    Flavia Mattioli

    Full Text Available Previous studies reported that Multiple Sclerosis (MS) patients treated with natalizumab for one or two years exhibit a significant reduction in relapse rate and in cognitive impairment, but the long term effects on cognitive performance are unknown. This study aimed to evaluate the effects of natalizumab on cognitive impairment in a cohort of 24 consecutive patients with relapsing remitting MS treated for 3 years. The neuropsychological tests, as well as relapse number and EDSS, were assessed at baseline and yearly for three years. The impact on cortical atrophy was also considered in a subgroup of patients, and those results are thus to be considered preliminary. Results showed a significant reduction in the number of impaired neuropsychological tests after three years, a significant decrease in annualized relapse rate at each time point compared to baseline and a stable EDSS. In the neuropsychological assessment, a significant improvement in memory, attention and executive function test scores was detected. Preliminary MRI data show that, while GM volume did not change at 3 years, a significantly greater parahippocampal and prefrontal gray matter density was noticed, the former correlating with neuropsychological improvement in a memory test. This study showed that therapy with natalizumab is helpful in improving cognitive performance, and is likely to have a protective role on grey matter, over a three-year follow-up.

  19. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: An SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  1. Improvement of statistical methods for detecting anomalies in climate and environmental monitoring systems

    Science.gov (United States)

    Yakunin, A. G.; Hussein, H. M.

    2018-01-01

    The article shows how well-known statistical methods, widely used in solving financial problems and in a number of other fields of science and technology, can be effectively applied, after minor modification, to analogous problems in climate and environmental monitoring systems: detecting anomalies in the form of abrupt changes in signal level, the occurrence of positive and negative outliers, and violations of the cycle shape in periodic processes.
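
    A minimal illustration of one classical detector for the abrupt level changes described above is the cumulative-sum (CUSUM) statistic. The abstract does not name the exact methods used, so the following Python sketch is illustrative only; the drift and threshold parameters are hypothetical tuning choices.

        import numpy as np

        def cusum_detect(signal, drift=0.5, threshold=5.0):
            """Two-sided CUSUM detector for abrupt mean shifts.

            Returns indices where the cumulative deviation from the
            baseline exceeds `threshold` (in standard-deviation units).
            """
            x = (signal - np.mean(signal)) / np.std(signal)  # standardize
            s_pos = s_neg = 0.0
            alarms = []
            for i, xi in enumerate(x):
                s_pos = max(0.0, s_pos + xi - drift)  # upward accumulation
                s_neg = max(0.0, s_neg - xi - drift)  # downward accumulation
                if s_pos > threshold or s_neg > threshold:
                    alarms.append(i)
                    s_pos = s_neg = 0.0               # restart after an alarm
            return alarms

        # Synthetic climate-like series with a level shift at index 100
        rng = np.random.default_rng(0)
        series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
        print(cusum_detect(series))                   # flags indices near 100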

  2. Statistics that learn: can logistic discriminant analysis improve diagnosis in brain SPECT?

    International Nuclear Information System (INIS)

    Behin-Ain, S.; Barnden, L.; Kwiatek, R.; Del Fante, P.; Casse, R.; Burnet, R.; Chew, G.; Kitchener, M.; Boundy, K.; Unger, S.

    2002-01-01

    Full text: Logistic discriminant analysis (LDA) is a statistical technique capable of discriminating individuals in a diseased group from normals. It also enables classification of various diseases within a group of patients. This technique provides a quantitative, automated and non-subjective clinical diagnostic tool. Based on a population known to have the disease and a normal control group, an algorithm was developed and trained to identify regions in the human brain responsible for the disease in question. The algorithm outputs a statistical map representing diseased or normal probability on a voxel or cluster basis, from which an index is generated for each subject. The algorithm also generates a set of coefficients which is used to generate an index for the purpose of classification of new subjects. The results are comparable to and complement those of Statistical Parametric Mapping (SPM), which employs a more common linear discriminant technique. The results are presented for brain SPECT studies of two diseases: chronic fatigue syndrome (CFS) and fibromyalgia (FM). A specificity of 100% and sensitivity of 94% are achieved for the CFS study (similar to the SPM results), and for the FM study 82% specificity and 94% sensitivity are achieved, with corresponding SPM results showing 90% specificity and 82% sensitivity. The results encourage application of LDA for the discrimination of new single subjects as well as of diseased and normal groups. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc
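
    In its simplest form, a logistic discriminant of the kind described reduces to fitting a regularized logistic regression on per-region uptake features and reading off a diseased-versus-normal probability for each subject. The abstract gives no implementation details, so this scikit-learn sketch uses hypothetical regional SPECT uptake values.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # Hypothetical data: rows are subjects, columns are regional SPECT
        # uptake values; labels are 1 (diseased) and 0 (normal control).
        rng = np.random.default_rng(42)
        X = np.vstack([rng.normal(1.00, 0.10, (20, 50)),    # controls
                       rng.normal(0.93, 0.10, (20, 50))])   # mild hypoperfusion
        y = np.array([0] * 20 + [1] * 20)

        # L2-regularized logistic discriminant; the fitted coefficients act
        # as per-region weights from which an index is computed per subject.
        clf = LogisticRegression(C=1.0, max_iter=1000)
        print(cross_val_score(clf, X, y, cv=5).mean())      # discrimination

        clf.fit(X, y)
        index_new_subject = clf.decision_function(X[:1])    # index for one subject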

  3. Significant clinical improvement in radiation-induced lumbosacral poly-radiculopathy by a treatment combining pentoxifylline, tocopherol, and clodronate (Pentoclo)

    Energy Technology Data Exchange (ETDEWEB)

    Delanian, S. [Hop St Louis, Serv Oncol Radiotherapie, APHP, F-75010 Paris, (France); Lefaix, J.L. [CEA-LARIA, CIRIL-GANIL, Caen, (France); Maisonobe, T. [Hop La Pitie Salpetriere, Federat Neurophysiol Clin, APHP, Paris, (France)

    2008-07-01

    Radiation-induced (RI) peripheral neuropathy is a rare and severe delayed complication of radiotherapy that is spontaneously irreversible, with no standard of treatment. We previously developed a successful antioxidant treatment in RI fibrosis and necrosis. Two patients with progressively worsening RI lumbosacral poly-radiculopathy experienced, over several years of long-term pentoxifylline-tocopherol-clodronate treatment, a significant clinical improvement in their neurological sensorimotor symptoms, with good safety. (authors)

  4. The cumulative effect of small dietary changes may significantly improve nutritional intakes in free-living children and adults

    OpenAIRE

    Bornet , Francis; Paineau , Damien; Beaufils , François; Boulier , Alain; Cassuto , Dominique-Adèle; Chwalow , Judith; Combris , Pierre; Couet , Charles; Jouret , Béatrice; Lafay , Lionel; Laville , Martine; Mahé , Sylvain; Ricour , Claude; Romon , Monique; Simon , Chantal

    2010-01-01

    Abstract Background/Objectives: The ELPAS study was an 8-month randomized controlled dietary modification trial designed to test the hypothesis that family dietary coaching would improve nutritional intakes and weight control in 2026 free-living children and parents (Paineau et al., 2008). It resulted in significant nutritional changes, with beneficial effects on body mass index in adults. In these ancillary analyses, we investigated dietary changes throughout the intervention. ...

  5. Using continuous time stochastic modelling and nonparametric statistics to improve the quality of first principles models

    DEFF Research Database (Denmark)

    A methodology is presented that combines modelling based on first principles and data based modelling into a modelling cycle that facilitates fast decision-making based on statistical methods. A strong feature of this methodology is that given a first principles model along with process data......, the corresponding modelling cycle model of the given system for a given purpose. A computer-aided tool, which integrates the elements of the modelling cycle, is also presented, and an example is given of modelling a fed-batch bioreactor....

  6. Improvement of characteristic statistic algorithm and its application on equilibrium cycle reloading optimization

    International Nuclear Information System (INIS)

    Hu, Y.; Liu, Z.; Shi, X.; Wang, B.

    2006-01-01

    A brief introduction to the characteristic statistic algorithm (CSA), a new global optimization algorithm for the problem of PWR in-core fuel management optimization, is given in the paper. CSA is modified by the adoption of a back propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code CSA-DYW is developed. CSA-DYW is used to optimize the 18-month equilibrium reloading cycle of the Daya Bay nuclear plant Unit 1 reactor. The results show that CSA-DYW has high efficiency and good global performance on PWR equilibrium cycle reloading optimization. (authors)

  7. Improvement of Information and Methodical Provision of Macro-economic Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Tiurina Dina M.

    2014-02-01

    Full Text Available The article generalises and analyses the main shortcomings of the modern system of macro-statistical analysis based on the use of the system of national accounts and the balance of the national economy. On the basis of a historical analysis of the formation of the indicators of the system of national accounts, the article argues that the problems with its practical use have both regional and global causes. To overcome the inability of these indicators to account for quality of life, the article offers a system of quality indicators based on the general perception of well-being as the population's confidence in its own solvency, over a representative sample of economic subjects.

  8. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    To improve the skill of Global Climate Models (GCMs) and Regional Climate Models (RCMs) in reproducing the statistics of rainfall at a basin level and at hydrologically relevant temporal scales (e.g. daily), two types of statistical approaches have been suggested. One is the statistical correction of climate model rainfall outputs using historical series of precipitation. The other is the use of stochastic models of rainfall to conditionally simulate precipitation series, based on large-scale atmospheric predictors produced by climate models (e.g. geopotential height, relative vorticity, divergence, mean sea level pressure). The latter approach, usually referred to as statistical rainfall downscaling, aims at reproducing the statistical character of rainfall, while accounting for the effects of large-scale atmospheric circulation (and, therefore, climate forcing) on rainfall statistics. While promising, statistical rainfall downscaling has not attracted much attention in recent years, since the suggested approaches involved complex (i.e. subjective or computationally intense) identification procedures of the local weather, in addition to demonstrating limited success in reproducing several statistical features of rainfall, such as seasonal variations, the distributions of dry and wet spell lengths, the distribution of the mean rainfall intensity inside wet periods, and the distribution of rainfall extremes. In an effort to remedy those shortcomings, Langousis and Kaleris (2014) developed a statistical framework for simulation of daily rainfall intensities conditional on upper air variables, which accurately reproduces the statistical character of rainfall at multiple time-scales. Here, we study the relative performance of: a) quantile-quantile (Q-Q) correction of climate model rainfall products, and b) the statistical downscaling scheme of Langousis and Kaleris (2014), in reproducing the statistical structure of rainfall, as well as rainfall extremes, at a
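
    Approach (a) above, quantile-quantile correction, maps each simulated rainfall value onto the observed climatology by matching empirical quantiles. No code accompanies the abstract, so this is a minimal sketch of empirical quantile mapping with hypothetical gamma-distributed daily rainfall.

        import numpy as np

        def qq_correct(model_future, model_hist, obs_hist):
            """Empirical quantile mapping: adjust model output so its
            distribution matches observations over a calibration period."""
            # Empirical quantile of each value within the historical model run
            q = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
            q = np.clip(q, 0.0, 1.0)
            # Map those quantiles onto the observed distribution
            return np.quantile(obs_hist, q)

        rng = np.random.default_rng(1)
        obs_hist = rng.gamma(0.6, 8.0, 5000)      # observed daily rainfall (mm)
        model_hist = rng.gamma(0.5, 6.0, 5000)    # biased model, same period
        model_future = rng.gamma(0.5, 7.0, 1000)  # projection to be corrected

        corrected = qq_correct(model_future, model_hist, obs_hist)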

  9. Acid or erythromycin stress significantly improves transformation efficiency through regulating expression of DNA binding proteins in Lactococcus lactis F44.

    Science.gov (United States)

    Wang, Binbin; Zhang, Huawei; Liang, Dongmei; Hao, Panlong; Li, Yanni; Qiao, Jianjun

    2017-12-01

    Lactococcus lactis is a gram-positive bacterium used extensively in the dairy industry and food fermentation, and its biological characteristics are usually improved through genetic manipulation. However, poor transformation efficiency was the main factor restricting the construction of engineered strains. In this study, the transformation efficiency of L. lactis F44 showed a 56.1-fold increase under acid conditions (pH 5.0); meanwhile, erythromycin stress (0.04 μg/mL) promoted the transformation efficiency more significantly (76.9-fold). Notably, the transformation efficiency of F44e (L. lactis F44 harboring empty pLEB124) increased up to 149.1-fold under the synergistic stresses of acid and erythromycin. In addition, the gene expression of some DNA binding proteins (DprA, RadA, RadC, RecA, RecQ, and SsbA) changed correspondingly. Especially for radA, a 25.1-fold improvement was detected when F44e was exposed to pH 5.0. Overexpression of some DNA binding proteins could improve the transformation efficiency. The results suggested that acid or erythromycin stress could improve the transformation efficiency of L. lactis through regulating gene expression of DNA binding proteins. We have proposed a simple but promising strategy for improving the transformation efficiency of L. lactis and other hard-to-transform microorganisms. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  10. Designing Solutions by a Student Centred Approach: Integration of Chemical Process Simulation with Statistical Tools to Improve Distillation Systems

    Directory of Open Access Journals (Sweden)

    Isabel M. Joao

    2017-09-01

    Full Text Available Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of the projects to improve a student centred approach and boost higher order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master students to model distillation systems, together with statistical experimental design techniques, in order to optimize the systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students in order to model steady-state processes, dynamic processes and optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students’ and teachers’ perspectives.

  11. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    Science.gov (United States)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
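
    As a concrete illustration of the RSM step, a quadratic response surface can be fitted by ordinary least squares to single test points scattered over many conditions. The report's own tooling is not shown here; the two factors (say, Mach number and angle of attack) and their coefficients are hypothetical.

        import numpy as np

        # Distributed testing: one measurement at each of many (x1, x2) conditions
        rng = np.random.default_rng(7)
        x1 = rng.uniform(0.3, 0.9, 30)            # e.g. Mach number
        x2 = rng.uniform(-4.0, 8.0, 30)           # e.g. angle of attack (deg)
        y = (1.2 + 0.5 * x1 + 0.08 * x2 - 0.3 * x1**2
             + 0.01 * x1 * x2 + rng.normal(0, 0.02, 30))

        # Full quadratic response-surface model: 1, x1, x2, x1^2, x2^2, x1*x2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        # The fitted parametric model predicts the response at untested
        # conditions, which is the payoff of the distributed approach.
        def predict(a, b):
            return np.dot([1.0, a, b, a**2, b**2, a * b], beta)

        print(predict(0.6, 2.0))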

  12. The use of process models to inform and improve statistical models of nitrate occurrence, Great Miami River Basin, southwestern Ohio

    Science.gov (United States)

    Walter, Donald A.; Starn, J. Jeffrey

    2013-01-01

    …significantly at the local and basin scale. The scale and distribution of prediction differences can be explained by the underlying differences in the estimated variables and the relative weight of the variables in the statistical models. Differences in predictions of exceeding 1 mg/L as N, which only includes environmental variables, generally correlated with the underlying differences in STATSGO data, whereas differences in exceeding 4 mg/L as N were more spatially extensive because that model included environmental and nitrogen-source variables. Using depths to water from within circular buffers derived from the spatial data and depths to water within the circular buffers calculated from the groundwater-flow model, restricted to the same range, resulted in large differences in predicted probabilities. The differences in estimated explanatory variables between contributing recharge areas and circular buffers indicate that incorporation of physically based contributing recharge areas likely would result in a different set of explanatory variables and an improved set of statistical models. The use of a groundwater-flow model to improve representations of source areas or to provide more-detailed estimates of specific explanatory variables involves a number of limitations and technical considerations. An assumption in these analyses is that (1) there is a state of mass balance between recharge and pumping, and (2) transport to a pumped well occurs under a steady-state flow field. Comparison of volume-equivalent contributing recharge areas under steady-state and transient transport conditions at a location in the southeastern part of the basin shows the steady-state contributing recharge area is a reasonable approximation of the transient contributing recharge area after between 10 and 20 years of pumping. The first assumption is a more important consideration for this analysis. A gradient effect refers to a condition where simulated pumping from a well is less than recharge through the

  13. An initiative to improve the management of clinically significant test results in a large health care network.

    Science.gov (United States)

    Roy, Christopher L; Rothschild, Jeffrey M; Dighe, Anand S; Schiff, Gordon D; Graydon-Baker, Erin; Lenoci-Edwards, Jennifer; Dwyer, Cheryl; Khorasani, Ramin; Gandhi, Tejal K

    2013-11-01

    The failure of providers to communicate and follow up clinically significant test results (CSTR) is an important threat to patient safety. The Massachusetts Coalition for the Prevention of Medical Errors has endorsed the creation of systems to ensure that results can be received and acknowledged. In 2008 a task force was convened that represented clinicians, laboratories, radiology, patient safety, risk management, and information systems in a large health care network, with the goals of providing recommendations and a road map for improvement in the management of CSTR and of implementing this improvement plan during the subsequent five years. In drafting its charter, the task force broadened the scope from "critical" results to "clinically significant" ones; clinically significant was defined as any result that requires further clinical action to avoid morbidity or mortality, regardless of the urgency of that action. The task force recommended four key areas for improvement--(1) standardization of policies and definitions, (2) robust identification of the patient's care team, (3) enhanced results management/tracking systems, and (4) centralized quality reporting and metrics. The task force faced many challenges in implementing these recommendations, including disagreements on definitions of CSTR and on who should have responsibility for CSTR, changes to established work flows, limitations of resources and of existing information systems, and definition of metrics. This large-scale effort to improve the communication and follow-up of CSTR in a health care network continues with ongoing work to address implementation challenges, refine policies, prepare for a new clinical information system platform, and identify new ways to measure the extent of this important safety problem.

  14. The SGLT2 Inhibitor Dapagliflozin Significantly Improves the Peripheral Microvascular Endothelial Function in Patients with Uncontrolled Type 2 Diabetes Mellitus.

    Science.gov (United States)

    Sugiyama, Seigo; Jinnouchi, Hideaki; Kurinami, Noboru; Hieshima, Kunio; Yoshida, Akira; Jinnouchi, Katsunori; Nishimura, Hiroyuki; Suzuki, Tomoko; Miyamoto, Fumio; Kajiwara, Keizo; Jinnouchi, Tomio

    2018-03-30

    Objective Sodium-glucose cotransporter-2 (SGLT2) inhibitors reduce cardiovascular events and decrease the body fat mass in patients with type 2 diabetes mellitus (T2DM). We examined whether or not the SGLT2-inhibitor dapagliflozin can improve the endothelial function associated with a reduction in abdominal fat mass. Methods We prospectively recruited patients with uncontrolled (hemoglobin A1c [HbA1c] >7.0%) T2DM who were not being treated by SGLT2 inhibitors. Patients were treated with add-on dapagliflozin (5 mg/day) or non-SGLT2 inhibitor medicines for 6 months to improve their HbA1c. We measured the peripheral microvascular endothelial function as assessed by reactive hyperemia peripheral arterial tonometry (RH-PAT) and calculated the natural logarithmic transformed value of the RH-PAT index (LnRHI). We then investigated changes in the LnRHI and abdominal fat area using computed tomography (CT). Results The subjects were 54 patients with uncontrolled T2DM (72.2% men) with a mean HbA1c of 8.1%. The HbA1c was significantly decreased in both groups, with no significant difference between the groups. Dapagliflozin treatment, but not non-SGLT2 inhibitor treatment, significantly increased the LnRHI. The changes in the LnRHI were significantly greater in the dapagliflozin group than in the non-SGLT2 inhibitor group. Dapagliflozin treatment, but not non-SGLT2 inhibitor treatment, significantly decreased the abdominal visceral fat area, subcutaneous fat area (SFA), and total fat area (TFA) as assessed by CT and significantly increased the plasma adiponectin levels. The percentage changes in the LnRHI were significantly correlated with changes in the SFA, TFA, systolic blood pressure, and adiponectin. Conclusion Add-on treatment with dapagliflozin significantly improves the glycemic control and endothelial function associated with a reduction in the abdominal fat mass in patients with uncontrolled T2DM.

  15. Principal Components of Superhigh-Dimensional Statistical Features and Support Vector Machine for Improving Identification Accuracies of Different Gear Crack Levels under Different Working Conditions

    Directory of Open Access Journals (Sweden)

    Dong Wang

    2015-01-01

    Full Text Available Gears are widely used in gearboxes to transmit power from one shaft to another. Gear crack is one of the most frequent gear fault modes found in industry. Identification of different gear crack levels is beneficial in preventing any unexpected machine breakdown and reducing economic loss, because gear crack leads to gear tooth breakage. In this paper, an intelligent fault diagnosis method for the identification of different gear crack levels under different working conditions is proposed. First, superhigh-dimensional statistical features are extracted from the continuous wavelet transform at different scales: the proposed method extracts 920 statistical features. To reduce the dimensionality of the extracted statistical features and generate new significant low-dimensional statistical features, a simple and effective method called principal component analysis is used. To further improve identification accuracies of different gear crack levels under different working conditions, a support vector machine is employed. Three experiments are investigated to show the superiority of the proposed method. Comparisons with other existing gear crack level identification methods are conducted. The results show that the proposed method has the highest identification accuracies among all existing methods.
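
    The pipeline described (wavelet-derived statistical features, then principal component analysis, then a support vector machine) can be sketched directly with scikit-learn. Feature extraction is omitted; the 920-dimensional feature matrix and the four crack-level labels below are hypothetical stand-ins.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Stand-in for 920 statistical features per vibration record
        rng = np.random.default_rng(3)
        X = rng.normal(size=(240, 920))
        y = rng.integers(0, 4, size=240)      # four hypothetical crack levels

        # PCA compresses the superhigh-dimensional features into a few
        # significant components before the SVM classifies crack levels.
        model = make_pipeline(StandardScaler(),
                              PCA(n_components=10),
                              SVC(kernel="rbf", C=10.0))
        print(cross_val_score(model, X, y, cv=5).mean())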

  16. Automation in Siemens fuel manufacturing - the basis for quality improvement by statistical process control (SPC)

    International Nuclear Information System (INIS)

    Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.

    1999-01-01

    Statistical Process Control (SPC) is one of the systematic tools that makes a valuable contribution to the control and planning activities for manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a program to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on a realization of SPC in 3 pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC, and subsequent modifications were made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser welding of spacers' are located at the different sites of ANF. Due to the completion of the first steps and the experience obtained in the pilot projects, the introduction program for SPC has already been extended to other manufacturing processes. (authors)
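
    A basic SPC building block behind such a program is the individuals/moving-range chart, whose control limits are estimated from the process data themselves. The sketch below is generic rather than ANF's actual implementation, and the pellet-diameter numbers are hypothetical.

        import numpy as np

        def individuals_chart_limits(x):
            """3-sigma limits for an individuals (X) chart; sigma is
            estimated as mean moving range / d2, with d2 = 1.128 for
            moving ranges of span 2."""
            x = np.asarray(x, dtype=float)
            center = x.mean()
            sigma = np.abs(np.diff(x)).mean() / 1.128
            return center - 3 * sigma, center, center + 3 * sigma

        rng = np.random.default_rng(2)
        diameters = rng.normal(8.190, 0.005, size=50)   # pellet diameters (mm)
        lcl, cl, ucl = individuals_chart_limits(diameters)
        alarms = (diameters < lcl) | (diameters > ucl)
        print(f"LCL={lcl:.4f} CL={cl:.4f} UCL={ucl:.4f} alarms={alarms.sum()}")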

  17. Spontaneous Resolution of Long-Standing Macular Detachment due to Optic Disc Pit with Significant Visual Improvement.

    Science.gov (United States)

    Parikakis, Efstratios A; Chatziralli, Irini P; Peponis, Vasileios G; Karagiannis, Dimitrios; Stratos, Aimilianos; Tsiotra, Vasileia A; Mitropoulos, Panagiotis G

    2014-01-01

    To report a case of spontaneous resolution of a long-standing serous macular detachment associated with an optic disc pit, leading to significant visual improvement. A 63-year-old female presented with a 6-month history of blurred vision and micropsia in her left eye. Her best-corrected visual acuity was 6/24 in the left eye, and fundoscopy revealed serous macular detachment associated with optic disc pit, which was confirmed by optical coherence tomography (OCT). The patient was offered vitrectomy as a treatment alternative, but she preferred to be reviewed conservatively. Three years after initial presentation, neither macular detachment nor subretinal fluid was evident in OCT, while the inner segment/outer segment (IS/OS) junction line was intact. Her visual acuity was improved from 6/24 to 6/12 in her left eye, remaining stable at the 6-month follow-up after resolution. We present a case of spontaneous resolution of a long-standing macular detachment associated with an optic disc pit with significant visual improvement, postulating that the integrity of the IS/OS junction line may be a prognostic factor for final visual acuity and suggesting OCT as an indicator of visual prognosis and the probable necessity of surgical management.

  18. Spontaneous Resolution of Long-Standing Macular Detachment due to Optic Disc Pit with Significant Visual Improvement

    Directory of Open Access Journals (Sweden)

    Efstratios A. Parikakis

    2014-03-01

    Full Text Available Purpose: To report a case of spontaneous resolution of a long-standing serous macular detachment associated with an optic disc pit, leading to significant visual improvement. Case Presentation: A 63-year-old female presented with a 6-month history of blurred vision and micropsia in her left eye. Her best-corrected visual acuity was 6/24 in the left eye, and fundoscopy revealed serous macular detachment associated with optic disc pit, which was confirmed by optical coherence tomography (OCT). The patient was offered vitrectomy as a treatment alternative, but she preferred to be reviewed conservatively. Three years after initial presentation, neither macular detachment nor subretinal fluid was evident in OCT, while the inner segment/outer segment (IS/OS) junction line was intact. Her visual acuity was improved from 6/24 to 6/12 in her left eye, remaining stable at the 6-month follow-up after resolution. Conclusion: We present a case of spontaneous resolution of a long-standing macular detachment associated with an optic disc pit with significant visual improvement, postulating that the integrity of the IS/OS junction line may be a prognostic factor for final visual acuity and suggesting OCT as an indicator of visual prognosis and the probable necessity of surgical management.

  19. Setting the stage for recovery : Improving veteran PTSD treatment effectiveness using statistical prediction

    NARCIS (Netherlands)

    Haagen, J.F.G.

    2017-01-01

    Over half a million Dutch veterans participated in almost a hundred war and peace keeping missions since 1940. During past deployments, veterans promoted peace and stability in conflict situations, endeavoured to win the hearts and minds of local communities and improved the lives of those affected

  20. Statistical and numerical methods to improve the transient divided bar method

    DEFF Research Database (Denmark)

    Bording, Thue Sylvester; Nielsen, S.B.; Balling, N.

    The divided bar method is a commonly used method to measure thermal conductivity of rock samples in laboratory. We present improvements to this method that allows for simultaneous measurements of both thermal conductivity and thermal diffusivity. The divided bar setup is run in a transient mode...

  1. Trace saver: A tool for network service improvement and personalised analysis of user centric statistics

    Science.gov (United States)

    Bilal, Muhammad; Asfand-e-Yar, Mockford, Steve; Khan, Wasiq; Awan, Irfan

    2012-11-01

    Mobile technology is among the fastest growing technologies in today's world, with low cost and highly effective benefits. The most important and entertaining areas of mobile technology development and usage are location based services, user friendly networked applications and gaming applications. However, attention to network operator service provision and improvement has been very low. Portable applications that help improve network operator services, available for a range of mobile operating systems, are desirable to mobile operators. This paper presents a state-of-the-art mobile application, Tracesaver, which overcomes the barriers to gathering the device- and network-related information that network operators need to improve their service provision. Tracesaver is available for a broad range of mobile devices with different mobile operating systems and computational capabilities. Adoption of Tracesaver has proliferated in the year since it was published. The survey and results show that Tracesaver is being used by millions of mobile users and provides novel ways of improving network service with its highly user friendly interface.

  2. A study to determine whether targeted education significantly improves the perception of human torture in medical students in India.

    Science.gov (United States)

    Husain, Munawwar; Ghaffar, Usama B; Usmani, Jawed Ahmad; Rivzi, Shameem Jahan

    2010-08-01

    This study was undertaken to assess knowledge of torture among MBBS students. A fair comparison was made by selecting two groups of medical students: one to whom torture was not taught, i.e., the pre-taught group (PrTG, n = 125), and a second to whom torture was taught in the classroom in more than one session, i.e., the post-taught group (PoTG, n = 110). The topic of torture was taught under many headings, maximising the effort to cover as much as possible; namely, definition, geographical distribution, types of torture (physical, psychological and sexual), post-torture sequelae, the sociopolitical environment prevailing in the country, doctors' involvement in torture, rehabilitation of torture victims and the UNO's role in the containment of torture. A questionnaire of multiple-choice questions on these aspects was designed. A significant difference in perception and knowledge about torture was found between the groups, and this was further accentuated in medical and non-medical intratopics. The P value of each question was computed separately. The study was statistically significant and re-established the need to fortify the gossameric firmament of education specific to torture.

  3. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    Science.gov (United States)

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities was implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
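
    Special cause variation of the kind identified in the project is typically flagged with run rules on the control chart, for example a shift rule of eight consecutive points on the same side of the center line. The sketch below implements only that rule, on hypothetical monthly panel HbA1c means; it is not the project's exact rule set.

        import numpy as np

        def shift_rule(values, center, run_length=8):
            """Return the index at which `run_length` consecutive points
            fall on the same side of the center line, else None."""
            signs = np.sign(np.asarray(values, dtype=float) - center)
            run = 0
            for i, s in enumerate(signs):
                run = run + 1 if i > 0 and s != 0 and s == signs[i - 1] else 1
                if run >= run_length:
                    return i
            return None

        # Hypothetical monthly mean HbA1c (%) for the patient panel
        hba1c = [9.1, 9.3, 9.0, 9.2, 8.9, 8.6, 8.5, 8.4,
                 8.3, 8.3, 8.2, 8.1, 8.0]
        print(shift_rule(hba1c, center=9.1))   # index of the sustained shift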

  4. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. We therefore hypothesize about the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We find that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.

  5. An Improved Statistical Point-source Foreground Model for the Epoch of Reionization

    Energy Technology Data Exchange (ETDEWEB)

    Murray, S. G.; Trott, C. M.; Jordan, C. H. [ARC Centre of Excellence for All-sky Astrophysics (CAASTRO) (Australia)

    2017-08-10

    We present a sophisticated statistical point-source foreground model for low-frequency radio Epoch of Reionization (EoR) experiments using the 21 cm neutral hydrogen emission line. Motivated by our understanding of the low-frequency radio sky, we enhance the realism of two model components compared with existing models: the source count distributions as a function of flux density and spatial position (source clustering), extending current formalisms for the foreground covariance of 2D power-spectral modes in 21 cm EoR experiments. The former we generalize to an arbitrarily broken power law, and the latter to an arbitrary isotropically correlated field. This paper presents expressions for the modified covariance under these extensions, and shows that for a more realistic source spatial distribution, extra covariance arises in the EoR window that was previously unaccounted for. Failure to include this contribution can yield bias in the final power-spectrum and under-estimate uncertainties, potentially leading to a false detection of signal. The extent of this effect is uncertain, owing to ignorance of physical model parameters, but we show that it is dependent on the relative abundance of faint sources, to the effect that our extension will become more important for future deep surveys. Finally, we show that under some parameter choices, ignoring source clustering can lead to false detections on large scales, due to both the induced bias and an artificial reduction in the estimated measurement uncertainty.

  6. An Improved Statistical Point-source Foreground Model for the Epoch of Reionization

    Science.gov (United States)

    Murray, S. G.; Trott, C. M.; Jordan, C. H.

    2017-08-01

    We present a sophisticated statistical point-source foreground model for low-frequency radio Epoch of Reionization (EoR) experiments using the 21 cm neutral hydrogen emission line. Motivated by our understanding of the low-frequency radio sky, we enhance the realism of two model components compared with existing models: the source count distributions as a function of flux density and spatial position (source clustering), extending current formalisms for the foreground covariance of 2D power-spectral modes in 21 cm EoR experiments. The former we generalize to an arbitrarily broken power law, and the latter to an arbitrary isotropically correlated field. This paper presents expressions for the modified covariance under these extensions, and shows that for a more realistic source spatial distribution, extra covariance arises in the EoR window that was previously unaccounted for. Failure to include this contribution can yield bias in the final power-spectrum and under-estimate uncertainties, potentially leading to a false detection of signal. The extent of this effect is uncertain, owing to ignorance of physical model parameters, but we show that it is dependent on the relative abundance of faint sources, to the effect that our extension will become more important for future deep surveys. Finally, we show that under some parameter choices, ignoring source clustering can lead to false detections on large scales, due to both the induced bias and an artificial reduction in the estimated measurement uncertainty.
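
    One of the two extensions, the arbitrarily broken power-law source count distribution, can be illustrated by drawing source flux densities from a power law with a single break via inverse-CDF sampling. The slopes, break and flux limits below are hypothetical choices, not the paper's fitted parameters.

        import numpy as np

        def powerlaw_sample(u, a, lo, hi):
            """Inverse-CDF draw from p(S) proportional to S**-a on [lo, hi] (a != 1)."""
            g = 1.0 - a
            return (lo**g + u * (hi**g - lo**g)) ** (1.0 / g)

        def broken_powerlaw_fluxes(n, a1=1.6, a2=2.5,
                                   s_min=1e-3, s_b=1.0, s_max=1e2, seed=0):
            """Draw n flux densities (Jy) from a source count dN/dS that is
            a power law of slope a1 below the break s_b and a2 above it,
            continuous at the break."""
            rng = np.random.default_rng(seed)
            g1, g2 = 1.0 - a1, 1.0 - a2
            w1 = (s_b**g1 - s_min**g1) / g1                   # mass below break
            w2 = s_b**(a2 - a1) * (s_max**g2 - s_b**g2) / g2  # above, matched at s_b
            below = rng.random(n) < w1 / (w1 + w2)
            u = rng.random(n)
            s = np.empty(n)
            s[below] = powerlaw_sample(u[below], a1, s_min, s_b)
            s[~below] = powerlaw_sample(u[~below], a2, s_b, s_max)
            return s

        fluxes = broken_powerlaw_fluxes(10000)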

  7. IMPROVING KNITTED FABRICS BY A STATISTICAL CONTROL OF DIMENSIONAL CHANGES AFTER THE DYEING PROCESS

    Directory of Open Access Journals (Sweden)

    LLINARES-BERENGUER Jorge

    2017-05-01

    Full Text Available One of the most important problems that cotton knitted fabrics present during the manufacturing process is their dimensional instability, which needs to be minimised. Some of the variables that intervene in fabric shrinkage are related to its structural characteristics, the fiber used to produce the yarn, the yarn count used and the dyeing process employed. Conducted under real factory conditions, the present study attempted to model the behaviour of a fabric structure after a dyeing process by contributing several algorithms that calculate dyed fabric stability after the first wash cycle. Small-diameter circular machines are used to produce garments with no side seams; this is why a list of machines that produce the same fabrics in different widths needs to be available to produce all the sizes of a given garment. Two relaxation states were distinguished for interlock fabric: dyed and dry relaxation, and dyed and wash relaxation. The linear density of the yarn employed to produce the sample fabric was combed cotton Ne 30. The machines used for optic bleaching were Overflow. To obtain knitting structures with optimum dimensional stability, different statistical tools were used to evaluate all the production process variables (raw material, machines and process) responsible for this variation. This made it possible to guarantee product quality without creating costs and losses.

  8. Computer-Based Learning: The Use of SPSS Statistical Program for Improving Biostatistical Competence of Medical Students

    Directory of Open Access Journals (Sweden)

    Zvi H. Perry

    2014-01-01

    Full Text Available Background. We changed the biostatistics curriculum for our medical students and have created a course entitled “Multivariate analysis of statistical data, using the SPSS package.” Purposes. The aim of this course was to develop students’ skills in computerized data analysis, as well as to enhance their ability to read and interpret statistical data analysis in the literature. Methods. In the current study we show that a computer-based course for biostatistics and advanced data analysis is feasible and efficient, using course-specific evaluation questionnaires. Results. Its efficacy is both subjective (our subjects felt better prepared to do their theses, as well as to read articles with advanced statistical data analysis) and objective (their knowledge of how and when to apply statistical procedures seemed to improve). Conclusions. We showed that a formal evaluative process for such a course is possible and that it enhances the learning experience both for the students and their teachers.

  9. Combining large model ensembles with extreme value statistics to improve attribution statements of rare events

    Directory of Open Access Journals (Sweden)

    Sebastian Sippel

    2015-09-01

    In conclusion, our study shows that EVT and empirical estimates based on numerical simulations can indeed be used to productively inform each other, for instance to derive appropriate EVT parameters for short observational time series. Further, the combination of ensemble simulations with EVT allows us to significantly reduce the number of simulations needed for statements about the tails.
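
    The combination described, using many ensemble members as surrogate observations to constrain an extreme value fit, can be sketched by fitting a Generalized Extreme Value (GEV) distribution to simulated block maxima. The numbers below are synthetic placeholders, not the study's model output.

        import numpy as np
        from scipy.stats import genextreme

        # Hypothetical ensemble output: 1000 simulated annual maxima of a
        # temperature anomaly, standing in for a large model ensemble.
        rng = np.random.default_rng(5)
        block_maxima = rng.gumbel(loc=1.5, scale=0.8, size=1000)

        # Fit a GEV to the block maxima and estimate the probability of
        # exceeding a rare threshold, e.g. an observed record extreme.
        shape, loc, scale = genextreme.fit(block_maxima)
        p_exceed = genextreme.sf(4.0, shape, loc=loc, scale=scale)
        print(f"Return period of a 4.0-degree extreme: {1.0 / p_exceed:.0f} years")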

  10. Estimation of background noise level on seismic station using statistical analysis for improved analysis accuracy

    Science.gov (United States)

    Han, S. M.; Hahm, I.

    2015-12-01

    We evaluated the background noise level of seismic stations in order to collect high-quality observation data and produce accurate seismic information. In this study, the background noise level was determined using the PSD (Power Spectral Density) method of McNamara and Buland (2004). This method, which uses long-term data, is influenced not only by the innate electronic noise of the sensor and by pulse waves produced while the sensor stabilizes, but also by missing data, and it is controlled at specified frequencies that are affected by irregular signals unrelated to site characteristics. Implementing a process that filters out such abnormal signals within an automated system is hard and inefficient. To solve these problems, we devised a method for extracting the data that are normally distributed within 90 to 99% confidence intervals at each period. The applicability of the method was verified using 62 seismic stations with broadband and short-period sensors operated by the KMA (Korea Meteorological Administration). The evaluation standards were the NHNM (New High Noise Model) and NLNM (New Low Noise Model) published by the USGS (United States Geological Survey), which were designed on the basis of the western United States. The Korean Peninsula, however, is surrounded by the ocean on three sides and has a complicated geological structure and a high population density, so we re-designed an appropriate model for the Korean Peninsula from the statistically combined results. An important feature is that the secondary-microseism peak appears at a higher frequency band. Acknowledgements: This research was carried out as a part of "Research for the Meteorological and Earthquake Observation Technology and Its Application" supported by the 2015 National Institute of Meteorological Research (NIMR) in the Korea Meteorological Administration.
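
    The extraction step described, keeping only PSD values that lie inside a 90-99% interval at each period, amounts to a per-period percentile filter. Details of the KMA implementation are not given, so the array shapes and bounds here are assumptions.

        import numpy as np

        def filter_psd(psd, lower_pct=1.0, upper_pct=99.0):
            """Mask, for each period bin (column), PSD values outside the
            given percentile band. psd: (n_segments, n_periods) in dB."""
            lo = np.nanpercentile(psd, lower_pct, axis=0)
            hi = np.nanpercentile(psd, upper_pct, axis=0)
            return np.where((psd >= lo) & (psd <= hi), psd, np.nan)

        # Hypothetical PSD matrix: 500 hourly segments x 60 period bins
        rng = np.random.default_rng(11)
        psd = rng.normal(-140.0, 5.0, size=(500, 60))
        psd[::50] += 30.0                    # simulated pulses/abnormal segments

        clean = filter_psd(psd)
        noise_level = np.nanmedian(clean, axis=0)   # per-period station noise level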

  11. Statistical improvement in detection level of gravitational microlensing events from their light curves

    Science.gov (United States)

    Ibrahim, Ichsan; Malasan, Hakim L.; Kunjaya, Chatief; Timur Jaelani, Anton; Puannandra Putri, Gerhana; Djamal, Mitra

    2018-04-01

    In astronomy, the brightness of a source is typically expressed in terms of magnitude. Conventionally, the magnitude is defined by the logarithm of received flux. This relationship is known as the Pogson formula. For received flux with a small signal to noise ratio (S/N), however, the formula gives a large magnitude error. We investigate whether the use of the Inverse Hyperbolic Sine function (hereafter referred to as the Asinh magnitude) in the modified formulae could allow for an alternative calculation of magnitudes for small S/N flux, and whether the new approach is better for representing the brightness of that region. We study the possibility of increasing the detection level of gravitational microlensing using 40 selected microlensing light curves from the 2013 and 2014 seasons and by using the Asinh magnitude. Photometric data of the selected events are obtained from the Optical Gravitational Lensing Experiment (OGLE). We found that utilization of the Asinh magnitude makes the events brighter compared to using the logarithmic magnitude, with an average of about 3.42 × 10⁻² magnitude and an average in the difference of error between the logarithmic and the Asinh magnitude of about 2.21 × 10⁻² magnitude. The microlensing events OB140847 and OB140885 are found to have the largest difference values among the selected events. Using a Gaussian fit to find the peak for OB140847 and OB140885, we conclude statistically that the Asinh magnitude gives better mean squared values of the regression and narrower residual histograms than the Pogson magnitude. Based on these results, we also attempt to propose a limit in magnitude value for which use of the Asinh magnitude is optimal with small S/N data.
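
    The two magnitude scales being compared can be written side by side. The asinh form below follows the widely used parameterization of Lupton, Gunn and Szalay (1999); the softening parameter b, which sets where the two scales diverge, is a hypothetical choice, since the paper's exact modified formula is not reproduced in the abstract.

        import numpy as np

        LN10 = np.log(10.0)

        def pogson_mag(flux):
            """Conventional magnitude m = -2.5 log10(f); diverges as f -> 0."""
            return -2.5 * np.log10(flux)

        def asinh_mag(flux, b=1e-2):
            """Asinh magnitude: m = -(2.5/ln 10) * (asinh(f/2b) + ln b).
            Remains finite, with well-behaved errors, for low or zero flux."""
            return -(2.5 / LN10) * (np.arcsinh(flux / (2.0 * b)) + np.log(b))

        flux = np.array([1.0, 1e-2, 1e-3, 0.0])   # relative fluxes; low S/N at the end
        print(pogson_mag(flux))                   # infinite at zero flux
        print(asinh_mag(flux))                    # stays finite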

  12. Performance of the Sellick maneuver significantly improves when residents and trained nurses use a visually interactive guidance device in simulation

    International Nuclear Information System (INIS)

    Connor, Christopher W; Saffary, Roya; Feliz, Eddy

    2013-01-01

    We examined the proper performance of the Sellick maneuver, a maneuver used to reduce the risk of aspiration of stomach contents during induction of general anesthesia, using a novel device that measures and visualizes the force applied to the cricoid cartilage using thin-film force sensitive resistors in a form suitable for in vivo use. Performance was tested in three stages with twenty anaesthesiology residents and twenty trained operating room nurses. Firstly, subjects applied force to the cricoid cartilage as was customary to them. Secondly, subjects used the device to guide the application of that force. Thirdly, subjects were again asked to perform the manoeuvre without visual guidance. Each test lasted 1 min and the amount of force applied was measured throughout. Overall, the Sellick maneuver was often not applied properly, with large variance between individual subjects. Performance and inter-subject consistency improved to a very highly significant degree when subjects were able to use the device as a visual guide (p < 0.001). Subsequent significant improvements in performances during the last, unguided test demonstrated that the device initiated learning. (paper)

  13. Performance of the Sellick maneuver significantly improves when residents and trained nurses use a visually interactive guidance device in simulation

    Energy Technology Data Exchange (ETDEWEB)

    Connor, Christopher W; Saffary, Roya; Feliz, Eddy [Department of Anesthesiology Boston Medical Center, Boston, MA (United States)

    2013-12-15

    We examined the proper performance of the Sellick maneuver, a maneuver used to reduce the risk of aspiration of stomach contents during induction of general anesthesia, using a novel device that measures and visualizes the force applied to the cricoid cartilage using thin-film force sensitive resistors in a form suitable for in vivo use. Performance was tested in three stages with twenty anaesthesiology residents and twenty trained operating room nurses. Firstly, subjects applied force to the cricoid cartilage as was customary to them. Secondly, subjects used the device to guide the application of that force. Thirdly, subjects were again asked to perform the manoeuvre without visual guidance. Each test lasted 1 min and the amount of force applied was measured throughout. Overall, the Sellick maneuver was often not applied properly, with large variance between individual subjects. Performance and inter-subject consistency improved to a very highly significant degree when subjects were able to use the device as a visual guide (p < 0.001). Subsequent significant improvements in performances during the last, unguided test demonstrated that the device initiated learning. (paper)

  14. Statistical behavior and geological significance of the geochemical distribution of trace elements in the Cretaceous volcanics Cordoba and San Luis, Argentina

    International Nuclear Information System (INIS)

    Daziano, C.

    2010-01-01

    Statistical analysis of trace elements in the volcanics investigated made it possible to distinguish two independent populations within the same geochemical environment. For each component the populations have variable homogeneity indexes, resulting in dissimilar average values that reveal intratelluric geochemical phenomena. On the other hand, the inhomogeneities observed in these rocks - as reflected in their petrochemical characters - could be exacerbated by the remote and dispersed locations of their outcrops and by their relations with the enclosing rocks, given the ranges of compositional variation due to differences in relative ages.

  15. Combination of blood tests for significant fibrosis and cirrhosis improves the assessment of liver-prognosis in chronic hepatitis C.

    Science.gov (United States)

    Boursier, J; Brochard, C; Bertrais, S; Michalak, S; Gallois, Y; Fouchard-Hubert, I; Oberti, F; Rousselet, M-C; Calès, P

    2014-07-01

    Recent longitudinal studies have emphasised the prognostic value of noninvasive tests of liver fibrosis and cross-sectional studies have shown their combination significantly improves diagnostic accuracy. To compare the prognostic accuracy of six blood fibrosis tests and liver biopsy, and evaluate if test combination improves the liver-prognosis assessment in chronic hepatitis C (CHC). A total of 373 patients with compensated CHC, liver biopsy (Metavir F) and blood tests targeting fibrosis (APRI, FIB4, Fibrotest, Hepascore, FibroMeter) or cirrhosis (CirrhoMeter) were included. Significant liver-related events (SLRE) and liver-related deaths were recorded during follow-up (started the day of biopsy). During the median follow-up of 9.5 years (3508 person-years), 47 patients had a SLRE and 23 patients died from liver-related causes. For the prediction of first SLRE, most blood tests allowed higher prognostication than Metavir F [Harrell C-index: 0.811 (95% CI: 0.751-0.868)] with a significant increase for FIB4: 0.879 [0.832-0.919] (P = 0.002), FibroMeter: 0.870 [0.812-0.922] (P = 0.005) and APRI: 0.861 [0.813-0.902] (P = 0.039). Multivariate analysis identified FibroMeter, CirrhoMeter and sustained viral response as independent predictors of first SLRE. CirrhoMeter was the only independent predictor of liver-related death. The combination of FibroMeter and CirrhoMeter classifications into a new FM/CM classification improved the liver-prognosis assessment compared to Metavir F staging or single tests by identifying five subgroups of patients with significantly different prognoses. Some blood fibrosis tests are more accurate than liver biopsy for determining liver prognosis in CHC. A new combination of two complementary blood tests, one targeted for fibrosis and the other for cirrhosis, optimises assessment of liver-prognosis. © 2014 John Wiley & Sons Ltd.

  16. An improved hazard rate twisting approach for the statistic of the sum of subexponential variates

    KAUST Repository

    Rached, Nadhir B.; Kammoun, Abla; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    In this letter, we present an improved hazard rate twisting technique for the estimation of the probability that a sum of independent but not necessarily identically distributed subexponential Random Variables (RVs) exceeds a given threshold. Instead of twisting all the components in the summation, we propose to twist only the RVs which have the biggest impact on the right-tail of the sum distribution and keep the other RVs unchanged. A minmax approach is performed to determine the optimal twisting parameter which leads to an asymptotic optimality criterion. Moreover, we show through some selected simulation results that our proposed approach results in a variance reduction compared to the technique where all the components are twisted.

  17. Hyperlink-Embedded Journal Articles Improve Statistical Knowledge and Reader Satisfaction

    Science.gov (United States)

    Saxon, David; Pearson, Alexander T.; Wu, Peter

    2015-01-01

    Background To practice evidence-based medicine, physicians should have a solid understanding of fundamental epidemiological and biostatistical concepts. Research suggests that only a minority of physicians have such an understanding of biostatistics. Objective To collect pilot data on a novel biostatistical educational tool, a hyperlink-embedded journal article, which is aimed at improving knowledge in biostatistics. Methods Forty-four physicians-in-training participated in this pilot study. Participants completed a pretest consisting of 5 questions about biostatistical terms that would be encountered in the article. They were randomized to either an unmodified journal article or to the same article with hyperlinked biostatistical terms. All participants then completed a posttest that was identical to the pretest. Results Having access to hyperlinked information had a positive association with the number of improved test answers (P = .05). Use of hyperlinks varied, and were seemingly dependent on user comfort with terms; well-understood definitions (“average”) were clicked on a few times (5.5% of participants), whereas more obscure method terms (“Lexis diagram”) were clicked on by 94% of participants. While only 42% of participants stated they would have looked up definitions of the biostatistical terms if they had not been provided in the hyperlinked article, 94% of participants identified the hyperlink tool as something they would use if readily available to them when reading journal articles. Conclusions Results of this pilot study of a novel educational intervention suggest that embedded hyperlinks within journal articles may be a useful tool to teach biostatistical terms to physicians. PMID:26692981

  18. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a Bayesian mixed-model approach. The latter accounts for statistical dependence among QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replicate QMM assays, improving reliability and inter-assay homogeneity and providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to substantially improve the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
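
    The core of the argument, that pooling replicate standard curves stabilises calibration, can be sketched without the full Bayesian machinery. The toy below (all numbers invented) simulates assay-specific standard curves, fits them either independently (the traditional route) or jointly with a shared slope and per-assay intercepts, and compares the spread of back-calculated densities for one unknown sample; the paper's mixed-model approach additionally propagates the calibration uncertainty in a fully Bayesian way.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulate standard curves for 6 replicate QT-NASBA-like assays:
    # signal y = a_j + slope*log10(density) + noise, with assay-specific a_j.
    n_assays, slope, sigma = 6, -3.3, 0.5
    a_true = rng.normal(38.0, 1.2, n_assays)
    logd = np.tile(np.arange(1.0, 7.0), n_assays)     # log10 standard densities
    assay = np.repeat(np.arange(n_assays), 6)
    y = a_true[assay] + slope * logd + rng.normal(0.0, sigma, logd.size)

    # Traditional calibration: an independent straight line per assay.
    sep = [np.polyfit(logd[assay == j], y[assay == j], 1) for j in range(n_assays)]

    # Joint fit: shared slope, per-assay intercepts (one-hot design matrix).
    X = np.column_stack([np.eye(n_assays)[assay], logd])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    a_hat, b_hat = beta[:n_assays], beta[-1]

    # Invert the curves for one unknown sample measured on every assay.
    y_obs = a_true + slope * 4.0 + rng.normal(0.0, sigma, n_assays)  # true log10 = 4
    est_sep = np.array([(y_obs[j] - sep[j][1]) / sep[j][0] for j in range(n_assays)])
    est_joint = (y_obs - a_hat) / b_hat
    print("per-assay spread :", est_sep.std())
    print("joint-fit spread :", est_joint.std())
    ```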

  19. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    International Nuclear Information System (INIS)

    Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.

    2014-01-01

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves
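
    For readers unfamiliar with the charting technique, the sketch below shows the textbook individuals control chart construction (center line, 3-sigma limits estimated from the mean moving range with d2 = 1.128) applied to a simulated series of leaf-position errors. The ±0.5 mm specification mirrors the tighter of the two specification sets mentioned above, but the data and the exact charting rules of the authors' monitoring system are illustrative assumptions.

    ```python
    import numpy as np

    def individuals_chart(x, spec=0.5):
        """Individuals (X) control chart limits for a series of leaf-position
        errors (mm), plus out-of-control / out-of-specification flags."""
        x = np.asarray(x, float)
        mr = np.abs(np.diff(x))               # moving ranges of successive tests
        center = x.mean()
        sigma_hat = mr.mean() / 1.128         # d2 = 1.128 for a moving range of 2
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
        flags = {
            "out_of_control": np.where((x > ucl) | (x < lcl))[0],
            "out_of_spec": np.where(np.abs(x) > spec)[0],
        }
        return center, (lcl, ucl), flags

    # Simulated daily position errors for one leaf, with a late drift.
    rng = np.random.default_rng(2)
    errors = np.concatenate([rng.normal(0.0, 0.08, 60), rng.normal(0.35, 0.08, 10)])
    center, (lcl, ucl), flags = individuals_chart(errors, spec=0.5)
    print(f"center = {center:.3f} mm, limits = ({lcl:.3f}, {ucl:.3f}) mm")
    print("out-of-control tests:", flags["out_of_control"])
    ```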

  20. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    Science.gov (United States)

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the

  1. Improved Use of Small Reference Panels for Conditional and Joint Analysis with GWAS Summary Statistics.

    Science.gov (United States)

    Deng, Yangqing; Pan, Wei

    2018-06-01

    Due to issues of practicality and confidentiality in large-scale genomic data sharing, typically only meta- or mega-analyzed genome-wide association study (GWAS) summary data, not individual-level data, are publicly available. Reanalyses of such GWAS summary data for a wide range of applications have become more and more common and useful, and they often require an external reference panel with individual-level genotypic data to infer linkage disequilibrium (LD) among genetic variants. However, with a sample size of only a few hundred, as in the most popular 1000 Genomes Project European sample, estimation errors for LD are not negligible, often leading to dramatically increased numbers of false positives in subsequent analyses of GWAS summary data. To alleviate the problem in the context of association testing for a group of SNPs, we propose an alternative estimator of the covariance matrix with an idea similar to multiple imputation. We use numerical examples based on both simulated and real data to demonstrate the severe problem with the use of the 1000 Genomes Project reference panels, and the improved performance of our new approach. Copyright © 2018 by the Genetics Society of America.
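
    The failure mode and a generic remedy are easy to demonstrate. The sketch below (not the authors' multiple-imputation-style estimator, which is more refined) builds an AR(1)-like LD structure, estimates it from a small panel, and shows that even simple shrinkage toward the identity reduces the estimation error that drives false positives.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def shrunk_ld(genotypes, lam=0.1):
        """Sample LD (correlation) matrix shrunk toward the identity. A generic
        stabiliser targeting the same small-panel noise as the paper's estimator."""
        r = np.corrcoef(genotypes, rowvar=False)
        return (1.0 - lam) * r + lam * np.eye(r.shape[0])

    m = 50                                             # SNPs in one gene region
    dist = np.abs(np.subtract.outer(np.arange(m), np.arange(m)))
    L = np.linalg.cholesky(0.5 ** dist)                # AR(1)-like true LD
    big = rng.standard_normal((20_000, m)) @ L.T       # "population" haplotypes
    small = big[:400]                                  # 1000G-sized reference panel

    truth = np.corrcoef(big, rowvar=False)
    err_raw = np.linalg.norm(np.corrcoef(small, rowvar=False) - truth)
    err_shrunk = np.linalg.norm(shrunk_ld(small) - truth)
    print(f"Frobenius error: raw panel {err_raw:.2f}, shrunk {err_shrunk:.2f}")
    ```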

  2. Codon Optimization Significantly Improves the Expression Level of α-Amylase Gene from Bacillus licheniformis in Pichia pastoris

    Directory of Open Access Journals (Sweden)

    Jian-Rong Wang

    2015-01-01

    α-Amylase, an important industrial enzyme, is widely used in the starch-processing, detergent and paper industries. To improve the expression efficiency of recombinant α-amylase from Bacillus licheniformis (B. licheniformis), the α-amylase gene was optimized according to the codon usage of Pichia pastoris (P. pastoris) and expressed in P. pastoris. In total, the codons encoding 305 amino acids were optimized, changing 328 nucleotides and increasing the G+C content from 47.6% to 49.2%. The recombinants were cultured in 96-deep-well microplates and screened by a new plate assay method. Compared with the wild-type gene, the optimized gene was expressed at a significantly higher level in P. pastoris after methanol induction for 168 h in 5- and 50-L bioreactors, with maximum activities of 8100 and 11,000 U/mL, 2.31- and 2.62-fold higher than those of the wild-type gene. The improved expression level makes the enzyme a good candidate for industrial α-amylase production.
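
    The mechanics of codon optimisation are simple to sketch: back-translate the protein using the host's preferred codon for each amino acid and check the resulting G+C content. The five-entry table below is an invented excerpt for illustration only, not an actual Pichia pastoris codon-usage table.

    ```python
    # Illustrative excerpt only: a real optimisation uses a full codon-usage
    # table for the host (here Pichia pastoris) covering all 20 amino acids.
    PREFERRED_CODON = {"M": "ATG", "K": "AAG", "L": "TTG", "S": "TCT", "A": "GCT"}

    def codon_optimize(protein: str) -> str:
        """Back-translate a protein sequence using host-preferred codons."""
        return "".join(PREFERRED_CODON[aa] for aa in protein)

    def gc_content(dna: str) -> float:
        return 100.0 * sum(dna.count(b) for b in "GC") / len(dna)

    seq = codon_optimize("MKLSA")
    print(seq, f"G+C = {gc_content(seq):.1f}%")
    ```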

  3. Spirulina Supplements Improved the Nutritional Status of Undernourished Children Quickly and Significantly: Experience from Kisantu, the Democratic Republic of the Congo

    Directory of Open Access Journals (Sweden)

    Féfé Khuabi Matondo

    2016-01-01

    Aim. Despite high levels of malnutrition, there is still very little information on the nutritional benefits of Spirulina, a natural alga that provides essential amino acids, rare essential lipids, and numerous minerals and vitamins, for undernourished children. Methods. We carried out a prospective study of 50 children aged between six and 60 months. The intervention group consisted of 16 children who received 10 g of Spirulina daily in addition to the local diet administered by the nutritional centre; the control group consisted of 34 children who received only the local diet. Both groups of children were assessed on day zero, day 15, and day 30. Results. After treatment, the weight-for-age Z scores and weight-for-height Z scores increased significantly in the intervention group. At day 15, there was a statistically significant difference in mean corpuscular volume, total proteins, and albumin (p<0.05) between the groups, in favour of the intervention group, and at day 30 this difference extended to all of the studied parameters (p<0.05). Conclusion. This study found that the nutritional status of undernourished children who received Spirulina supplements in addition to the local diet administered by the nutritional centre improved quickly and significantly.

  4. Improvement of Measurement and Evaluation of Regional Authorities Activity: Model and Statistical Approach

    Directory of Open Access Journals (Sweden)

    Petrova Elena Aleksandrovna

    2014-11-01

    The formation of a long-term social and economic development strategy is the basis for the effective functioning of executive authorities and for the assessment of their efficiency. Modern theories of assessing public administration productivity follow the process approach, under which the business processes of regional executive authorities should be formed according to strategic indicators of territorial development. This raises the problem of modeling the interrelation between indicators of regional social and economic development and quantitative indicators of the results of executive authorities' business processes. At the first stage of modeling, two main directions of strategic development are considered, namely the innovative and investment activity of regional economic systems. The paper therefore presents the results of modeling the interrelation between indicators of regional social and economic development and indicators of innovative and investment activity: the social and economic system of the region is represented in the space of the main indicators of territorial social and economic development and of innovative and investment activity. The analysis uses indicator values calculated for regions of the Russian Federation for 2000, 2005, 2008, 2010 and 2011. It reveals that strategic indicators of innovative and investment activity have the most significant impact on key indicators of social and economic development.

  5. Improved statistical signal detection in pharmacovigilance by combining multiple strength-of-evidence aspects in vigiRank.

    Science.gov (United States)

    Caster, Ola; Juhlin, Kristina; Watson, Sarah; Norén, G Niklas

    2014-08-01

    Detection of unknown risks with marketed medicines is key to securing the optimal care of individual patients and to reducing the societal burden from adverse drug reactions. Large collections of individual case reports remain the primary source of information and require effective analytics to guide clinical assessors towards likely drug safety signals. Disproportionality analysis is based solely on aggregate numbers of reports and naively disregards report quality and content, even though these features are the very foundation of the ensuing clinical assessment. Our objective was to develop and evaluate a data-driven screening algorithm for emerging drug safety signals that accounts for report quality and content. vigiRank is a predictive model for emerging safety signals, here implemented with shrinkage logistic regression to identify predictive variables and estimate their respective contributions. The variables considered for inclusion capture different aspects of strength of evidence, including quality and clinical content of individual reports, as well as trends in time and geographic spread. A reference set of 264 positive controls (historical safety signals from 2003 to 2007) and 5,280 negative controls (pairs of drugs and adverse events not listed in the Summary of Product Characteristics of that drug in 2012) was used for model fitting and evaluation; the latter used fivefold cross-validation to protect against over-fitting. All analyses were performed on a reconstructed version of VigiBase® as of 31 December 2004, around which time most safety signals in our reference set were emerging. The following aspects of strength of evidence were selected for inclusion in vigiRank: the numbers of informative and recent reports, respectively; disproportional reporting; the number of reports with free-text descriptions of the case; and the geographic spread of reporting. vigiRank offered a statistically significant improvement in area under the receiver operating characteristic curve
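
    A minimal sketch of the modelling step, with invented data: a handful of strength-of-evidence features for (drug, adverse event) pairs are fed to a penalised (shrinkage) logistic regression and evaluated by cross-validated AUC, mirroring the fit/evaluate protocol described above. Feature names and effect sizes are assumptions, not VigiBase quantities.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)

    # Invented stand-ins for strength-of-evidence features of (drug, event)
    # pairs: disproportionality, recent reports, informative reports,
    # free-text reports, geographic spread.
    n = 5000
    X = rng.standard_normal((n, 5))
    logit = -3.0 + 0.9 * X[:, 0] + 0.6 * X[:, 1] + 0.4 * X[:, 3]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    # Penalised (shrinkage) logistic regression scored by cross-validated AUC.
    model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print("cross-validated AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))
    ```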

  6. Complete Au@ZnO core-shell nanoparticles with enhanced plasmonic absorption enabling significantly improved photocatalysis

    Science.gov (United States)

    Sun, Yiqiang; Sun, Yugang; Zhang, Tao; Chen, Guozhu; Zhang, Fengshou; Liu, Dilong; Cai, Weiping; Li, Yue; Yang, Xianfeng; Li, Cuncheng

    2016-05-01

    Nanostructured ZnO exhibits high chemical stability and unique optical properties, representing a promising candidate among photocatalysts in the field of environmental remediation and solar energy conversion. However, ZnO only absorbs the UV light, which accounts for less than 5% of total solar irradiation, significantly limiting its applications. In this article, we report a facile and efficient approach to overcome the poor wettability between ZnO and Au by carefully modulating the surface charge density on Au nanoparticles (NPs), enabling rapid synthesis of Au@ZnO core-shell NPs at room temperature. The resulting Au@ZnO core-shell NPs exhibit a significantly enhanced plasmonic absorption in the visible range due to the Au NP cores. They also show a significantly improved photocatalytic performance in comparison with their single-component counterparts, i.e., the Au NPs and ZnO NPs. Moreover, the high catalytic activity of the as-synthesized Au@ZnO core-shell NPs can be maintained even after many cycles of photocatalytic reaction. Our results shed light on the fact that the Au@ZnO core-shell NPs represent a promising class of candidates for applications in plasmonics, surface-enhanced spectroscopy, light harvest devices, solar energy conversion, and degradation of organic pollutants.

  7. Statistical comparison of leaching behavior of incineration bottom ash using seawater and deionized water: Significant findings based on several leaching methods.

    Science.gov (United States)

    Yin, Ke; Dou, Xiaomin; Ren, Fei; Chan, Wei-Ping; Chang, Victor Wei-Chung

    2018-02-15

    Bottom ashes generated from municipal solid waste incineration have gained increasing popularity as alternative construction materials; however, they contain elevated levels of heavy metals, which poses a challenge to their unrestricted use. Different leaching methods have been developed to quantify the leaching potential of incineration bottom ash (IBA) and to guide its environmentally friendly application. Yet IBA applications are diverse and the in situ environment is always complex, which complicates regulation. In this study, leaching tests were conducted using batch and column leaching methods with seawater, as opposed to deionized water, to unveil the metal leaching potential of IBA subjected to a saline environment, which is commonly encountered when using IBA in land reclamation yet not well understood. Statistical analysis across the different leaching methods suggested disparate performance between seawater and deionized water, primarily ascribed to ionic strength. The impact of the leachant is metal-specific, depends on the leaching method, and is a function of the intrinsic characteristics of the bottom ash. Leaching performance was further compared from additional perspectives, e.g., leaching approach and liquid-to-solid ratio, indicating complex leaching behaviour dominated by the combined geochemistry. It is necessary to develop application-oriented leaching methods with corresponding leaching criteria to avoid discrepancies between different applications, e.g., terrestrial applications vs. land reclamation. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    Science.gov (United States)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and to advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of sea ice are now available at meter-scale resolution or better, providing new details on the properties and morphology of the ice pack across basin scales. For example, the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite, including laser and radar altimeters and digital cameras, that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent

  9. Combat-related intradural gunshot wound to the thoracic spine: significant improvement and neurologic recovery following bullet removal.

    Science.gov (United States)

    Louwes, Thijs M; Ward, William H; Lee, Kendall H; Freedman, Brett A

    2015-02-01

    The vast majority of combat-related penetrating spinal injuries from gunshot wounds result in severe or complete neurological deficit. Treatment is based on neurological status, the presence of cerebrospinal fluid (CSF) fistulas, and local effects of any retained fragment(s). We present the case of a 46-year-old male who sustained a spinal gunshot injury from a 7.62-mm AK-47 round that became lodged within the subarachnoid space at T9-T10. He immediately suffered complete motor and sensory loss. By 24-48 hours post-injury, he had fully recovered lower extremity motor function but continued to have severe sensory loss (posterior cord syndrome). On post-injury day 2, he was evacuated from the combat theater and underwent a T9 laminectomy, extraction of the bullet, and repair of the dural laceration. At surgery, the traumatic durotomy was widened and the bullet, which was lying on the dorsal surface of the spinal cord, was removed. The dura was closed in a watertight fashion and fibrin glue was applied. Postoperatively, the patient made a significant but incomplete neurological recovery. His stocking-pattern numbness and sub-umbilical searing dysesthesia improved. The spinal canal was clear of the foreign body and he had no persistent CSF leak. Postoperative magnetic resonance imaging of the spine revealed contusion of the spinal cord at the T9 level. Early removal of a bullet retained within the spinal canal in the setting of an incomplete spinal cord injury can lead to significant neurological recovery, even following high-velocity and/or high-caliber gunshot wounds. However, this case does not speak to, and prior experience does not demonstrate, significant neurological benefit in the setting of a complete injury.

  10. Prostate health index (phi) and prostate cancer antigen 3 (PCA3) significantly improve diagnostic accuracy in patients undergoing prostate biopsy.

    Science.gov (United States)

    Perdonà, Sisto; Bruzzese, Dario; Ferro, Matteo; Autorino, Riccardo; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; Longo, Michele; Spinelli, Rosa; Di Lorenzo, Giuseppe; Oliva, Andrea; De Sio, Marco; Damiano, Rocco; Altieri, Vincenzo; Terracciano, Daniela

    2013-02-15

    Prostate health index (phi) and prostate cancer antigen 3 (PCA3) have recently been proposed as novel biomarkers for prostate cancer (PCa). We assessed the diagnostic performance of these biomarkers, alone or in combination, in men undergoing first prostate biopsy for suspicion of PCa. One hundred sixty male subjects were enrolled in this prospective observational study. PSA molecular forms, phi index (Beckman Coulter immunoassay), PCA3 score (Progensa PCA3 assay), and other established biomarkers (tPSA, fPSA, and %fPSA) were assessed before patients underwent an 18-core first prostate biopsy. The ability of the Beckman Coulter phi, the PCA3 score and the other biomarkers to discriminate between PCa-negative and PCa-positive biopsies was determined. One hundred sixty patients met the inclusion criteria. %p2PSA (p2PSA/fPSA × 100), phi and PCA3 were significantly higher in patients with PCa compared to the PCa-negative group (median values: 1.92 vs. 1.55, 49.97 vs. 36.84, and 50 vs. 32, respectively, P ≤ 0.001). ROC curve analysis showed that %p2PSA, phi, and PCA3 are good indicators of malignancy (AUCs = 0.68, 0.71, and 0.66, respectively). A multivariable logistic regression model consisting of both the phi index and the PCA3 score reached an overall diagnostic accuracy of 0.77. Decision curve analysis revealed that this "combined" marker achieved the highest net benefit over the examined range of threshold probabilities. phi and PCA3 showed no significant difference in the ability to predict PCa diagnosis in men undergoing first prostate biopsy. However, diagnostic performance is significantly improved by combining phi and PCA3. Copyright © 2012 Wiley Periodicals, Inc.
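
    The gain from combining markers can be sketched with simulated data: fit a logistic model on two correlated markers and compare single-marker AUCs with the combined model's AUC (the study reports 0.71 and 0.66 for phi and PCA3 alone versus 0.77 combined). All numbers below are invented, and the combined AUC is in-sample, so slightly optimistic compared with a properly validated model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(10)
    n = 160
    cancer = rng.random(n) < 0.4                   # biopsy outcome (toy prevalence)
    # Two weakly correlated markers, each shifted upward in cancer cases.
    phi = rng.normal(0, 1, n) + 0.8 * cancer
    pca3 = 0.3 * phi + rng.normal(0, 1, n) + 0.7 * cancer

    X = np.column_stack([phi, pca3])
    combined = LogisticRegression().fit(X, cancer).predict_proba(X)[:, 1]
    print("AUC phi     :", round(roc_auc_score(cancer, phi), 2))
    print("AUC PCA3    :", round(roc_auc_score(cancer, pca3), 2))
    print("AUC combined:", round(roc_auc_score(cancer, combined), 2))
    ```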

  11. The value of model averaging and dynamical climate model predictions for improving statistical seasonal streamflow forecasts over Australia

    Science.gov (United States)

    Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.

    2013-10-01

    Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improving seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of the different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors, to take advantage of the direct simulation of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
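
    A compact way to see the first strategy is pseudo-Bayesian model averaging: score each candidate model (one lagged climate index each) by its leave-one-out predictive density and weight the forecasts accordingly, rather than keeping only the single "best" model. Everything below is simulated, and the Bureau's operational scheme is a full Bayesian model averaging rather than this approximation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Toy setup: streamflow depends on catchment wetness plus one of several
    # lagged climate indices; each candidate model uses a different index.
    n = 60
    idx = rng.standard_normal((n, 4))
    wetness = rng.standard_normal(n)
    flow = 0.8 * wetness + 0.5 * idx[:, 1] + rng.normal(0.0, 0.6, n)

    def loo_log_density(X, y):
        """Leave-one-out Gaussian log predictive density of a linear model."""
        ll = 0.0
        for i in range(len(y)):
            m = np.ones(len(y), bool)
            m[i] = False
            beta, *_ = np.linalg.lstsq(X[m], y[m], rcond=None)
            s = (y[m] - X[m] @ beta).std(ddof=X.shape[1])
            ll += -0.5 * np.log(2 * np.pi * s**2) - (y[i] - X[i] @ beta) ** 2 / (2 * s**2)
        return ll

    designs = [np.column_stack([np.ones(n), wetness, idx[:, j]]) for j in range(4)]
    ll = np.array([loo_log_density(X, flow) for X in designs])
    w = np.exp(ll - ll.max())
    w /= w.sum()                          # pseudo-BMA weights over candidate models
    combined = sum(wj * (X @ np.linalg.lstsq(X, flow, rcond=None)[0])
                   for wj, X in zip(w, designs))
    print("model weights:", np.round(w, 3))
    ```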

  12. Improved score statistics for meta-analysis in single-variant and gene-level association studies.

    Science.gov (United States)

    Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo

    2018-06-01

    Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power-loss problem of the standard meta-analysis methods for unbalanced studies, and propose novel meta-analysis methods that perform equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics computed from combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power lost by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association tests, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.
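
    For reference, the standard meta-analysis of score statistics that the paper improves on combines per-study scores U_j and their variances V_j as z = sum(U_j) / sqrt(sum(V_j)); the sketch below implements it with toy numbers. The paper's contribution is a corrected version of these inputs that stays accurate when case-control ratios are unbalanced.

    ```python
    import numpy as np
    from scipy.stats import norm

    def meta_score_z(U, V):
        """Standard meta-analysis of score statistics: combine per-study scores
        U_j with variances V_j into z = sum(U) / sqrt(sum(V))."""
        U = np.asarray(U, float)
        V = np.asarray(V, float)
        z = U.sum() / np.sqrt(V.sum())
        return z, 2.0 * norm.sf(abs(z))       # two-sided p-value

    # Three studies contributing score statistics for one variant (toy numbers).
    z, p = meta_score_z(U=[4.1, -0.7, 2.9], V=[5.0, 1.2, 3.4])
    print(f"meta z = {z:.2f}, p = {p:.3g}")
    ```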

  13. An evaluation of the statistical significance of the association between northward turnings of the interplanetary magnetic field and substorm expansion onsets

    Science.gov (United States)

    Hsu, Tung-Shin; McPherron, R. L.

    2002-11-01

    An outstanding problem in magnetospheric physics is deciding whether substorms are always triggered by external changes in the interplanetary magnetic field (IMF) or solar wind plasma, or whether they sometimes occur spontaneously. Over the past decade, arguments have been made on both sides of this issue. In fact, there is considerable evidence that some substorms are triggered; however, equally persuasive examples of substorms with no obvious trigger have been found. Because of these conflicting views, further work is required to determine whether there is a physical relation between IMF triggers and substorm onset. In the work reported here, a list of substorm onsets was created using two independent substorm signatures: sudden changes in the slope of the AL index and the start of a Pi 2 pulsation burst. Possible IMF triggers were determined from ISEE-2 observations. With the ISEE spacecraft near local noon immediately upstream of the bow shock, there can be little question about the propagation delay to the magnetopause or whether a particular IMF feature hits the subsolar magnetopause. This eliminates the objections that the calculated arrival time is subject to large error or that the solar wind monitor missed a potential trigger incident at the subsolar point. Using a less familiar technique, the statistics of point processes, we find that the time delays between substorm onsets and the propagated arrival times of IMF triggers are clustered around zero. We estimate that, for independent processes, the probability of this clustering arising by chance alone is about 10^-11. If we take into account the requirement that the IMF must have been southward prior to the onset, the probability of clustering is higher, ~10^-5, but still extremely small. Thus it is not possible to ascribe the apparent relation between IMF northward turnings and substorm onset to coincidence.
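
    The chance-coincidence argument can be reproduced in miniature with a Monte Carlo version of the point-process calculation: simulate independent Poisson streams of onsets and triggers, count near-zero lags, and see how often clustering as strong as observed arises. All rates, windows and the observed count below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def chance_coincidences(rate_onset, rate_trig, T, window, n_sim=5000):
        """Distribution of the number of onsets whose nearest trigger lies within
        +/- window, when onsets and triggers are independent Poisson processes."""
        counts = np.empty(n_sim, dtype=int)
        for s in range(n_sim):
            onsets = np.sort(rng.uniform(0, T, rng.poisson(rate_onset * T)))
            trig = np.sort(rng.uniform(0, T, rng.poisson(rate_trig * T)))
            if trig.size == 0:
                counts[s] = 0
                continue
            j = np.clip(np.searchsorted(trig, onsets), 1, trig.size - 1)
            near = np.minimum(np.abs(trig[j] - onsets), np.abs(trig[j - 1] - onsets))
            counts[s] = int((near < window).sum())
        return counts

    # ~50 onsets and ~50 candidate triggers over 1000 h, 15-min coincidence window.
    counts = chance_coincidences(rate_onset=0.05, rate_trig=0.05, T=1000.0, window=0.25)
    print("mean by chance:", counts.mean(), " P(count >= 6):", (counts >= 6).mean())
    ```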

  14. Lipid Replacement Therapy Drink Containing a Glycophospholipid Formulation Rapidly and Significantly Reduces Fatigue While Improving Energy and Mental Clarity

    Directory of Open Access Journals (Sweden)

    Robert Settineri

    2011-08-01

    Background: Fatigue is the most common complaint of patients seeking general medical care and is often treated with stimulants. It is also important in various physical activities of relatively healthy men and women, such as sports performance. Recent clinical trials in patients with chronic fatigue have shown the benefit of Lipid Replacement Therapy in restoring mitochondrial electron transport function and reducing moderate to severe chronic fatigue. Methods: Lipid Replacement Therapy was administered for the first time as an all-natural functional food drink (60 ml) containing polyunsaturated glycophospholipids but devoid of stimulants or herbs. This preliminary study used the Piper Fatigue Survey instrument as well as a supplemental questionnaire to assess the effects of the glycophospholipid drink on fatigue and the acceptability of the test drink in adult men and women. A volunteer group of 29 subjects of mean age 56.2±4.5 years with various fatigue levels were randomly recruited in a clinical health fair setting to participate in an afternoon open-label trial of the test drink. Results: Using the Piper Fatigue instrument, overall fatigue among participants was reduced within the 3-hour seminar by a mean of 39.6% (p<0.0001). All of the subcategories of fatigue showed significant reductions. Some subjects responded within 15 minutes, and the majority responded within one hour with increased energy and activity and perceived improvements in cognitive function, mental clarity and focus. The test drink was determined to be quite acceptable in terms of taste and appearance. There were no adverse events from the energy drink during the study. (Functional Foods in Health and Disease 2011; 8:245-254) Conclusions: The Lipid Replacement Therapy functional food drink appeared to be a safe, acceptable and potentially useful new method to reduce fatigue, sustain energy and improve perceptions of mental function.

  15. A matter of timing: identifying significant multi-dose radiotherapy improvements by numerical simulation and genetic algorithm search.

    Directory of Open Access Journals (Sweden)

    Simon D Angus

    -effective means of significantly improving clinical efficacy.

  16. A matter of timing: identifying significant multi-dose radiotherapy improvements by numerical simulation and genetic algorithm search.

    Science.gov (United States)

    Angus, Simon D; Piotrowska, Monika Joanna

    2014-01-01

    of significantly improving clinical efficacy.

  17. Flavonol-rich dark cocoa significantly decreases plasma endothelin-1 and improves cognitive responses in urban children.

    Directory of Open Access Journals (Sweden)

    Lilian Calderon-Garciduenas

    2013-08-01

    Air pollution exposures are linked to systemic inflammation, cardiovascular and respiratory morbidity and mortality, neuroinflammation and neuropathology in young urbanites. In particular, most Mexico City Metropolitan Area (MCMA) children exhibit subtle cognitive deficits, and neuropathology studies show 40% of them exhibiting frontal tau hyperphosphorylation and 51% amyloid-β diffuse plaques (compared to 0% in low-pollution control children). We assessed whether a short cocoa intervention can be effective in decreasing plasma endothelin-1 (ET-1) and/or inflammatory mediators in MCMA children. Thirty grams of dark cocoa with 680 mg of total flavonols were given daily for 10.11 ± 3.4 days (range 9-24 days) to 18 children (mean age 10.55 years, SD = 1.45; 11F/7M). Key metabolite ratios in frontal white matter and in hippocampus before and during the cocoa intervention were quantified by magnetic resonance spectroscopy. ET-1 significantly decreased after cocoa treatment (p=0.0002). Fifteen children (83%) showed a marginally significant individual improvement in one or both of the applied simple short memory tasks. Endothelial dysfunction is a key feature of exposure to particulate matter, and decreased endothelin-1 bioavailability is likely useful for brain function in the context of air pollution. Our findings suggest that cocoa interventions may be critical for early implementation of neuroprotection in highly exposed urban children. Multi-domain nutraceutical interventions could limit the risk for endothelial dysfunction, cerebral hypoperfusion, neuroinflammation, cognitive deficits, structural volumetric detrimental brain effects, and the early development of the neuropathological hallmarks of Alzheimer’s and Parkinson’s diseases.

  18. Flexible deep-ultraviolet light-emitting diodes for significant improvement of quantum efficiencies by external bending

    KAUST Repository

    Shervin, Shahab; Oh, Seung Kyu; Park, Hyun Jung; Lee, Keon Hwa; Asadirad, Mojtaba; Kim, Seung Hwan; Kim, Jeomoh; Pouladi, Sara; Lee, Sung-Nam; Li, Xiaohang; Kwak, Joon-Seop; Ryou, Jae-Hyun

    2018-01-01

    Deep-ultraviolet (DUV) light in the wavelength range of 250-280 nm (UVC spectrum) is essential for numerous applications such as sterilization, purification, sensing, and communication. III-nitride-based DUV light-emitting diodes (DUV LEDs), like other solid-state lighting sources, offer great potential to replace conventional gas-discharge lamps, which have short lifetimes and contain toxic elements. However, unlike visible LEDs, DUV LEDs still suffer from low quantum efficiencies (QEs) and low optical output powers. In this work, we report a new route to improving the QEs of AlGaN-based DUV LEDs using the mechanical flexibility of recently developed bendable thin-film structures. Numerical studies show that the electronic band structures of AlGaN heterostructures, and the resulting optical and electrical characteristics of the devices, can be significantly modified by external bending through active control of piezoelectric polarization. Internal quantum efficiency (IQE) is enhanced more than threefold when the DUV LEDs are moderately bent to induce in-plane compressive strain in the heterostructure. Furthermore, efficiency droop at high injection currents is mitigated, and the turn-on voltage of the diodes decreases under the same bending condition. The concept of bendable DUV LEDs with a controlled external strain can provide a new path toward high-output-power, high-efficiency devices.

  20. Novel ventilation design of combining spacer and mesh structure in sports T-shirt significantly improves thermal comfort.

    Science.gov (United States)

    Sun, Chao; Au, Joe Sau-chuen; Fan, Jintu; Zheng, Rong

    2015-05-01

    This paper reports on a novel ventilation design for sports T-shirts, combining spacer and mesh structures, and provides experimental evidence of the design's advantages in improving thermal comfort. The evaporative resistance (Re) and thermal insulation (Rc) of the T-shirts were measured using a sweating thermal manikin under three different air velocities. The moisture permeability index (i(m)) was calculated to compare the differently designed T-shirts. The T-shirts of the new and conventional designs were also compared in wearer trials, which comprised 30 min of treadmill running followed by 10 min of rest. Skin temperature, skin relative humidity, heart rate, oxygen inhalation and energy expenditure were monitored, and subjective sensations were recorded. Results demonstrated that the novel T-shirt has a significantly lower i(m) than the control sample (by 11.1%) under windy conditions. The novel T-shirt reduces the variation in skin temperature and relative humidity by up to 37% and 32%, respectively, and decreases energy consumption during exercise by 3.3%. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Significant improvement of thermal stability of glucose 1-dehydrogenase by introducing disulfide bonds at the tetramer interface.

    Science.gov (United States)

    Ding, Haitao; Gao, Fen; Liu, Danfeng; Li, Zeli; Xu, Xiaohong; Wu, Min; Zhao, Yuhua

    2013-12-10

    Rational design was applied to glucose 1-dehydrogenase (LsGDH) from Lysinibacillus sphaericus G10 to improve its thermal stability by introducing disulfide bridges between subunits. One of the eleven mutants, designated DS255, displayed significantly enhanced thermal stability together with considerable soluble expression and high specific activity. It was extremely stable at pH values ranging from 4.5 to 10.5, retaining nearly 100% activity after incubation in different buffers for 1 h. Mutant DS255 also exhibited high thermostability, with a half-life of 9900 min at 50°C, 1868-fold that of the wild type. Moreover, both the increased free energy of denaturation and the decreased entropy of denaturation of DS255 suggest that the enzyme structure was stabilized by the engineered disulfide bonds. On account of its robust stability, mutant DS255 would be a competitive candidate for practical applications in chiral chemical synthesis, biofuel cells and glucose biosensors. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Imparting improvements in electrochemical sensors: evaluation of different carbon blacks that give rise to significant improvement in the performance of electroanalytical sensing platforms

    International Nuclear Information System (INIS)

    Vicentini, Fernando Campanhã; Ravanini, Amanda E.; Figueiredo-Filho, Luiz C.S.; Iniesta, Jesús; Banks, Craig E.; Fatibello-Filho, Orlando

    2015-01-01

    Three different carbon black materials were evaluated as potential modifiers; however, only one demonstrated an improvement in electrochemical properties. The carbon black structures were characterised with SEM, XPS and Raman spectroscopy and found to be very similar to those of amorphous graphitic materials. The modifications were constructed by three different strategies (using ultrapure water, chitosan and dihexadecylphosphate). The fabricated sensors were electrochemically characterised using N,N,N',N'-tetramethyl-para-phenylenediamine and both inner-sphere and outer-sphere redox probes, namely potassium ferrocyanide(II) and hexaammineruthenium(III) chloride, in addition to the biologically relevant and electroactive analytes dopamine (DA) and acetaminophen (AP). Comparisons are made with edge-plane pyrolytic graphite and glassy-carbon electrodes, the benefits of carbon black as a modifier for electrochemical sensors are explored, and the electroanalytical performance is characterised. We reveal significant improvements in electrochemical performance (excellent sensitivity and faster heterogeneous electron transfer (HET) rates) over that of bare glassy-carbon and edge-plane pyrolytic graphite electrodes, and thus suggest that there are substantial advantages to using carbon black as a modifier in the fabrication of electrochemistry-based sensors. Such work is highly important and informative for those working in the field of electroanalysis, where electrochemistry can provide portable, rapid, reliable and accurate sensing protocols (bringing the laboratory into the field), with particular relevance to those searching for new electrode materials.

  3. Clinical progress of human papillomavirus genotypes and their persistent infection in subjects with atypical squamous cells of undetermined significance cytology: Statistical and latent Dirichlet allocation analysis

    Science.gov (United States)

    Kim, Yee Suk; Lee, Sungin; Zong, Nansu; Kahng, Jimin

    2017-01-01

    The present study aimed to investigate differences in prognosis based on human papillomavirus (HPV) infection, persistent infection and genotype variations for patients exhibiting atypical squamous cells of undetermined significance (ASCUS) in their initial Papanicolaou (PAP) test results. A latent Dirichlet allocation (LDA)-based tool was developed that may offer a facilitated means of communication during patient-doctor consultations. The present study assessed 491 patients (139 HPV-positive and 352 HPV-negative cases) with a PAP test result of ASCUS and a follow-up period of ≥2 years. Patients underwent PAP and HPV DNA chip tests between January 2006 and January 2009. The HPV-positive subjects were followed up with at least 2 instances of PAP and HPV DNA chip tests. The most common genotypes observed were HPV-16 (25.9%, 36/139), HPV-52 (14.4%, 20/139), HPV-58 (13.7%, 19/139), HPV-56 (11.5%, 16/139), HPV-51 (9.4%, 13/139) and HPV-18 (8.6%, 12/139). A total of 33.3% (12/36) of patients positive for HPV-16 had cervical intraepithelial neoplasia (CIN) grade 2 or worse, significantly higher than the 1.8% (8/455) prevalence of CIN2 in patients negative for HPV-16 (P<…). Persistent infection was more frequent in patients aged ≥51 years (38.7%) than in those aged ≤50 years (20.4%; P=0.036). Progression from persistent infection to CIN2 or worse (19/34, 55.9%) was more frequent than clearance (0/105, 0.0%; P<…). The LDA analysis associated older age and a long infection period with clinical progression to CIN2 or worse. Therefore, LDA results may be presented as explanatory evidence during time-constrained patient-doctor consultations in order to deliver information regarding the patient's status. PMID:28587376
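
    As a pointer to the method, the sketch below runs scikit-learn's LatentDirichletAllocation on an invented patient-by-finding count matrix and reads off each patient's topic mixture; the study's actual features (genotypes and cytology results over follow-up) and topic count are not reproduced here.

    ```python
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(9)
    # Rows = patients, columns = coded findings accumulated over follow-up
    # (e.g. genotype hits, cytology grades); counts are purely illustrative.
    X = rng.poisson(1.0, size=(100, 12))
    lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)
    theta = lda.transform(X)          # per-patient topic proportions
    print(np.round(theta[:3], 2))     # e.g. progression- vs clearance-like profiles
    ```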

  4. Image quality improvements using adaptive statistical iterative reconstruction for evaluating chronic myocardial infarction using iodine density images with spectral CT.

    Science.gov (United States)

    Kishimoto, Junichi; Ohta, Yasutoshi; Kitao, Shinichiro; Watanabe, Tomomi; Ogawa, Toshihide

    2018-04-01

    Single-source dual-energy CT (ssDECT) allows the reconstruction of iodine density images (IDIs) from projection-based computing. We hypothesized that adding adaptive statistical iterative reconstruction (ASiR) could improve image quality. The aim of our study was to evaluate the effect, and determine the optimal blend percentage, of ASiR for IDIs of myocardial late iodine enhancement (LIE) in the evaluation of chronic myocardial infarction using ssDECT. A total of 28 patients underwent cardiac LIE using an ssDECT scanner. IDIs with ASiR contributions between 0 and 100% in 10% increments were reconstructed. The signal-to-noise ratio (SNR) of remote myocardium and the contrast-to-noise ratio (CNR) of infarcted myocardium were measured. The transmural extent of infarction was graded using a 5-point scale. The SNR, CNR, and transmural extent were assessed for each ASiR contribution ratio. The transmural extents were compared with MRI as a reference standard. Compared to 0% ASiR, the use of 20-100% ASiR resulted in a reduction of image noise (p<…). Among the reconstructions, 100% ASiR showed the largest improvement in SNR (229%; p<…). ASiR contributions above 80% showed the highest rate (73.7%) of accurate transmural extent classification. In conclusion, an ASiR intensity of 80-100% in IDIs can improve image quality without changes in signal and maximizes the accuracy of transmural extent assessment in infarcted myocardium.

  5. Autism according to diagnostic and statistical manual of mental disorders 5(th) edition: The need for further improvements.

    Science.gov (United States)

    Posar, Annio; Resca, Federica; Visconti, Paola

    2015-01-01

    The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) introduced significant changes in the classification of autism spectrum disorders (ASD), including the abolition of the diagnostic subcategories proposed by DSM-IV-Text Revision. DSM-5 describes three levels of increasing severity of ASD. The authors report two illustrative cases of ASD (verbal boys, aged about 7.5 years, without intellectual disability). According to DSM-5, both cases fall into the lowest severity level of ASD. However, their neuropsychological and neurobehavioral profiles differ significantly: while the first boy showed a prevalent impairment of visuoconstructional and visuoperceptual abilities, the second presented a predominant involvement of verbal functions, with qualitative impairments in communication. A further step forward in the definition and classification of ASD, taking into account both the intensity and the quality of symptoms, is recommended in order to formulate a reliable prognosis, plan individualized treatment and monitor the clinical course over time.

  6. Strategies of statistical windows in PET image reconstruction to improve the user’s real time experience

    Science.gov (United States)

    Moliner, L.; Correcher, C.; Gimenez-Alventosa, V.; Ilisie, V.; Alvarez, J.; Sanchez, S.; Rodríguez-Alvarez, M. J.

    2017-11-01

    Nowadays, the computational power of modern computers, together with state-of-the-art reconstruction algorithms, makes it possible to obtain Positron Emission Tomography (PET) images practically in real time. These facts open the door to new applications such as tracking radiopharmaceuticals inside the body or the use of PET for image-guided procedures, such as biopsy interventions, among others. This work is a proof of concept that aims to improve the user experience with real-time PET images. Fixed, incremental, overlapping, sliding and hybrid windows are the different statistical combinations of data blocks used to generate intermediate images in order to follow the path of the activity in the Field Of View (FOV). To evaluate these different combinations, a point source was placed in a dedicated breast PET device and moved along the FOV. These acquisitions were reconstructed according to the different statistical windows, with the sliding and hybrid windows resulting in a smoother transition of positions between successive image reconstructions.
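
    One plausible reading of the five strategies, expressed as the sets of data-block indices that feed each intermediate reconstruction, is sketched below; the paper's exact definitions may differ.

    ```python
    def window_blocks(n_blocks, size=4, step=2):
        """Block-index sets feeding each intermediate image under the different
        statistical-window strategies (illustrative definitions)."""
        return {
            "fixed":       [list(range(i, i + size)) for i in range(0, n_blocks - size + 1, size)],
            "incremental": [list(range(0, i + 1)) for i in range(n_blocks)],
            "overlapping": [list(range(i, i + size)) for i in range(0, n_blocks - size + 1, step)],
            "sliding":     [list(range(i, i + size)) for i in range(n_blocks - size + 1)],
            # hybrid: grow like "incremental" until a full window exists, then slide
            "hybrid":      [list(range(max(0, i - size + 1), i + 1)) for i in range(n_blocks)],
        }

    for name, blocks in window_blocks(8).items():
        print(f"{name:12s}", blocks[:3], "...")
    ```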

  7. Statistically optimal estimation of Greenland Ice Sheet mass variations from GRACE monthly solutions using an improved mascon approach

    Science.gov (United States)

    Ran, J.; Ditmar, P.; Klees, R.; Farahani, H. H.

    2018-03-01

    We present an improved mascon approach to transform monthly spherical harmonic solutions based on GRACE satellite data into mass anomaly estimates in Greenland. The GRACE-based spherical harmonic coefficients are used to synthesize gravity anomalies at satellite altitude, which are then inverted into mass anomalies per mascon. The limited spectral content of the gravity anomalies is properly accounted for by applying a low-pass filter as part of the inversion procedure to make the functional model spectrally consistent with the data. The full error covariance matrices of the monthly GRACE solutions are properly propagated using the law of covariance propagation. Using numerical experiments, we demonstrate the importance of a proper data weighting and of the spectral consistency between functional model and data. The developed methodology is applied to process real GRACE level-2 data (CSR RL05). The obtained mass anomaly estimates are integrated over five drainage systems, as well as over entire Greenland. We find that the statistically optimal data weighting reduces random noise by 35-69%, depending on the drainage system. The obtained mass anomaly time-series are de-trended to eliminate the contribution of ice discharge and are compared with de-trended surface mass balance (SMB) time-series computed with the Regional Atmospheric Climate Model (RACMO 2.3). We show that when using a statistically optimal data weighting in GRACE data processing, the discrepancies between GRACE-based estimates of SMB and modelled SMB are reduced by 24-47%.
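
    The "statistically optimal data weighting" described above is, at its core, generalised least squares with the full data covariance, plus propagation of that covariance into the solution. A self-contained toy version (random design matrix and an assumed exponential error covariance, not GRACE data) is:

    ```python
    import numpy as np

    def gls_mascons(A, y, C):
        """Generalised least-squares inversion of gravity anomalies y into mascon
        mass anomalies x, propagating the full error covariance C of the data."""
        Ci = np.linalg.inv(C)
        N = A.T @ Ci @ A                       # normal matrix
        x = np.linalg.solve(N, A.T @ Ci @ y)
        return x, np.linalg.inv(N)             # solution and its covariance

    rng = np.random.default_rng(7)
    n_obs, n_masc = 200, 5
    A = rng.standard_normal((n_obs, n_masc))           # toy design matrix
    d = np.abs(np.subtract.outer(np.arange(n_obs), np.arange(n_obs)))
    C = 0.5 * np.eye(n_obs) + 0.5 * np.exp(-d / 10.0)  # correlated noise model
    x_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = A @ x_true + np.linalg.cholesky(C) @ rng.standard_normal(n_obs)

    x_hat, cov = gls_mascons(A, y, C)
    print("estimates:", np.round(x_hat, 2))
    print("1-sigma  :", np.round(np.sqrt(np.diag(cov)), 2))
    ```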

  8. An Estimation of the Likelihood of Significant Eruptions During 2000-2009 Using Poisson Statistics on Two-Point Moving Averages of the Volcanic Time Series

    Science.gov (United States)

    Wilson, Robert M.

    2001-01-01

    Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
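
    The quoted probabilities follow directly from the Poisson model P(N >= 1) = 1 - exp(-lambda). In the sketch below the decadal rates lambda are back-solved to reproduce the abstract's numbers, not taken from the underlying analysis.

    ```python
    from math import exp

    # lambda = expected eruptions per decade; values chosen to reproduce the
    # quoted probabilities, not taken from the paper itself.
    for vei, lam in [(">=4", 7.0), (">=5", 0.67), (">=6", 0.20)]:
        print(f"VEI {vei}: lambda = {lam:.2f}/decade -> "
              f"P(at least one in 2000-2009) = {1.0 - exp(-lam):.2f}")
    ```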

  9. Cryptosporidium and other intestinal parasitic infections among HIV patients in southern Ethiopia: significance of improved HIV-related care.

    Science.gov (United States)

    Shimelis, Techalew; Tassachew, Yayehyirad; Lambiyo, Tariku

    2016-05-10

    Intestinal parasitic infections are known to cause gastroenteritis, leading to higher morbidity and mortality, particularly in people living with HIV/AIDS. This study aimed to determine the prevalence of Cryptosporidium and other intestinal parasitic infections among HIV patients receiving care at a hospital in Ethiopia, where previously available baseline data help assess whether improved HIV-related care has reduced infection rates. A cross-sectional study was conducted at Hawassa University Hospital in southern Ethiopia from May 2013 to March 2014. A consecutive sample of 491 HIV-infected patients with diarrhea or a CD4 T cell count <200 cells/μl was examined for intestinal parasites. The study was approved by the Institutional Review Board of the College of Medicine and Health Sciences, Hawassa University. Physicians managed participants found to be infected with any pathogenic intestinal parasite. The overall prevalence of intestinal parasitic infections among the study population was 35.8%. The most prevalent parasites were Cryptosporidium (13.2%), followed by Entamoeba histolytica/dispar (10.2%) and Giardia lamblia (7.9%). The rates of single and multiple infections were 25.5 and 10.3%, respectively. Patients with a CD4 T cell count <200 cells/μl had higher rates of intestinal parasitic infection and cryptosporidiosis than those with counts ≥200 cells/μl but with some type of diarrhea. The study shows a high prevalence of intestinal parasitic infections in the study population. However, the rates in the current report are significantly lower than previous findings from the same hospital. The observed lower infection rate is encouraging and supports the need to strengthen and sustain the existing intervention measures in order to further reduce intestinal parasitic infections in people living with HIV/AIDS.

  10. Extended-release niacin/laropiprant significantly improves lipid levels in type 2 diabetes mellitus irrespective of baseline glycemic control

    Directory of Open Access Journals (Sweden)

    Bays HE

    2015-02-01

    Harold E Bays,1 Eliot A Brinton,2 Joseph Triscari,3 Erluo Chen,3 Darbie Maccubbin,3 Alexandra A MacLean,3 Kendra L Gibson,3 Rae Ann Ruck,3 Amy O Johnson-Levonas,3 Edward A O’Neill,3 Yale B Mitchel3 1Louisville Metabolic & Atherosclerosis Research Center (L-MARC), Louisville, KY, USA; 2Utah Foundation for Biomedical Research, Salt Lake City, UT, USA; 3Merck & Co, Inc., Whitehouse Station, NJ, USA. Background: The degree of glycemic control in patients with type 2 diabetes mellitus (T2DM) may alter lipid levels and the efficacy of lipid-modifying agents. Objective: To evaluate the lipid-modifying efficacy of extended-release niacin/laropiprant (ERN/LRPT) in subgroups of patients with T2DM with better or poorer glycemic control. Methods: Post hoc analysis of clinical trial data from patients with T2DM who were randomized 4:3 to double-blind ERN/LRPT or placebo (n=796), examining the lipid-modifying effects of ERN/LRPT in patients with glycosylated hemoglobin or fasting plasma glucose levels above and below median baseline levels. Results: At Week 12 of treatment, ERN/LRPT significantly improved low-density lipoprotein cholesterol, high-density lipoprotein cholesterol (HDL-C), non-high-density lipoprotein cholesterol, triglycerides, and lipoprotein(a) compared with placebo, with equal efficacy in patients above or below median baseline glycemic control. Compared with placebo, over 36 weeks of treatment more patients treated with ERN/LRPT had worsening of their diabetes and required intensification of antihyperglycemic medication, irrespective of baseline glycemic control. Incidences of other adverse experiences were generally low in all treatment groups. Conclusion: The lipid-modifying effects of ERN/LRPT are independent of the degree of baseline glycemic control in patients with T2DM (NCT00485758). Keywords: lipid-modifying agents, hyperglycemia, LDL, HDL, triglycerides

  11. How Good Is Good: Improved Tracking and Managing of Safety Goals, Performance Indicators, Production Targets and Significant Events Using Learning Curves

    International Nuclear Information System (INIS)

    Duffey, Romney B.; Saull, John W.

    2002-01-01

    We show a new way to track and measure safety and performance using learning curves derived on a mathematical basis. When unusual or abnormal events occur in plants and equipment, the regulator and good management practice require that they be reported, investigated, understood and rectified. In addition to reporting so-called 'significant events', both management and the regulator often set targets for individual and collective performance, which are used for both reward and criticism. For almost completely safe systems, like nuclear power plants, commercial aircraft and chemical facilities, many parameters are tracked and measured. Continuous improvement has to be demonstrated, as well as meeting reduced occurrence rates, which are set as management goals or targets. This process usually takes the form of statistics for availability of plant and equipment, forced or unplanned maintenance outage, loss of safety function, safety or procedural violations, etc. These are often rolled up into a set of so-called 'Performance Indicators' as measures of how well safety and operation are being managed at a given facility. The overall operating standards of an industry are also measured. A whole discipline is formed of tracking, measuring, reporting, managing and understanding the plethora of indicators and data. Decreasing occurrence rates and meeting or exceeding goals are seen and rewarded as virtues. Managers and operators need to know how good the safety management system they have adopted, used and paid for actually is, and whether it can itself be improved. We show the importance of accumulated experience in correctly measuring and tracking the decreasing event and error rates, and we speculate that there is a finite minimum rate. We show that the rate of improvement constitutes a measurable 'learning curve', and that the attainment of goals and targets can be affected by the adopted measures. We examine some of the available data on significant events, reportable occurrences, and loss of
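
    The learning-curve idea above lends itself to a simple curve fit: the event rate decays with accumulated experience toward a finite minimum rate. Below is a minimal sketch of such a fit; the data, parameter names, and the exponential form are illustrative assumptions, not the authors' exact model.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical data: accumulated experience (e.g., unit-years) and
      # observed event rates (events per unit-year) over successive periods.
      experience = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
      event_rate = np.array([5.2, 3.9, 2.7, 1.6, 1.1, 0.9])

      def learning_curve(x, rate_min, rate0, k):
          # Exponential decay toward a finite minimum achievable rate.
          return rate_min + (rate0 - rate_min) * np.exp(-k * x)

      params, _ = curve_fit(learning_curve, experience, event_rate,
                            p0=(0.5, 5.0, 0.1))
      rate_min, rate0, k = params
      print(f"estimated minimum achievable rate: {rate_min:.2f} events/unit-year")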

  12. Improving recognition of late life anxiety disorders in Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition: observations and recommendations of the Advisory Committee to the Lifespan Disorders Work Group

    NARCIS (Netherlands)

    Mohlman, J.; Bryant, C.; Lenze, E.J.; Stanley, M.A.; Gum, A.; Flint, A.; Beekman, A.T.F.; Wetherell, J.L.; Thorp, S.R.; Craske, MG

    2012-01-01

    Background Recognition of the significance of anxiety disorders in older adults is growing. The revision of the Diagnostic and Statistical Manual of Mental Disorders (DSM) provides a timely opportunity to consider potential improvements to diagnostic criteria for psychiatric disorders for use with

  13. Interventions that effectively target Anopheles funestus mosquitoes could significantly improve control of persistent malaria transmission in south-eastern Tanzania.

    Science.gov (United States)

    Kaindoa, Emmanuel W; Matowo, Nancy S; Ngowo, Halfan S; Mkandawile, Gustav; Mmbando, Arnold; Finda, Marcelina; Okumu, Fredros O

    2017-01-01

    An. arabiensis (44.1%). Though An. arabiensis is still the most abundant vector species here, the remaining malaria transmission is predominantly mediated by An. funestus, possibly due to high insecticide resistance and high survival probabilities. Interventions that effectively target An. funestus mosquitoes could therefore significantly improve control of persistent malaria transmission in south-eastern Tanzania.

  14. Interventions that effectively target Anopheles funestus mosquitoes could significantly improve control of persistent malaria transmission in south–eastern Tanzania

    Science.gov (United States)

    Matowo, Nancy S.; Ngowo, Halfan S.; Mkandawile, Gustav; Mmbando, Arnold; Finda, Marcelina; Okumu, Fredros O.

    2017-01-01

    An. arabiensis (44.1%). Though An. arabiensis is still the most abundant vector species here, the remaining malaria transmission is predominantly mediated by An. funestus, possibly due to high insecticide resistance and high survival probabilities. Interventions that effectively target An. funestus mosquitoes could therefore significantly improve control of persistent malaria transmission in south–eastern Tanzania. PMID:28542335

  15. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  16. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  17. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical...... replicates due to sample-scarcity or due to duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant...... as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved...
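
    As a rough illustration of the kind of significance testing this record discusses (and not the authors' published method or their R script), the sketch below runs a per-feature Welch t-test that omits missing values and applies a Benjamini-Hochberg correction; all data and thresholds are hypothetical.

      import numpy as np
      from scipy.stats import ttest_ind
      from statsmodels.stats.multitest import multipletests

      # Hypothetical log2 intensities: rows = features, columns = replicates,
      # with NaN marking missing values (common with few replicates).
      rng = np.random.default_rng(0)
      group_a = rng.normal(0.0, 1.0, size=(1000, 3))
      group_b = rng.normal(0.2, 1.0, size=(1000, 3))
      group_a[rng.random(group_a.shape) < 0.2] = np.nan  # ~20% missing

      # Per-feature Welch t-test, omitting missing values.
      _, p_val = ttest_ind(group_a, group_b, axis=1,
                           equal_var=False, nan_policy='omit')
      p_val = np.asarray(p_val)

      # Benjamini-Hochberg FDR control over the testable features.
      valid = np.isfinite(p_val)
      reject = np.zeros(p_val.shape, dtype=bool)
      reject[valid] = multipletests(p_val[valid], alpha=0.05,
                                    method='fdr_bh')[0]
      print(f"{reject.sum()} features significant at 5% FDR")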

  18. Significant survival improvement of patients with recurrent breast cancer in the periods 2001-2008 vs. 1992-2000

    Directory of Open Access Journals (Sweden)

    Nishimura Sumiko

    2011-03-01

    Full Text Available Abstract Background It is unclear whether individualized treatments based on biological factors have improved the prognosis of recurrent breast cancer. The purpose of this study is to evaluate the survival improvement of patients with recurrent breast cancer after the introduction of third-generation aromatase inhibitors (AIs) and trastuzumab. Methods A total of 407 patients who received a first diagnosis of recurrent breast cancer and treatment at National Kyushu Cancer Center between 1992 and 2008 were retrospectively evaluated. As AIs and trastuzumab were approved for clinical use in Japan in 2001, the patients were divided into two time cohorts depending on whether the cancer recurred before or after 2001. Cohort A: 170 patients who were diagnosed between 1992 and 2000. Cohort B: 237 patients who were diagnosed between 2001 and 2008. Tumor characteristics, treatments, and outcome were compared. Results Fourteen percent of cohort A and 76% of cohort B received AIs and/or trastuzumab (P […]). Conclusions The prognosis of patients with recurrent breast cancer improved over time following the introduction of AIs and trastuzumab, and the survival improvement was apparent in HR- and/or HER-2-positive tumors.

  19. Significant Improvement in Sleep in People with Intellectual Disabilities Living in Residential Settings by Non-Pharmaceutical Interventions

    Science.gov (United States)

    Hylkema, T.; Vlaskamp, C.

    2009-01-01

    Background: Although about 15 to 50 percent of people with intellectual disabilities (ID) living in residential settings suffer from sleep problems, scant attention is paid to these problems. Most available studies focus on pharmaceutical solutions. In this study we focus on improving sleep in people with intellectual disabilities living in…

  20. Improving projections of changes in southern African summer rainfall through comprehensive multi-timescale empirical statistical downscaling

    Science.gov (United States)

    Dieppois, B.; Pohl, B.; Eden, J.; Crétat, J.; Rouault, M.; Keenlyside, N.; New, M. G.

    2017-12-01

    The water management community has hitherto neglected or underestimated many of the uncertainties in climate impact scenarios, in particular uncertainties associated with decadal climate variability. Uncertainty in state-of-the-art global climate models (GCMs) is timescale-dependent, e.g. stronger at decadal than at interannual timescales, in response to the different parameterizations and to internal climate variability. In addition, non-stationarity in statistical downscaling is widely recognized as a key problem, in which the timescale dependency of predictors plays an important role. As with global climate modelling, therefore, the selection of downscaling methods must proceed with caution to avoid unintended consequences of over-correcting the noise in GCMs (e.g. interpreting internal climate variability as a model bias). GCM outputs from the Coupled Model Intercomparison Project 5 (CMIP5) have therefore first been selected based on their ability to reproduce southern African summer rainfall variability and its teleconnections with Pacific sea-surface temperature across the dominant timescales. In observations, southern African summer rainfall has recently been shown to exhibit significant periodicities at the interannual (2-8 years), quasi-decadal (8-13 years) and inter-decadal (15-28 years) timescales, which can be interpreted as the signature of ENSO, the IPO, and the PDO over the region. Most CMIP5 GCMs underestimate southern African summer rainfall variability and its teleconnections with Pacific SSTs at these three timescales. In addition, according to a more in-depth analysis of historical and pi-control runs, this bias might result from internal climate variability in some of the CMIP5 GCMs, suggesting potential for bias-corrected, prediction-based empirical statistical downscaling. A multi-timescale regression-based downscaling procedure, which determines the predictors across the different timescales, has thus been used to
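
    The timescale decomposition described above can be illustrated with a band-pass filter applied to annual series before correlating rainfall with an SST index at each timescale. The sketch below is a toy version with synthetic data; the filter design and indices are assumptions, not the authors' procedure.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def bandpass(series, period_lo, period_hi, fs=1.0):
          # Butterworth band-pass between two periods (years), annual data.
          nyq = 0.5 * fs
          low, high = 1.0 / period_hi, 1.0 / period_lo
          b, a = butter(3, [low / nyq, min(high / nyq, 0.99)], btype='band')
          return filtfilt(b, a, series)

      # Hypothetical annual indices (regional rainfall and a Pacific SST index).
      rng = np.random.default_rng(1)
      rain = rng.normal(size=120)
      sst = 0.5 * rain + rng.normal(size=120)

      # The three timescales reported for southern African summer rainfall.
      for name, (p_lo, p_hi) in {'interannual': (2, 8),
                                 'quasi-decadal': (8, 13),
                                 'inter-decadal': (15, 28)}.items():
          r = np.corrcoef(bandpass(rain, p_lo, p_hi),
                          bandpass(sst, p_lo, p_hi))[0, 1]
          print(f"{name:14s} band correlation: {r:+.2f}")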

  1. Brief Communication: Upper Air Relaxation in RACMO2 Significantly Improves Modelled Interannual Surface Mass Balance Variability in Antarctica

    Science.gov (United States)

    van de Berg, W. J.; Medley, B.

    2016-01-01

    The Regional Atmospheric Climate Model (RACMO2) has been a powerful tool for improving surface mass balance (SMB) estimates from GCMs or reanalyses. However, new yearly SMB observations for West Antarctica show that the modelled interannual variability in SMB is poorly simulated by RACMO2, in contrast to ERA-Interim, which resolves this variability well. In an attempt to remedy RACMO2 performance, we included additional upper-air relaxation (UAR) in RACMO2. With UAR, the correlation to observations is similar for RACMO2 and ERA-Interim. The spatial SMB patterns and ice-sheet-integrated SMB modelled using UAR remain very similar to the estimates of RACMO2 without UAR. We only observe an upstream smoothing of precipitation in regions with very steep topography like the Antarctic Peninsula. We conclude that UAR is a useful improvement for regional climate model simulations, although results in regions with steep topography should be treated with care.

  2. Improving winter leaf area index estimation in evergreen coniferous forests and its significance in carbon and water fluxes modeling

    Science.gov (United States)

    Wang, R.; Chen, J. M.; Luo, X.

    2016-12-01

    Modeling of carbon and water fluxes at the continental and global scales requires remotely sensed LAI as input. For evergreen coniferous forests (ENF), severely underestimated winter LAI has been one of the issues for most available remote sensing products, which can cause negative bias in the modeling of Gross Primary Productivity (GPP) and evapotranspiration (ET). Unlike deciduous trees, which shed all their leaves in winter, conifers retain part of their needles, and the proportion of retained needles depends on needle longevity. In this work, the Boreal Ecosystem Productivity Simulator (BEPS) was used to model GPP and ET at eight FLUXNET Canada ENF sites. Two sets of LAI were used as model inputs: the 250 m 10-day University of Toronto (U of T) LAI product Version 2, and a corrected LAI based on the U of T LAI product and the needle longevity of the corresponding tree species at individual sites. Validating modeled daily GPP (gC/m2) against site measurements, the mean RMSE over the eight sites decreases from 1.85 to 1.15, and the bias changes from -0.99 to -0.19. For daily ET (mm), the mean RMSE decreases from 0.63 to 0.33, and the bias changes from -0.31 to -0.16. Most of the improvements occur at the beginning and end of the growing season, when the LAI correction is large and temperature is still suitable for photosynthesis and transpiration. For the dormant season, the improvement in ET simulation mostly comes from the increased interception of precipitation brought by the elevated LAI during that time. The results indicate that model performance can be improved by applying the corrected LAI. Improving the winter RS LAI can make a large impact on the land surface carbon and energy budget.

  3. Improving Classification of Airborne Laser Scanning Echoes in the Forest-Tundra Ecotone Using Geostatistical and Statistical Measures

    Directory of Open Access Journals (Sweden)

    Nadja Stumberg

    2014-05-01

    Full Text Available The vegetation in the forest-tundra ecotone zone is expected to be highly affected by climate change and requires effective monitoring techniques. Airborne laser scanning (ALS) has been proposed as a tool for the detection of small pioneer trees over such vast areas using laser height and intensity data. The main objective of the present study was to assess a possible improvement in the performance of classifying tree and nontree laser echoes from high-density ALS data. The data were collected along a 1000 km long transect stretching from southern to northern Norway. Different geostatistical and statistical measures derived from laser height and intensity values were used to extend and potentially improve simpler models that ignore the spatial context. Generalised linear models (GLM) and support vector machines (SVM) were employed as classification methods. Total accuracies and Cohen’s kappa coefficients were calculated and compared to those of simpler models from a previous study. For both classification methods, all models revealed total accuracies similar to the results of the simpler models. Concerning classification performance, however, the comparison of the kappa coefficients indicated a significant improvement for some models using both GLM and SVM, with classification accuracies >94%.
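
    As a hedged sketch of the evaluation described above (synthetic features stand in for the laser height, intensity, and geostatistical measures), the snippet below trains a GLM (logistic regression) and an SVM and reports total accuracy and Cohen's kappa:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.svm import SVC
      from sklearn.metrics import accuracy_score, cohen_kappa_score
      from sklearn.model_selection import train_test_split

      # Synthetic echo features standing in for laser height, intensity,
      # and geostatistical neighbourhood measures; labels: tree vs nontree.
      rng = np.random.default_rng(2)
      X = rng.normal(size=(2000, 6))
      y = (X[:, 0] + 0.5 * X[:, 1]
           + rng.normal(scale=0.5, size=2000) > 0).astype(int)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      for name, clf in [('GLM (logistic)', LogisticRegression(max_iter=1000)),
                        ('SVM (RBF)', SVC())]:
          pred = clf.fit(X_tr, y_tr).predict(X_te)
          print(f"{name}: accuracy = {accuracy_score(y_te, pred):.3f}, "
                f"kappa = {cohen_kappa_score(y_te, pred):.3f}")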

  4. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
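
    The quasi-experimental comparison described above boils down to comparing outcome changes between the strategy class and the comparison class. A minimal sketch with hypothetical gain scores (the paper's actual instruments and analysis may differ):

      import numpy as np
      from scipy.stats import ttest_ind

      # Hypothetical anxiety gain scores (posttest minus pretest) for the
      # one-minute paper class and the textbook-exercise comparison class.
      rng = np.random.default_rng(3)
      one_minute = rng.normal(-8.0, 5.0, size=39)
      textbook = rng.normal(-2.0, 5.0, size=38)

      t, p = ttest_ind(one_minute, textbook, equal_var=False)
      print(f"Welch t = {t:.2f}, p = {p:.4f}")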

  5. The optimal monochromatic spectral computed tomographic imaging plus adaptive statistical iterative reconstruction algorithm can improve the superior mesenteric vessel image quality

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Xiao-Ping; Zuo, Zi-Wei; Xu, Ying-Jin; Wang, Jia-Ning [CT/MRI room, Affiliated Hospital of Hebei University, Baoding, Hebei, 071000 (China); Liu, Huai-Jun, E-mail: hebeiliu@outlook.com [Department of Medical Imaging, The Second Hospital of Hebei Medical University, Shijiazhuang, Hebei, 050000 (China); Liang, Guang-Lu [CT/MRI room, Affiliated Hospital of Hebei University, Baoding, Hebei, 071000 (China); Gao, Bu-Lang, E-mail: browngao@163.com [Department of Medical Research, Shijiazhuang First Hospital, Shijiazhuang, Hebei, 050011 (China)

    2017-04-15

    Objective: To investigate the effect of optimal monochromatic spectral computed tomography (CT) plus adaptive statistical iterative reconstruction on the image quality of the superior mesenteric artery and vein. Materials and methods: The gemstone spectral CT angiographic data of 25 patients were reconstructed in three groups: 70 keV, optimal monochromatic imaging, and optimal monochromatic imaging plus 40% iterative reconstruction. The CT value, image noise (IN), background CT value and noise, contrast-to-noise ratio (CNR), signal-to-noise ratio (SNR) and image scores of the vessels and surrounding tissues were analyzed. Results: In the 70 keV, optimal monochromatic, and optimal monochromatic plus 40% iterative reconstruction groups, the mean image quality scores were 3.86, 4.24 and 4.25 for the superior mesenteric artery and 3.46, 3.78 and 3.81 for the superior mesenteric vein, respectively. The image quality scores for the optimal monochromatic and the optimal monochromatic plus 40% iterative reconstruction groups were significantly greater than for the 70 keV group (P < 0.05). The vascular CT value, image noise, background noise, CNR and SNR were significantly (P < 0.001) greater in the optimal monochromatic and the optimal monochromatic plus 40% iterative reconstruction groups than in the 70 keV group. The optimal monochromatic plus 40% iterative reconstruction group had significantly (P < 0.05) lower image and background noise but higher CNR and SNR than the other two groups. Conclusion: Optimal monochromatic imaging combined with 40% iterative reconstruction using a low contrast agent dosage and low injection rate can significantly improve the image quality of the superior mesenteric artery and vein.
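
    The CNR and SNR figures of merit used above have standard region-of-interest definitions. A minimal sketch with hypothetical ROI statistics (the paper's exact ROI placement and noise definition may differ):

      def snr(roi_mean, roi_noise):
          # Signal-to-noise ratio of a vessel region of interest.
          return roi_mean / roi_noise

      def cnr(roi_mean, bg_mean, bg_noise):
          # Contrast-to-noise ratio of vessel versus background tissue.
          return (roi_mean - bg_mean) / bg_noise

      # Hypothetical ROI statistics (Hounsfield units) for the artery.
      vessel_hu, vessel_sd = 320.0, 12.0
      background_hu, background_sd = 65.0, 10.0
      print(f"SNR = {snr(vessel_hu, vessel_sd):.1f}")
      print(f"CNR = {cnr(vessel_hu, background_hu, background_sd):.1f}")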

  6. Migration ability and Toll-like receptor expression of human mesenchymal stem cells improves significantly after three-dimensional culture.

    Science.gov (United States)

    Zhou, Panpan; Liu, Zilin; Li, Xue; Zhang, Bing; Wang, Xiaoyuan; Lan, Jing; Shi, Qing; Li, Dong; Ju, Xiuli

    2017-09-16

    While the conventional two-dimensional (2D) culture protocol is well accepted for the culture of mesenchymal stem cells (MSCs), this method fails to recapitulate the in vivo native three-dimensional (3D) cellular microenvironment, and may result in phenotypic changes, and homing and migration capacity impairments. MSC preparation in 3D culture systems has been considered an attractive preparatory and delivery method recently. We seeded human umbilical cord-derived MSCs (hUCMSCs) in a 3D culture system with porcine acellular dermal matrix (PADM), and investigated the phenotypic changes, the expression changes of some important receptors, including Toll-like receptors (TLRs) and C-X-C chemokine receptor type 4 (CXCR4) when hUCMSCs were transferred from 2D to 3D systems, as well as the alterations in in vivo homing and migration potential. It was found that the percentage of CD105-positive cells decreased significantly, whereas that of CD34- and CD271-positive cells increased significantly in 3D culture, compared to that in 2D culture. The mRNA and protein expression levels of TLR2, TLR3, TLR4, TLR6, and CXCR4 in hUCMSCs were increased significantly upon culturing with PADM for 3 days, compared to the levels in 2D culture. The numbers of migratory 3D hUCMSCs in the heart, liver, spleen, and bone marrow were significantly greater than the numbers of 2D hUCMSCs, and the worst migration occurred in 3D + AMD3100 (CXCR4 antagonist) hUCMSCs. These results suggested that 3D culture of hUCMSCs with PADM could alter the phenotypic characteristics of hUCMSCs, increase their TLR and CXCR4 expression levels, and promote their migratory and homing capacity in which CXCR4 plays an important role. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. The patient's safety - For a dynamics of improvement. Nr 3. How to analyze your significant radiation protection events?

    International Nuclear Information System (INIS)

    2012-07-01

    The objective of this publication is to present the event analysis methods most frequently used by radiotherapy departments. After giving key figures on radiotherapy patients, sessions and events, the document indicates the objectives and steps of an event analysis. It presents various analysis methods: the Ishikawa diagram (or 5M method, a cause-and-effect diagram), the cause tree, the ALARM method (Association of Litigation And Risk Management), and the ORION method. It proposes a comparison of these methods, their possibilities and their limits. Good practices are outlined in terms of data acquisition, method choice, event analysis, and improvement actions. The use of cause tree analysis is commented on by members of the Limoges hospital radiotherapy department, and that of the Ishikawa method by a member of the Beauvais hospital

  8. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    Science.gov (United States)

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: they inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results, or more extreme results, if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
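
    The PPV and NPV referred to above follow directly from Bayes' rule given the significance level, the power, and the prior probability R that a real effect exists. A small worked sketch (the numbers are illustrative):

      def ppv(alpha, power, prior):
          # P(effect is real | significant result), by Bayes' rule.
          return power * prior / (power * prior + alpha * (1 - prior))

      def npv(alpha, power, prior):
          # P(no effect | non-significant result).
          beta = 1 - power
          return ((1 - alpha) * (1 - prior)
                  / ((1 - alpha) * (1 - prior) + beta * prior))

      # Illustrative numbers: conventional alpha, 80% power, 10% prior.
      print(f"PPV = {ppv(0.05, 0.8, 0.1):.2f}")  # ~0.64
      print(f"NPV = {npv(0.05, 0.8, 0.1):.2f}")  # ~0.98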

  9. Different surgical strategies for chronic pancreatitis significantly improve long-term outcome: a comparative single center study

    Directory of Open Access Journals (Sweden)

    Hildebrand P

    2010-08-01

    Full Text Available Abstract Objective In general, chronic pancreatitis (CP) primarily requires conservative treatment. The chronic pain syndrome and complications make patients seek surgical advice, frequently after years of progression. In the past, surgical procedures involving drainage as well as resection have been employed successfully. The present study compared the different surgical strategies. Patients and Methods From March 2000 until April 2005, a total of 51 patients underwent surgical treatment for CP at the Department of Surgery, University of Schleswig-Holstein, Campus Lübeck. Of those 51 patients, 39 (76.5%) were operated on according to the Frey procedure, and in 12 cases (23.5%) the Whipple procedure was performed. Patient data were documented prospectively throughout the duration of the hospital stay. The evaluation of the postoperative pain score was carried out retrospectively with a validated questionnaire. Results Average operating time was 240 minutes for the Frey group and 411 minutes for the Whipple group. The median number of blood transfusions was 1 in the Frey group and 4.5 in the Whipple group. Overall morbidity was 21% in the Frey group and 42% in the Whipple group. 30-day mortality was zero for all patients. During the median follow-up period of 50 months, an improvement in pain score was observed in 93% of the patients in the Frey group and 67% of the patients treated according to the Whipple procedure. Conclusion The results show that both the Frey procedure and partial pancreaticoduodenectomy are capable of improving chronic pain symptoms in CP. As far as later endocrine and exocrine pancreatic insufficiency is concerned, however, the extended drainage operation according to Frey proves to be advantageous compared to the traditional resection procedure by Whipple. Accordingly, the Frey procedure provides us with an organ-preserving surgical procedure which treats the complications of CP sufficiently, thus being an

  10. Significant improvement of bone mineral density by denosumab treatment in Japanese osteoporotic patients following breast cancer treatment

    Directory of Open Access Journals (Sweden)

    Nakamura Y

    2018-03-01

    Full Text Available Yukio Nakamura,1,2 Mikio Kamimura,3 Akio Morikawa,4 Akira Taguchi,5 Takako Suzuki,1 Hiroyuki Kato1 1Department of Orthopaedic Surgery, Shinshu University School of Medicine, Matsumoto, 2Department of Orthopedic Surgery, Showa-Inan General Hospital, Komagane, 3Center for Osteoporosis and Spinal Disorders, Kamimura Orthopaedic Clinic, Matsumoto, 4Department of Surgery, Showa-Inan General Hospital, Komagane, 5Department of Oral and Maxillofacial Radiology, School of Dentistry, Matsumoto Dental University, Shiojiri, Japan Background: The aim of this study was to evaluate the effects of denosumab in patients with osteoporosis (OP) and non-metastatic breast cancer following treatment of (1) surgery, (2) surgery and aromatase inhibitors, and (3) surgery, aromatase inhibitors, and anti-cancer agents, compared with those in primary OP patients. Patients and methods: In this retrospective 24-month study, patients were divided into the primary OP group (34 cases) or the OP receiving breast cancer treatment group (breast cancer group; 17 cases). We measured serum calcium, whole parathyroid hormone (PTH), 1,25(OH)2D3, bone alkaline phosphatase (BAP), tartrate-resistant acid phosphatase-5b (TRACP-5b), and bone mineral density (BMD) of the lumbar 1–4 vertebrae (L-BMD) and bilateral total hips (H-BMD) for 24 months. Results: The percent changes of serum calcium in the breast cancer group were significantly lower than those in the primary OP group at 1 week, 1 and 12 months. The percent changes of whole PTH in the primary OP group were significantly lower than those in the breast cancer group at 2 and 4 months. Significant differences were found between the groups at 18 months (-34.5% in the primary OP group and -52.6% in the breast cancer group, respectively) for the percent changes of BAP. Significant differences were found between the groups at 12, 18, and 24 months (-39.7% in the primary OP group and -64.0% in the breast cancer group at 24 months, respectively) for the percent

  11. Significant Improvement of Selected Mediators of Inflammation in Phenotypes of Women with PCOS after Reduction and Low GI Diet

    Directory of Open Access Journals (Sweden)

    Małgorzata Szczuko

    2017-01-01

    Full Text Available Many researchers suggest an increased risk of atherosclerosis in women with polycystic ovary syndrome. In the available literature, there are no studies on the mediators of inflammation in women with PCOS, especially after dietary intervention. Eicosanoids (HETE and HODE) were compared between the biochemical phenotypes of women with PCOS (normal and high androgens) and after the 3-month reduction diet. Eicosanoid profiles (9(S)-HODE, 13(S)-HODE, 5(S)-HETE, 12(S)-HETE, 15(S)-HETE, 5(S)-oxoETE, 16(R)-HETE, 16(S)-HETE and 5(S),6(R)-lipoxin A4, 5(S),6(R),15(R)-lipoxin A4) were extracted from 0.5 ml of plasma using solid-phase extraction RP-18 SPE columns. The HPLC separations were performed on a 1260 liquid chromatograph. No significant differences were found in the concentrations of the analysed eicosanoids between the phenotypes of women with PCOS. These women, however, have significantly lower concentrations of inflammatory mediators than potentially healthy women from the control group. Dietary intervention leads to a significant (p<0.01) increase in the synthesis of proinflammatory mediators, reaching levels similar to those in the control group. The development of inflammatory reactions in both phenotypes of women with PCOS is similar. The pathways for synthesis of proinflammatory mediators in women with PCOS are dormant, but can be stimulated through a reduction diet. A three-month period of lifestyle change may be too short to stimulate the pathways inhibiting the inflammatory process.

  12. Drilling for improvement : Statoil and Halliburton report significant cost savings and more accurate well placement at the Leismer demonstration project

    Energy Technology Data Exchange (ETDEWEB)

    Jaremko, D.

    2010-07-15

    This article discussed new improvements in steam assisted gravity drainage (SAGD) made by Statoil and Halliburton at the Leismer demonstration project. The Leismer project is Statoil's inaugural project in oil sands development, and will have a capacity to produce 10,000 barrels per day through 4 separate well pads with 23 well pairs. Challenges to the project included the long lateral sections required for the well pairs to remain parallel to each other while staying within the target formation. An azimuthal deep resistivity (ADR) tool was used to detect the proximity of the wellbore to shale and water zones. Use of the tool allowed operators to modify the planned well trajectory in order to optimize placements within the reservoir. A rotary steerable system (RSS) was used to increase injection times. The project was completed 6 to 8 weeks ahead of schedule. Applications have now been filed for a further 10 phases that will produce 240,000 barrels per day. 1 fig.

  13. Icariin combined with human umbilical cord mesenchymal stem cells significantly improve the impaired kidney function in chronic renal failure.

    Science.gov (United States)

    Li, Wen; Wang, Li; Chu, Xiaoqian; Cui, Huantian; Bian, Yuhong

    2017-04-01

    At present, the main therapies for chronic renal failure (CRF) are dialysis and renal transplantation, but neither obtains satisfactory results. Human umbilical cord mesenchymal stem cells (huMSCs) are isolated from the fetal umbilical cord, which has high self-renewal and multi-directional differentiation potential. Icariin (ICA), a kidney-tonifying Chinese medicine, can enhance the multipotency of huMSCs. Therefore, this work explored the use of ICA-treated huMSCs for the treatment of chronic renal failure. Blood urea nitrogen and creatinine (Cr) analyses showed amelioration of functional parameters in CRF rats treated with ICA-treated huMSCs at 3, 7, and 14 days after transplantation. Optical molecular imaging showed that ICA-treated huMSCs markedly increased the number of cells in injured renal tissues at 3, 7, and 14 days after transplantation. Hematoxylin-eosin staining demonstrated that ICA-treated huMSCs reduced the levels of fibrosis in CRF rats at 14 days after transplantation. Superoxide dismutase and malondialdehyde analyses showed that ICA-treated huMSCs reduced oxidative damage in CRF rats. Moreover, transplantation with ICA-treated huMSCs decreased inflammatory responses, promoted the expression of growth factors, and protected injured renal tissues. Taken together, our findings suggest that ICA-treated huMSCs could improve kidney function in CRF rats.

  14. Improvement of Fabry Disease-Related Gastrointestinal Symptoms in a Significant Proportion of Female Patients Treated with Agalsidase Beta

    DEFF Research Database (Denmark)

    Wilcox, William R; Feldt-Rasmussen, Ulla; Martins, Ana Maria

    2018-01-01

    Fabry disease, an X-linked inherited lysosomal storage disorder, is caused by mutations in the gene encoding α-galactosidase, GLA. In patients with Fabry disease, glycosphingolipids accumulate in various cell types, triggering a range of cellular and tissue responses that result in a wide spectrum of organ involvement. Although variable, gastrointestinal symptoms are among the most common and significant early clinical manifestations; they tend to persist into adulthood if left untreated. To further understand the effects of sustained enzyme replacement therapy (ERT) with agalsidase beta on gastrointestinal symptoms in heterozygotes, a data analysis of female patients enrolled in the Fabry Registry was conducted. To be included, females of any age must have received agalsidase beta (average dose 1.0 mg/kg every 2 weeks) for at least 2.5 years. Measured outcomes were self-reported gastrointestinal...

  15. Towards an improved prediction of the free radical scavenging potency of flavonoids: the significance of double PCET mechanisms.

    Science.gov (United States)

    Amić, Ana; Marković, Zoran; Dimitrić Marković, Jasmina M; Stepanić, Višnja; Lučić, Bono; Amić, Dragan

    2014-01-01

    The 1H(+)/1e(-) and 2H(+)/2e(-) proton-coupled electron transfer (PCET) processes of free radical scavenging by flavonoids were theoretically studied for aqueous and lipid environments using the PM6 and PM7 methods. The results reported here indicate that the significant contribution of the second PCET mechanism, resulting in the formation of a quinone/quinone methide, effectively discriminates the active from the inactive flavonoids. The predictive potency of descriptors related to the energetics of the second PCET mechanism (the second O-H bond dissociation enthalpy (BDE2), related to the hydrogen atom transfer (HAT) mechanism, and the second electron transfer enthalpy (ETE2), related to the sequential proton loss electron transfer (SPLET) mechanism) is superior to that of the currently used indices, which are related to the first 1H(+)/1e(-) processes, and these descriptors could serve as primary descriptors in developing QSARs (quantitative structure-activity relationships) for flavonoids. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Pt-decorated GaN nanowires with significant improvement in H2 gas-sensing performance at room temperature.

    Science.gov (United States)

    Abdullah, Q N; Yam, F K; Hassan, Z; Bououdina, M

    2015-12-15

    Superior sensitivity towards H2 gas was successfully achieved with a Pt-decorated GaN nanowire (NW) gas sensor. GaN NWs were fabricated via a chemical vapor deposition (CVD) route. Morphological (field emission scanning electron microscopy and transmission electron microscopy) and crystal structure (high resolution X-ray diffraction) characterizations of the as-synthesized nanostructures demonstrated the formation of GaN NWs having a wurtzite structure, zigzagged shape and an average diameter of 30-166 nm. The Pt-decorated GaN NWs sensor shows a high response of 250-2650% upon exposure to H2 gas concentrations from 7 to 1000 ppm, respectively, at room temperature (RT), increasing to about 650-4100% when the operating temperature is raised to 75°C. The gas-sensing measurements indicated that the Pt-decorated GaN NWs based sensor detected H2 at low concentration with excellent sensitivity, repeatability, and hysteresis-free behavior over a period of 100 min. The large surface-to-volume ratio of GaN NWs and the catalytic activity of Pt metal are the most influential factors leading to the enhancement of H2 gas-sensing performance, through the improvement of the interaction between the target molecules (H2) and the sensing NW surface. The attractive low cost, low power consumption and high performance of the resulting decorated GaN NWs gas sensor underscore their strong potential as H2 gas sensors working at low operating temperature. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. An Evidence-Based Education Program For Adults About Child Sexual Abuse (“Prevent It!”) Significantly Improves Behaviours As Well As Attitudes And Knowledge

    Directory of Open Access Journals (Sweden)

    Erin K Martin

    2016-08-01

    Full Text Available Here we describe the development of an evidence-based education program for adults about childhood sexual abuse (CSA), called “Prevent It!”. Uniquely, the primary goal of this program was to change the behaviour of participants, as well as to increase knowledge about CSA and positive attitudes towards it. A comprehensive review found no previous similar approach. The program includes a detailed manual to allow standardized administration by trained facilitators, as well as multiple video segments from CSA survivors and professionals. A total of 23 program workshops were run, with 366 adults participating. Of these, 312 (85%) agreed to take part in the study. All completed baseline ratings prior to the program, and 195 (63%) of the study sample completed follow-up assessments at 3 months. There were no significant differences between the demographic make-up of the baseline group and the follow-up group. Assessments included demographic data, knowledge, attitudes, and several measures of behaviour (our primary outcome variable). Behavioural questions asked individuals to select behaviours used in the previous 3 months from a list of options. Questions also included asking how many times in the previous 3 months they had talked about healthy sexual development or child sexual abuse with a child they know; suspected a child was sexually abused; taken steps to protect a child; or reported suspected sexual abuse to police or child welfare. The majority of attendees were women, with the commonest age group being 30–39 years old. Approximately 33% had experienced CSA themselves. At 3-month follow-up there were highly statistically significant improvements in several aspects of behaviour and knowledge, and in attitudes regarding CSA. For example, the number of subjects actively looking for evidence of CSA increased from 46% at baseline to 81% at follow-up, while the number of subjects who actively took steps to protect children increased from 25% at baseline

  18. Amphoteric Ion-Exchange Membranes with Significantly Improved Vanadium Barrier Properties for All-Vanadium Redox Flow Batteries.

    Science.gov (United States)

    Nibel, Olga; Rojek, Tomasz; Schmidt, Thomas J; Gubler, Lorenz

    2017-07-10

    All-vanadium redox flow batteries (VRBs) have attracted considerable interest as promising energy-storage devices that can allow the efficient utilization of renewable energy sources. The membrane, which separates the porous electrodes in a redox flow cell, is one of the key components in VRBs. High rates of crossover of vanadium ions and water through the membrane impair the efficiency and capacity of a VRB. Thus, membranes with low permeation rate of vanadium species and water are required, also characterized by low resistance and stability in the VRB environment. Here, we present a new design concept for amphoteric ion-exchange membranes, based on radiation-induced grafting of vinylpyridine into an ethylene tetrafluoroethylene base film and a two-step functionalization to introduce cationic and anionic exchange sites, respectively. During long-term cycling, redox flow cells containing these membranes showed higher efficiency, less pronounced electrolyte imbalance, and significantly reduced capacity decay compared to the cells with the benchmark material Nafion 117. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Short separation regression improves statistical significance and better localizes the hemodynamic response obtained by near-infrared spectroscopy for tasks with differing autonomic responses.

    Science.gov (United States)

    Yücel, Meryem A; Selb, Juliette; Aasted, Christopher M; Petkov, Mike P; Becerra, Lino; Borsook, David; Boas, David A

    2015-07-01

    Autonomic nervous system response is known to be highly task-dependent. The sensitivity of near-infrared spectroscopy (NIRS) measurements to superficial layers, particularly the scalp, makes them highly susceptible to systemic physiological changes. Thus, one critical step in NIRS data processing is to remove the contribution of superficial layers from the NIRS signal and to obtain the actual brain response. This can be achieved using short separation channels that are sensitive only to the hemodynamics in the scalp. We investigated the contribution of hemodynamic fluctuations due to autonomic nervous system activation during various tasks. Our results provide clear demonstrations of the critical role of using short separation channels in NIRS measurements to disentangle differing autonomic responses from the brain activation signal of interest.
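
    The core of short separation regression is to scale the short channel (scalp only) to the long channel (scalp plus brain) and subtract it. A toy sketch with synthetic time series; real pipelines typically fold this regressor into a GLM alongside the hemodynamic response model:

      import numpy as np

      # Synthetic NIRS time series: a long-separation channel (scalp + brain)
      # and a short-separation channel (same scalp physiology, no brain).
      rng = np.random.default_rng(4)
      n = 3000
      scalp = np.sin(np.linspace(0, 60, n)) + 0.3 * rng.normal(size=n)
      brain = np.zeros(n)
      brain[1000:1400] = 0.5  # boxcar "activation"
      long_ch = brain + 0.8 * scalp + 0.1 * rng.normal(size=n)
      short_ch = scalp + 0.1 * rng.normal(size=n)

      # Least-squares scaling of the short channel, then subtraction.
      beta = np.dot(short_ch, long_ch) / np.dot(short_ch, short_ch)
      corrected = long_ch - beta * short_ch
      print(f"estimated scalp weight: {beta:.2f}")  # close to 0.8 here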

  20. Post-exposure Treatment with Anti-rabies VHH and Vaccine Significantly Improves Protection of Mice from Lethal Rabies Infection

    Science.gov (United States)

    Terryn, Sanne; Francart, Aurélie; Rommelaere, Heidi; Stortelers, Catelijne; Van Gucht, Steven

    2016-01-01

    Post-exposure prophylaxis (PEP) against rabies infection consists of a combination of passive immunisation with plasma-derived human or equine immune globulins and active immunisation with vaccine delivered shortly after exposure. Since anti-rabies immune globulins are expensive and scarce, there is a need for cheaper alternatives that can be produced more consistently. Previously, we generated potent virus-neutralising VHH, also called Nanobodies, against the rabies glycoprotein that are effectively preventing lethal disease in an in vivo mouse model. The VHH domain is the smallest antigen-binding functional fragment of camelid heavy chain-only antibodies that can be manufactured in microbial expression systems. In the current study we evaluated the efficacy of half-life extended anti-rabies VHH in combination with vaccine for PEP in an intranasal rabies infection model in mice. The PEP combination therapy of systemic anti-rabies VHH and intramuscular vaccine significantly delayed the onset of disease compared to treatment with anti-rabies VHH alone, prolonged median survival time (35 versus 14 days) and decreased mortality (60% versus 19% survival rate), when treated 24 hours after rabies virus challenge. Vaccine alone was unable to rescue mice from lethal disease. As reported also for immune globulins, some interference of anti-rabies VHH with the antigenicity of the vaccine was observed, but this did not impede the synergistic effect. Post exposure treatment with vaccine and human anti-rabies immune globulins was unable to protect mice from lethal challenge. Anti-rabies VHH and vaccine act synergistically to protect mice after rabies virus exposure, which further validates the possible use of anti-rabies VHH for rabies PEP. PMID:27483431

  1. Post-exposure Treatment with Anti-rabies VHH and Vaccine Significantly Improves Protection of Mice from Lethal Rabies Infection.

    Directory of Open Access Journals (Sweden)

    Sanne Terryn

    2016-08-01

    Full Text Available Post-exposure prophylaxis (PEP) against rabies infection consists of a combination of passive immunisation with plasma-derived human or equine immune globulins and active immunisation with vaccine delivered shortly after exposure. Since anti-rabies immune globulins are expensive and scarce, there is a need for cheaper alternatives that can be produced more consistently. Previously, we generated potent virus-neutralising VHH, also called Nanobodies, against the rabies glycoprotein that are effectively preventing lethal disease in an in vivo mouse model. The VHH domain is the smallest antigen-binding functional fragment of camelid heavy chain-only antibodies that can be manufactured in microbial expression systems. In the current study we evaluated the efficacy of half-life extended anti-rabies VHH in combination with vaccine for PEP in an intranasal rabies infection model in mice. The PEP combination therapy of systemic anti-rabies VHH and intramuscular vaccine significantly delayed the onset of disease compared to treatment with anti-rabies VHH alone, prolonged median survival time (35 versus 14 days) and decreased mortality (60% versus 19% survival rate), when treated 24 hours after rabies virus challenge. Vaccine alone was unable to rescue mice from lethal disease. As reported also for immune globulins, some interference of anti-rabies VHH with the antigenicity of the vaccine was observed, but this did not impede the synergistic effect. Post exposure treatment with vaccine and human anti-rabies immune globulins was unable to protect mice from lethal challenge. Anti-rabies VHH and vaccine act synergistically to protect mice after rabies virus exposure, which further validates the possible use of anti-rabies VHH for rabies PEP.

  2. Constitutive overexpression of the TaNF-YB4 gene in transgenic wheat significantly improves grain yield.

    Science.gov (United States)

    Yadav, Dinesh; Shavrukov, Yuri; Bazanova, Natalia; Chirkova, Larissa; Borisjuk, Nikolai; Kovalchuk, Nataliya; Ismagul, Ainur; Parent, Boris; Langridge, Peter; Hrmova, Maria; Lopato, Sergiy

    2015-11-01

    Heterotrimeric nuclear factors Y (NF-Ys) are involved in regulation of various vital functions in all eukaryotic organisms. Although a number of NF-Y subunits have been characterized in model plants, only a few have been functionally evaluated in crops. In this work, a number of genes encoding NF-YB and NF-YC subunits were isolated from drought-tolerant wheat (Triticum aestivum L. cv. RAC875), and the impact of the overexpression of TaNF-YB4 in the Australian wheat cultivar Gladius was investigated. TaNF-YB4 was isolated as a result of two consecutive yeast two-hybrid (Y2H) screens, where ZmNF-YB2a was used as a starting bait. A new NF-YC subunit, designated TaNF-YC15, was isolated in the first Y2H screen and used as bait in a second screen, which identified two wheat NF-YB subunits, TaNF-YB2 and TaNF-YB4. Three-dimensional modelling of a TaNF-YB2/TaNF-YC15 dimer revealed structural determinants that may underlie interaction selectivity. The TaNF-YB4 gene was placed under the control of the strong constitutive polyubiquitin promoter from maize and introduced into wheat by biolistic bombardment. The growth and yield components of several independent transgenic lines with up-regulated levels of TaNF-YB4 were evaluated under well-watered conditions (T1-T3 generations) and under mild drought (T2 generation). Analysis of T2 plants was performed in large deep containers in conditions close to field trials. Under optimal watering conditions, transgenic wheat plants produced significantly more spikes but other yield components did not change. This resulted in a 20-30% increased grain yield compared with untransformed control plants. Under water-limited conditions transgenic lines maintained parity in yield performance. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  3. Marrying Step Feed with Secondary Clarifier Improvements to Significantly Increase Peak Wet Weather Treatment Capacity: An Integrated Methodology.

    Science.gov (United States)

    Daigger, Glen T; Siczka, John S; Smith, Thomas F; Frank, David A; McCorquodale, J A

    2017-08-01

      The need to increase the peak wet weather secondary treatment capacity of the City of Akron, Ohio, Water Reclamation Facility (WRF) provided the opportunity to test an integrated methodology for maximizing the peak wet weather secondary treatment capacity of activated sludge systems. An initial investigation, consisting of process modeling of the secondary treatment system and computational fluid dynamics (CFD) analysis of the existing relatively shallow secondary clarifiers (3.3 and 3.7 m sidewater depth in 30.5 m diameter units), indicated that a significant increase in capacity from 416 000 to 684 000 m3/d or more was possible by adding step feed capabilities to the existing bioreactors and upgrading the existing secondary clarifiers. One of the six treatment units at the WRF was modified, and an extensive 2-year testing program was conducted to determine the total peak wet weather secondary treatment capacity achievable. The results demonstrated that a peak wet weather secondary treatment capacity approaching 974 000 m3/d is possible as long as secondary clarifier solids and hydraulic loadings could be separately controlled using the step feed capability provided. Excellent sludge settling characteristics are routinely experienced at the City of Akron WRF, raising concerns that the identified peak wet weather secondary treatment capacity could not be maintained should sludge settling characteristics deteriorate for some reason. Computational fluid dynamics analysis indicated that the impact of the deterioration of sludge settling characteristics could be mitigated and the identified peak wet weather secondary treatment capacity maintained by further use of the step feed capability provided to further reduce secondary clarifier solids loading rates at the identified high surface overflow rates. The results also demonstrated that effluent limits not only for total suspended solids (TSS) and five-day carbonaceous biochemical oxygen demand (cBOD5) could be

  4. To improve the quality of the statistical analysis of papers published in the Journal of the Korean Society for Therapeutic Radiology and Oncology

    International Nuclear Information System (INIS)

    Park, Hee Chul; Choi, Doo Ho; Ahn, Song Vogue

    2008-01-01

    To improve the quality of the statistical analysis of papers published in the Journal of the Korean Society for Therapeutic Radiology and Oncology (JKOSTRO) by evaluating commonly encountered errors. Materials and Methods: Papers published in the JKOSTRO from January 2006 to December 2007 were reviewed for methodological and statistical validity using a modified version of Ahn's checklist. A statistician reviewed individual papers and evaluated the list items in the checklist for each paper. To avoid potential assessment errors by a statistician lacking expertise in the field of radiation oncology, the editorial board of the JKOSTRO reviewed each checklist for individual articles. A frequency analysis of the list items was performed using SAS (version 9.0, SAS Institute, NC, USA) software. Results: A total of 73 papers, including 5 case reports and 68 original articles, were reviewed. Inferential statistics were used in 46 papers. The most commonly adopted statistical methodology was survival analysis (58.7%). Only 19% of papers were free of statistical errors. Errors of omission were encountered in 34 (50.0%) papers. Errors of commission were encountered in 35 (51.5%) papers. Twenty-one papers (30.9%) had both errors of omission and commission. Conclusion: A variety of statistical errors were encountered in papers published in the JKOSTRO. The current study suggests that a more thorough review of the statistical analysis is needed for manuscripts submitted to the JKOSTRO

  5. Comparing identified and statistically significant lipids and polar metabolites in 15-year-old serum and dried blood spot samples for longitudinal studies

    Energy Technology Data Exchange (ETDEWEB)

    Kyle, Jennifer E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Casey, Cameron P. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Stratton, Kelly G. [National Security Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zink, Erika M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Kim, Young-Mo [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zheng, Xueyun [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Monroe, Matthew E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Weitz, Karl K. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Bloodsworth, Kent J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Orton, Daniel J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Ibrahim, Yehia M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Moore, Ronald J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Lee, Christine G. [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Research Service, Portland Veterans Affairs Medical Center, Portland OR USA; Pedersen, Catherine [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Orwoll, Eric [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Smith, Richard D. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Burnum-Johnson, Kristin E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Baker, Erin S. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA

    2017-02-05

    The use of dried blood spots (DBS) has many advantages over traditional plasma and serum samples, such as the smaller blood volume required, storage at room temperature, and the ability to sample in remote locations. However, understanding the robustness of different analytes in DBS samples is essential, especially in older samples collected for longitudinal studies. Here we analyzed DBS samples collected in 2000-2001 and stored at room temperature, and compared them to matched serum samples stored at -80°C, to determine if they could be effectively used as specific time points in a longitudinal study following metabolic disease. Four hundred small molecules were identified in both the serum and DBS samples using gas chromatography-mass spectrometry (GC-MS), liquid chromatography-MS (LC-MS) and LC-ion mobility spectrometry-MS (LC-IMS-MS). The identified polar metabolites overlapped well between the sample types, though only one statistically significant polar metabolite in a case-control study was conserved, indicating that degradation in the DBS samples affects quantitation. Differences in the lipid identifications indicated that some oxidation occurs in the DBS samples. However, thirty-six statistically significant lipids correlated in both sample types, indicating that lipid quantitation was more stable across the sample types.
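
    Checking whether an analyte's quantitation is conserved between sample types, as described above, amounts to correlating matched measurements. A minimal sketch with hypothetical matched values (the authors' statistical workflow may differ):

      import numpy as np
      from scipy.stats import spearmanr

      # Hypothetical matched quantitation for one lipid: serum stored at
      # -80 degC versus room-temperature DBS, across the same subjects.
      rng = np.random.default_rng(5)
      serum = rng.normal(10.0, 1.0, size=40)
      dbs = 0.8 * serum + rng.normal(0.0, 0.7, size=40)  # attenuated signal

      rho, p = spearmanr(serum, dbs)
      print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")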

  6. Ten Ways to Improve the Use of Statistical Mediation Analysis in the Practice of Child and Adolescent Treatment Research

    Science.gov (United States)

    Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.

    2012-01-01

    Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…

  7. Ten ways to improve the use of statistical mediation analysis in the practice of child and adolescent treatment research

    NARCIS (Netherlands)

    Maric, M.; Wiers, R.W.; Prins, P.J.M.

    2012-01-01

    Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that have reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data

  8. Personalizing oncology treatments by predicting drug efficacy, side-effects, and improved therapy: mathematics, statistics, and their integration.

    Science.gov (United States)

    Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri

    2014-01-01

    Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the mainstream of clinical oncology. © 2014 Wiley Periodicals, Inc.

  9. IMPROVING INTERFEROMETRIC NULL DEPTH MEASUREMENTS USING STATISTICAL DISTRIBUTIONS: THEORY AND FIRST RESULTS WITH THE PALOMAR FIBER NULLER

    International Nuclear Information System (INIS)

    Hanot, C.; Riaud, P.; Absil, O.; Mennesson, B.; Martin, S.; Liewer, K.; Loya, F.; Mawet, D.; Serabyn, E.

    2011-01-01

    A new 'self-calibrated' statistical analysis method has been developed for the reduction of nulling interferometry data. The idea is to use the statistical distributions of the fluctuating null depth and beam intensities to retrieve the astrophysical null depth (or equivalently the object's visibility) in the presence of fast atmospheric fluctuations. The approach yields an accuracy much better (about an order of magnitude) than is presently possible with standard data reduction methods, because the astrophysical null depth accuracy is no longer limited by the magnitude of the instrumental phase and intensity errors but by uncertainties on their probability distributions. This approach was tested on the sky with the two-aperture fiber nulling instrument mounted on the Palomar Hale telescope. Using our new data analysis approach alone, with no observations of calibrators, we find that error bars on the astrophysical null depth as low as a few 10^-4 can be obtained in the near-infrared, which means that null depths lower than 10^-3 can be reliably measured. This statistical analysis is not specific to our instrument and may be applicable to other interferometers.
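
    The core idea, retrieving the astrophysical null from the shape of the null-depth distribution rather than its mean, can be sketched with a toy Monte Carlo fit. The simplified model n = Na + dphi^2/4, the grid ranges, and all numbers below are assumptions for illustration, not the instrument's actual reduction pipeline.

    ```python
    # Minimal Monte Carlo sketch of "self-calibrated" null-depth retrieval:
    # fit the *distribution* of fluctuating null measurements, not their mean.
    # Simplified model: n = Na + dphi**2 / 4, dphi ~ N(0, sigma); values invented.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    true_Na, true_sigma = 5e-4, 0.05       # astrophysical null, phase jitter (rad)
    obs = true_Na + (rng.normal(0, true_sigma, 5000) ** 2) / 4

    def model_sample(Na, sigma, n=5000, seed=2):
        r = np.random.default_rng(seed)
        return Na + (r.normal(0, sigma, n) ** 2) / 4

    # Grid search: pick (Na, sigma) whose model distribution best matches the data
    best = None
    for Na in np.linspace(0, 2e-3, 41):
        for sigma in np.linspace(0.01, 0.10, 19):
            ks = stats.ks_2samp(obs, model_sample(Na, sigma)).statistic
            if best is None or ks < best[0]:
                best = (ks, Na, sigma)

    print(f"retrieved Na = {best[1]:.2e} (true {true_Na:.2e}), sigma = {best[2]:.3f}")
    ```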

  10. Tests of statistical significance in three biomedical journals: a critical review

    Directory of Open Access Journals (Sweden)

    Madelaine Sarria Castro

    2004-05-01

    OBJECTIVE: To describe the use of conventional tests of statistical significance and the current trends shown by their use in three biomedical journals read in Spanish-speaking countries. METHODS: All descriptive or explanatory original articles published in the five-year period of 1996 through 2000 were reviewed in three journals: Revista Cubana de Medicina General Integral, Revista Panamericana de Salud Pública/Pan American Journal of Public Health, and Medicina Clínica. RESULTS: In the three journals examined, several questionable practices were detected in the use of hypothesis tests based on P-values, together with a scarce presence of the newer approaches proposed to replace them: confidence intervals (CIs) and Bayesian inference. The main findings were: a minimal presence of CIs, whether as a complement to significance tests or as the sole statistical resource; mention of the sample size as a possible explanation of the results; a predominance of rigid alpha values; a lack of uniformity in the presentation of results; and improper allusion in research conclusions to the results of hypothesis tests. CONCLUSIONS: The results reflect a lack of compliance by authors and editors with accepted norms on the use of tests of statistical significance, and suggest that the routine use of these tests continues to occupy an important place in the biomedical literature of the Spanish-speaking world.

  11. Five-year results from a prospective multicentre study of percutaneous pulmonary valve implantation demonstrate sustained removal of significant pulmonary regurgitation, improved right ventricular outflow tract obstruction and improved quality of life

    DEFF Research Database (Denmark)

    Hager, Alfred; Schubert, Stephan; Ewert, Peter

    2017-01-01

    The EQ-5D quality of life utility index and visual analogue scale scores were both significantly improved six months post PPVI and remained so at five years. CONCLUSIONS: Five-year results following PPVI demonstrate resolved moderate or severe pulmonary regurgitation, improved right ventricular outflow...

  12. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuation, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect gas theory applied to liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of high-polymer systems, and quantum statistics.

  13. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures and to reduce the incidence of complications by introduction of the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most recurrent complications. A paired z-sample test (in Minitab) and Fisher's exact test were used to statistically analyse the obtained data. The Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
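
    Pareto analysis, mentioned above for isolating the most recurrent complications, reduces to sorting categories by count and flagging the "vital few" that account for roughly 80% of events. The complication names and counts below are invented for illustration.

    ```python
    # Pareto analysis sketch for local-anaesthesia complications.
    # Category names and counts are invented for illustration.
    complaints = {
        "pain on injection": 46, "haematoma": 21, "trismus": 12,
        "transient paresthesia": 8, "needle breakage": 2, "allergic reaction": 1,
    }

    total = sum(complaints.values())
    cumulative = 0.0
    print(f"{'complication':<24}{'count':>6}{'cum %':>8}")
    for name, count in sorted(complaints.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += 100.0 * count / total
        flag = "  <- vital few" if cumulative <= 80.0 else ""
        print(f"{name:<24}{count:>6}{cumulative:>7.1f}%{flag}")
    ```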

  14. Addition of 2-(ethylamino)acetonitrile group to nitroxoline results in significantly improved anti-tumor activity in vitro and in vivo.

    Science.gov (United States)

    Mitrović, Ana; Sosič, Izidor; Kos, Špela; Tratar, Urša Lampreht; Breznik, Barbara; Kranjc, Simona; Mirković, Bojana; Gobec, Stanislav; Lah, Tamara; Serša, Gregor; Kos, Janko

    2017-08-29

    Lysosomal cysteine peptidase cathepsin B, involved in multiple processes associated with tumor progression, is validated as a target for anti-cancer therapy. Nitroxoline, a known antimicrobial agent, is a potent and selective inhibitor of cathepsin B, hence reducing tumor progression in vitro and in vivo. In order to further improve its anti-cancer properties we developed a number of derivatives using structure-based chemical synthesis. Of these, the 7-aminomethylated derivative (compound 17) exhibited significantly improved kinetic properties over nitroxoline, inhibiting cathepsin B endopeptidase activity selectively. In the present study, we have evaluated its anti-cancer properties. It was more effective than nitroxoline in reducing tumor cell invasion and migration, as determined in vitro on two-dimensional cell models and tumor spheroids, under either endpoint or real-time conditions. Moreover, it exhibited improved action over nitroxoline in impairing tumor growth in vivo in LPB mouse fibrosarcoma tumors in C57Bl/6 mice. Taken together, the addition of a 2-(ethylamino)acetonitrile group to nitroxoline at position 7 significantly improves its pharmacological characteristics and its potential for use as an anti-cancer drug.

  15. Improved Noise Minimum Statistics Estimation Algorithm for Using in a Speech-Passing Noise-Rejecting Headset

    Directory of Open Access Journals (Sweden)

    Seyedtabaee Saeed

    2010-01-01

    This paper deals with the configuration of an algorithm to be used in a speech-passing, angle-grinder-noise-canceling headset. Angle grinder noise is annoying and interrupts ordinary oral communication, which means a low-SNR noisy condition is at hand. Since variation in the angle grinder's working condition changes the noise statistics, the noise is nonstationary, with possible jumps in its power. Studies were conducted to pick an appropriate algorithm. A modified version of the well-known spectral subtraction shows superior performance against alternative methods. The noise estimate is calculated through a multi-band, fast-adapting scheme. The algorithm adapts very quickly to the non-stationary noise environment while inflicting minimal musical noise and speech distortion on the processed signal. Objective and subjective measures illustrating the performance of the proposed method are presented.
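
    A minimal sketch of the general approach described here: magnitude spectral subtraction with a per-bin noise estimate tracked as a sliding minimum of the smoothed spectrum. Frame sizes, smoothing constants, the spectral floor, and the test signal are all invented; the paper's multi-band fast-adapting scheme is more elaborate than this toy.

    ```python
    # Bare-bones spectral subtraction with a sliding-minimum noise estimate,
    # in the spirit of minimum-statistics noise tracking. Parameters invented.
    import numpy as np

    def enhance(x, frame=256, hop=128, min_win=30, floor=0.05):
        win = np.hanning(frame)
        n_frames = 1 + (len(x) - frame) // hop
        spec = np.array([np.fft.rfft(win * x[i*hop:i*hop+frame]) for i in range(n_frames)])
        mag, phase = np.abs(spec), np.angle(spec)

        # Noise estimate per bin: running minimum of the smoothed magnitude
        smoothed = np.copy(mag)
        for t in range(1, n_frames):
            smoothed[t] = 0.85 * smoothed[t-1] + 0.15 * mag[t]
        noise = np.array([smoothed[max(0, t-min_win):t+1].min(axis=0) for t in range(n_frames)])

        clean_mag = np.maximum(mag - noise, floor * mag)   # subtract, keep a spectral floor
        clean = np.zeros(len(x))
        for t in range(n_frames):                          # overlap-add resynthesis
            clean[t*hop:t*hop+frame] += np.fft.irfft(clean_mag[t] * np.exp(1j * phase[t]), frame)
        return clean

    # Demo: a sine "speech" tone buried in white noise
    rng = np.random.default_rng(0)
    t = np.arange(16000) / 16000.0
    noisy = np.sin(2 * np.pi * 440 * t) + 0.8 * rng.normal(size=t.size)
    out = enhance(noisy)
    print("input power %.2f -> output power %.2f" % (np.mean(noisy**2), np.mean(out**2)))
    ```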

  16. Improving thermal model prediction through statistical analysis of irradiation and post-irradiation data from AGR experiments

    International Nuclear Information System (INIS)

    Pham, Binh T.; Hawkes, Grant L.; Einerson, Jeffrey J.

    2014-01-01

    As part of the High Temperature Reactors (HTR) R and D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.

  17. Improving thermal model prediction through statistical analysis of irradiation and post-irradiation data from AGR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov [Human Factor, Controls and Statistics Department, Nuclear Science and Technology, Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Hawkes, Grant L. [Thermal Science and Safety Analysis Department, Nuclear Science and Technology, Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Einerson, Jeffrey J. [Human Factor, Controls and Statistics Department, Nuclear Science and Technology, Idaho National Laboratory, Idaho Falls, ID 83415 (United States)

    2014-05-01

    As part of the High Temperature Reactors (HTR) R and D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.

  18. Statistically optimal estimation of Greenland Ice Sheet mass variations from GRACE monthly solutions using an improved mascon approach

    NARCIS (Netherlands)

    Ran, J.; Ditmar, P.G.; Klees, R.; Farahani, H.

    2017-01-01

    We present an improved mascon approach to transform monthly spherical harmonic solutions based on GRACE satellite data into mass anomaly estimates in Greenland. The GRACE-based spherical harmonic coefficients are used to synthesize gravity anomalies at satellite altitude, which are then inverted

  19. Significance evaluation in factor graphs

    DEFF Research Database (Denmark)

    Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet

    2017-01-01

    in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating statistical significance of observations from factor graph models. Results: Two novel numerical approximations for evaluation of statistical significance are presented: first, a method using importance sampling; second, a saddlepoint approximation-based method. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from... Conclusions: The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets...

  20. Co-overexpressing a Plasma Membrane and a Vacuolar Membrane Sodium/Proton Antiporter Significantly Improves Salt Tolerance in Transgenic Arabidopsis Plants

    Science.gov (United States)

    Pehlivan, Necla; Sun, Li; Jarrett, Philip; Yang, Xiaojie; Mishra, Neelam; Chen, Lin; Kadioglu, Asim; Shen, Guoxin; Zhang, Hong

    2016-01-01

    The Arabidopsis gene AtNHX1 encodes a vacuolar membrane-bound sodium/proton (Na+/H+) antiporter that transports Na+ into the vacuole and exports H+ into the cytoplasm. The Arabidopsis gene SOS1 encodes a plasma membrane-bound Na+/H+ antiporter that exports Na+ to the extracellular space and imports H+ into the plant cell. Plants rely on these enzymes either to keep Na+ out of the cell or to sequester Na+ into vacuoles to avoid the toxic level of Na+ in the cytoplasm. Overexpression of AtNHX1 or SOS1 could improve salt tolerance in transgenic plants, but the improved salt tolerance is limited. NaCl at concentrations >200 mM would kill AtNHX1-overexpressing or SOS1-overexpressing plants. Here it is shown that co-overexpressing AtNHX1 and SOS1 could further improve salt tolerance in transgenic Arabidopsis plants, making transgenic Arabidopsis able to tolerate up to 250 mM NaCl treatment. Furthermore, co-overexpression of AtNHX1 and SOS1 could significantly reduce yield loss caused by the combined stresses of heat and salt, confirming the hypothesis that stacked overexpression of two genes could substantially improve tolerance against multiple stresses. This research serves as a proof of concept for improving salt tolerance in other plants including crops. PMID:26985021

  1. Co-overexpressing a Plasma Membrane and a Vacuolar Membrane Sodium/Proton Antiporter Significantly Improves Salt Tolerance in Transgenic Arabidopsis Plants.

    Science.gov (United States)

    Pehlivan, Necla; Sun, Li; Jarrett, Philip; Yang, Xiaojie; Mishra, Neelam; Chen, Lin; Kadioglu, Asim; Shen, Guoxin; Zhang, Hong

    2016-05-01

    The Arabidopsis gene AtNHX1 encodes a vacuolar membrane-bound sodium/proton (Na(+)/H(+)) antiporter that transports Na(+) into the vacuole and exports H(+) into the cytoplasm. The Arabidopsis gene SOS1 encodes a plasma membrane-bound Na(+)/H(+) antiporter that exports Na(+) to the extracellular space and imports H(+) into the plant cell. Plants rely on these enzymes either to keep Na(+) out of the cell or to sequester Na(+) into vacuoles to avoid the toxic level of Na(+) in the cytoplasm. Overexpression of AtNHX1 or SOS1 could improve salt tolerance in transgenic plants, but the improved salt tolerance is limited. NaCl at concentrations >200 mM would kill AtNHX1-overexpressing or SOS1-overexpressing plants. Here it is shown that co-overexpressing AtNHX1 and SOS1 could further improve salt tolerance in transgenic Arabidopsis plants, making transgenic Arabidopsis able to tolerate up to 250 mM NaCl treatment. Furthermore, co-overexpression of AtNHX1 and SOS1 could significantly reduce yield loss caused by the combined stresses of heat and salt, confirming the hypothesis that stacked overexpression of two genes could substantially improve tolerance against multiple stresses. This research serves as a proof of concept for improving salt tolerance in other plants including crops. © The Author 2016. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  2. Interventions to significantly improve service uptake and retention of HIV-positive pregnant women and HIV-exposed infants along the prevention of mother-to-child transmission continuum of care: systematic review.

    Science.gov (United States)

    Vrazo, Alexandra C; Firth, Jacqueline; Amzel, Anouk; Sedillo, Rebecca; Ryan, Julia; Phelps, B Ryan

    2018-02-01

    Despite the success of Prevention of Mother-to-Child Transmission of HIV (PMTCT) programmes, low uptake of services and poor retention pose a formidable challenge to achieving the elimination of vertical HIV transmission in low- and middle-income countries. This systematic review summarises interventions that demonstrate statistically significant improvements in service uptake and retention of HIV-positive pregnant and breastfeeding women and their infants along the PMTCT cascade. Databases were systematically searched for peer-reviewed studies. Outcomes of interest included uptake of services, such as antiretroviral therapy (ART) initiation and early infant diagnostic testing, and retention of HIV-positive pregnant and breastfeeding women and their infants. Interventions that led to statistically significant outcomes were included and mapped to the PMTCT cascade. An eight-item assessment tool assessed study rigour. CRD42017063816. Of 686 citations reviewed, 11 articles met inclusion criteria. Ten studies detailed maternal outcomes and seven studies detailed infant outcomes in PMTCT programmes. Interventions to increase access to antenatal care (ANC) and ART services (n = 4) and those using lay cadres (n = 3) were most common. Other interventions included quality improvement (n = 2), mHealth (n = 1), and counselling (n = 1). One study described interventions in an Option B+ programme. Limitations included lack of HIV testing and counselling and viral load monitoring outcomes, small sample size, geographical location, and non-randomized assignment and selection of participants. Interventions including ANC/ART integration, family-centred approaches, and the use of lay healthcare providers are demonstrably effective in increasing service uptake and retention of HIV-positive mothers and their infants in PMTCT programmes. Future studies should include control groups and assess whether interventions developed in the context of earlier 'Options' are

  3. Assessment of noise reduction potential and image quality improvement of a new generation adaptive statistical iterative reconstruction (ASIR-V) in chest CT.

    Science.gov (United States)

    Tang, Hui; Yu, Nan; Jia, Yongjun; Yu, Yong; Duan, Haifeng; Han, Dong; Ma, Guangming; Ren, Chenglong; He, Taiping

    2018-01-01

    To evaluate the image quality improvement and noise reduction in routine dose, non-enhanced chest CT imaging by using a new generation adaptive statistical iterative reconstruction (ASIR-V) in comparison with the ASIR algorithm. 30 patients who underwent routine dose, non-enhanced chest CT using a GE Discovery CT750 HD (GE Healthcare, Waukesha, WI) were included. The scan parameters included a tube voltage of 120 kVp, automatic tube current modulation to obtain a noise index of 14 HU, a rotation speed of 0.6 s, a pitch of 1.375:1 and a slice thickness of 5 mm. After scanning, all scans were reconstructed with the recommended level of 40% ASIR for comparison purposes and with different percentages of ASIR-V from 10% to 100% in 10% increments. The CT attenuation values and SD of the subcutaneous fat, back muscle and descending aorta were measured at the level of the tracheal carina on all reconstructed images. The signal-to-noise ratio (SNR) was calculated with SD representing image noise. The subjective image quality was independently evaluated by two experienced radiologists. For all ASIR-V images, the objective image noise (SD) of fat, muscle and aorta decreased and SNR increased along with increasing ASIR-V percentage. The SD of 30% ASIR-V to 100% ASIR-V was significantly lower than that of 40% ASIR (p < 0.05), and all ASIR-V reconstructions had good diagnostic acceptability. However, the 50% ASIR-V to 70% ASIR-V series showed significantly superior visibility of small structures when compared with the 40% ASIR and ASIR-V of other percentages (p < 0.05), and a series in this range was rated the best of all ASIR-V images, with the highest subjective image quality. The image sharpness was significantly decreased in images reconstructed by 80% ASIR-V and higher. In routine dose, non-enhanced chest CT, ASIR-V shows greater potential in reducing image noise and artefacts and maintaining image sharpness when compared to the recommended level of the 40% ASIR algorithm. Combining both the objective and subjective evaluation of images, non
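
    The objective measure used in this record is straightforward to reproduce: the SNR of each region of interest is its mean attenuation divided by its standard deviation (the SD standing in for image noise). The HU values below are invented stand-ins for pixel samples from the fat, muscle, and aorta ROIs.

    ```python
    # SNR from ROI statistics, mirroring "SNR calculated with SD as image noise".
    # HU samples below are invented stand-ins for real ROI pixel values.
    import numpy as np

    rois = {"fat": [-105, -98, -110, -102], "muscle": [52, 47, 55, 50], "aorta": [41, 38, 44, 40]}
    for name, hu in rois.items():
        hu = np.asarray(hu, dtype=float)
        mean, sd = hu.mean(), hu.std(ddof=1)
        print(f"{name:>6}: mean {mean:7.1f} HU, SD {sd:5.1f} HU, SNR {abs(mean)/sd:5.1f}")
    ```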

  4. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  5. Particle System Based Adaptive Sampling on Spherical Parameter Space to Improve the MDL Method for Construction of Statistical Shape Models

    Directory of Open Access Journals (Sweden)

    Rui Xu

    2013-01-01

    Minimum description length (MDL) based group-wise registration was a state-of-the-art method to determine the corresponding points of 3D shapes for the construction of statistical shape models (SSMs). However, it suffered from the problem that the determined corresponding points did not spread uniformly on the original shapes, since corresponding points were obtained by uniformly sampling the aligned shape on the parameterized space of the unit sphere. We proposed a particle-system based method to obtain adaptive sampling positions on the unit sphere to resolve this problem. Here, a set of particles was placed on the unit sphere to construct a particle system whose energy was related to the distortions of the parameterized meshes. By minimizing this energy, each particle was moved on the unit sphere. When the system became steady, particles were treated as vertices to build a spherical mesh, which was then relaxed to slightly adjust the vertices to obtain optimal sampling positions. We used 47 cases of (left and right) lungs and 50 cases of livers, (left and right) kidneys, and spleens for evaluations. Experiments showed that the proposed method was able to resolve the problem of the original MDL method, and the proposed method performed better in the generalization and specificity tests.
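
    The particle-system idea can be sketched as repelling points on the unit sphere and projecting them back after each step. The paper ties the energy to parameterization distortion, whereas this toy uses a plain Coulomb-like energy; the particle count, step size, and iteration budget are invented.

    ```python
    # Sketch of a particle system that spreads sample points over the unit sphere
    # by minimizing a repulsive energy, then renormalizing back onto the sphere.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64
    p = rng.normal(size=(n, 3))
    p /= np.linalg.norm(p, axis=1, keepdims=True)      # random start on the sphere

    step = 0.01
    for _ in range(500):
        diff = p[:, None, :] - p[None, :, :]           # pairwise displacement vectors
        d2 = (diff ** 2).sum(-1) + np.eye(n)           # avoid divide-by-zero on diagonal
        force = (diff / d2[..., None] ** 1.5).sum(axis=1)   # Coulomb-like repulsion
        p += step * force
        p /= np.linalg.norm(p, axis=1, keepdims=True)  # project back to the unit sphere

    # Uniformity check: nearest-neighbour distances should be tightly clustered
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1) + np.eye(n) * 1e9
    nn = d.min(axis=1)
    print(f"nearest-neighbour distance: mean {nn.mean():.3f}, min {nn.min():.3f}, max {nn.max():.3f}")
    ```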

  6. Vitamin D and Calcium Addition during Denosumab Therapy over a Period of Four Years Significantly Improves Lumbar Bone Mineral Density in Japanese Osteoporosis Patients

    Directory of Open Access Journals (Sweden)

    Takako Suzuki

    2018-02-01

    This study investigated whether or not vitamin D and calcium supplementation affected bone metabolism and bone mineral density (BMD) over a period of four years of denosumab therapy in patients with primary osteoporosis. Patients were divided into a denosumab monotherapy group (22 cases) or a denosumab plus vitamin D and calcium supplementation group (combination group, 21 cases). We measured serum bone alkaline phosphatase (BAP), tartrate-resistant acid phosphatase (TRACP-5b), urinary N-terminal telopeptide of type-I collagen (NTX), and BMD of the lumbar 1–4 vertebrae (L-BMD) and bilateral hips (H-BMD) at baseline and at 12, 24, 36, and 48 months of treatment. There were no significant differences in patient background. Serum BAP, TRACP-5b, and urinary NTX were significantly and comparably inhibited in both groups from 12 to 48 months versus baseline values. L-BMD was significantly increased at every time point in both groups, while H-BMD was significantly increased at every time point in the combination group only. There were significant differences between the groups for L-BMD at 24, 36, and 48 months (P < 0.05) and for H-BMD at 12 months (P < 0.05). Compared with denosumab monotherapy, combination therapy of denosumab plus vitamin D and calcium significantly increased H-BMD at 12 months and L-BMD from 24 to 48 months. These findings indicate that continuous vitamin D and calcium supplementation is important, especially for 12 months to improve H-BMD and from 24 to 48 months to improve L-BMD.

  7. Significant improvement of olfactory performance in sleep apnea patients after three months of nasal CPAP therapy - Observational study and randomized trial.

    Directory of Open Access Journals (Sweden)

    Bettina Boerner

    The olfactory function highly impacts quality of life (QoL). Continuous positive airway pressure is an effective treatment for obstructive sleep apnea (OSA) and is often applied by nasal masks (nCPAP). The influence of nCPAP on the olfactory performance of OSA patients is unknown. The aim of this study was to assess the sense of smell before initiation of nCPAP and after three months of treatment, in moderate and severe OSA patients. The sense of smell was assessed in 35 patients suffering from daytime sleepiness and moderate to severe OSA (apnea/hypopnea index ≥ 15/h), with the aid of a validated test battery (Sniffin' Sticks), before initiation of nCPAP therapy and after three months of treatment. Additionally, adherent subjects were included in a double-blind randomized three-week CPAP-withdrawal trial (sub-therapeutic CPAP pressure). Twenty-five of the 35 patients used the nCPAP therapy for more than four hours per night, and for more than 70% of nights (adherent group). The olfactory performance of these patients improved significantly (p = 0.007) after three months of nCPAP therapy. When considering the entire group of patients, olfaction also improved significantly (p = 0.001). In the randomized phase, the sense of smell of six patients deteriorated under sub-therapeutic CPAP pressure (p = 0.046), whereas five patients in the maintenance CPAP group showed no significant difference (p = 0.501). Olfactory performance improved significantly after three months of nCPAP therapy in patients suffering from moderate and severe OSA. It seems that this effect of nCPAP is reversible under sub-therapeutic CPAP pressure. ISRCTN11128866.

  8. Effects of a community scorecard on improving the local health system in Eastern Democratic Republic of Congo: qualitative evidence using the most significant change technique.

    Science.gov (United States)

    Ho, Lara S; Labrecque, Guillaume; Batonon, Isatou; Salsi, Viviana; Ratnayake, Ruwan

    2015-01-01

    More than a decade of conflict has weakened the health system in the Democratic Republic of Congo and decreased its ability to respond to the needs of the population. Community scorecards have been conceived as a way to increase accountability and responsiveness of service providers, but there is limited evidence of their effects, particularly in fragile and conflict-affected contexts. This paper describes the implementation of community scorecards within a community-driven reconstruction project in two provinces of eastern Democratic Republic of Congo. Between June 2012 and November 2013, 45 stories of change in the health system were collected from village development committee members, health committee members, and community members (20 men and 18 women in total), and from healthcare providers (n = 7), in 25 sites using the Most Significant Change technique. Stories were analyzed qualitatively for content related to the types and mechanisms of change observed. The most salient changes were related to increased transparency and community participation in health facility management, and improved quality of care. Quality of care included increased access to services, improved patient-provider relationships, improved performance of service providers, and improved maintenance of physical infrastructure. Changes occurred through many different mechanisms including provider actions in response to information, pressure from community representatives, or supervisors; and joint action and improved collaboration by health facility committees and providers. Although it is often assumed that confrontation is a primary mechanism for citizens to change state-provided services, this study demonstrates that healthcare providers may also be motivated to change through other means. Positive experiences of community scorecards can provide a structured space for interface between community members and the health system, allowing users to voice their opinions and preferences and bridge information gaps for both

  9. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  10. PolySearch2: a significantly improved text-mining system for discovering associations between human diseases, genes, drugs, metabolites, toxins and more.

    Science.gov (United States)

    Liu, Yifeng; Liang, Yongjie; Wishart, David

    2015-07-01

    PolySearch2 (http://polysearch.ca) is an online text-mining system for identifying relationships between biomedical entities such as human diseases, genes, SNPs, proteins, drugs, metabolites, toxins, metabolic pathways, organs, tissues, subcellular organelles, positive health effects, negative health effects, drug actions, Gene Ontology terms, MeSH terms, ICD-10 medical codes, biological taxonomies and chemical taxonomies. PolySearch2 supports a generalized 'Given X, find all associated Ys' query, where X and Y can be selected from the aforementioned biomedical entities. An example query might be: 'Find all diseases associated with Bisphenol A'. To find its answers, PolySearch2 searches for associations in comprehensive free-text collections, including local versions of MEDLINE abstracts, PubMed Central full-text articles, Wikipedia full-text articles and US Patent application abstracts. PolySearch2 also searches 14 widely used, text-rich biological databases such as UniProt, DrugBank and the Human Metabolome Database to improve its accuracy and coverage. PolySearch2 maintains an extensive thesaurus of biological terms and exploits the latest search engine technology to rapidly retrieve relevant articles and database records. PolySearch2 also generates, ranks and annotates associative candidates and presents results with relevancy statistics and highlighted key sentences to facilitate user interpretation. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Building information for systematic improvement of the prevention of hospital-acquired pressure ulcers with statistical process control charts and regression.

    Science.gov (United States)

    Padula, William V; Mishra, Manish K; Weaver, Christopher D; Yilmaz, Taygan; Splaine, Mark E

    2012-06-01

    To demonstrate complementary results of regression and statistical process control (SPC) chart analyses for hospital-acquired pressure ulcers (HAPUs), and identify possible links between changes and opportunities for improvement between hospital microsystems and macrosystems. Ordinary least squares and panel data regression of retrospective hospital billing data, and SPC charts of prospective patient records for a US tertiary-care facility (2004-2007). A prospective cohort of hospital inpatients at risk for HAPUs was the study population. There were 337 HAPU incidences hospital-wide among 43 844 inpatients. A probit regression model predicted the correlation of age, gender and length of stay on HAPU incidence (pseudo R(2)=0.096). Panel data analysis determined that for each additional day in the hospital, there was a 0.28% increase in the likelihood of HAPU incidence. A p-chart of HAPU incidence showed a mean incidence rate of 1.17% remaining in statistical control. A t-chart showed the average time between events for the last 25 HAPUs was 13.25 days. There was one 57-day period between two incidences during the observation period. A p-chart addressing Braden scale assessments showed that 40.5% of all patients were risk-stratified for HAPUs upon admission. SPC charts complement standard regression analysis. SPC amplifies patient outcomes at the microsystem level and is useful for guiding quality improvement. Macrosystems should monitor effective quality improvement initiatives in microsystems and aid the spread of successful initiatives to other microsystems, followed by system-wide analysis with regression. Although HAPU incidence in this study is below the national mean, there is still room to improve HAPU incidence in this hospital setting since 0% incidence is theoretically achievable. Further assessment of pressure ulcer incidence could illustrate improvement in the quality of care and prevent HAPUs.
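
    A p-chart like the one described can be computed directly from monthly event counts and denominators: the centre line is the pooled proportion and the limits are three sigma with a per-month sample size. The counts below are invented, not the study's data.

    ```python
    # p-chart sketch for a monthly HAPU incidence proportion.
    # Monthly counts are invented; limits follow the standard 3-sigma p-chart formula.
    import numpy as np

    events = np.array([10,  8, 12,  9, 14, 11,  7, 13, 10,  9, 12, 11])            # HAPUs per month
    n      = np.array([900, 850, 910, 880, 950, 930, 870, 940, 900, 890, 920, 910])  # at-risk patients

    p = events / n
    p_bar = events.sum() / n.sum()                     # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)           # per-month sigma (varying n)
    ucl, lcl = p_bar + 3 * sigma, np.maximum(p_bar - 3 * sigma, 0)

    for i, (pi, u, l) in enumerate(zip(p, ucl, lcl), start=1):
        flag = " OUT OF CONTROL" if (pi > u or pi < l) else ""
        print(f"month {i:2d}: p = {pi:.4f}  (LCL {l:.4f}, UCL {u:.4f}){flag}")
    print(f"centre line p-bar = {p_bar:.4f}")
    ```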

  12. A facile template method to synthesize significantly improved LiNi0.5Mn1.5O4 using corn stalk as a bio-template

    International Nuclear Information System (INIS)

    Liu, Guiyang; Kong, Xin; Sun, Hongyan; Wang, Baosen; Yi, Zhongzhou; Wang, Quanbiao

    2014-01-01

    In order to simplify the template method for the synthesis of cathode materials for lithium ion batteries, a facile template method using plant stalks as bio-templates has been introduced. Based on this method, LiNi0.5Mn1.5O4 spinel with significantly improved electrochemical performance has been synthesized using corn stalk as a template. X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM) have been used to investigate the phase composition and micro-morphologies of the products. Charge-discharge measurements in lithium cells, cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS) have been used to study the electrochemical performance of the products. The results indicate that the templated product exhibits higher crystallinity than the non-templated product. Both the templated and the non-templated products are combinations of the ordered space group P4332 and the disordered Fd-3m. The specific BET surface area of the templated product is about twice that of the non-templated product. Moreover, the electrochemical performance of the templated product, including specific capacity, cycling stability and rate capability, is significantly improved compared with the non-templated product, due to its higher crystallinity, larger Li+ diffusion coefficient and lower charge transfer resistance.

  13. Transplantation of N-Acetyl Aspartyl-Glutamate Synthetase-Activated Neural Stem Cells after Experimental Traumatic Brain Injury Significantly Improves Neurological Recovery

    Directory of Open Access Journals (Sweden)

    Mingfeng Li

    2013-12-01

    Background/Aims: Neural stem cells (NSCs) hold considerable potential as a therapeutic tool for repair of the damaged nervous system. In the current study, we examined whether transplanted N-acetyl aspartyl-glutamate synthetase (NAAGS)-activated NSCs (NAAGS/NSCs) further improve neurological recovery following traumatic brain injury (TBI) in Sprague-Dawley rats. Methods: Animals received TBI and stereotactic injection of NSCs, NAAGS/NSCs, or phosphate-buffered saline without cells (control) into the injured cortex. NAAGS protein expression was detected through western blot analysis. Dialysate NAAG levels were analyzed with radioimmunoassay. Cell apoptosis was detected via TUNEL staining. The expression levels of specific pro-inflammatory cytokines were detected with enzyme-linked immunosorbent assay. Results: Groups with transplanted NSCs and NAAGS/NSCs displayed significant recovery of motor behavior compared to the control group. At 14 and 21 days post-transplantation, motor behavior in the NAAGS/NSC group was significantly better than that in the NSC group (p < 0.05). Conclusion: Our results collectively demonstrate that NAAGS/NSCs provide a more powerful autoplastic therapy for the injured nervous system.

  14. A quantitative analysis of statistical power identifies obesity end points for improved in vivo preclinical study design.

    Science.gov (United States)

    Selimkhanov, J; Thompson, W C; Guo, J; Hall, K D; Musante, C J

    2017-08-01

    The design of well-powered in vivo preclinical studies is a key element in building the knowledge of disease physiology for the purpose of identifying and effectively testing potential antiobesity drug targets. However, as a result of the complexity of the obese phenotype, there is limited understanding of the variability within and between study animals of macroscopic end points such as food intake and body composition. This, combined with limitations inherent in the measurement of certain end points, presents challenges to study design that can have significant consequences for an antiobesity program. Here, we analyze a large, longitudinal study of mouse food intake and body composition during diet perturbation to quantify the variability and interaction of the key metabolic end points. To demonstrate how conclusions can change as a function of study size, we show that a simulated preclinical study properly powered for one end point may lead to false conclusions based on secondary end points. We then propose the guidelines for end point selection and study size estimation under different conditions to facilitate proper power calculation for a more successful in vivo study design.
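
    The paper's central concern, how statistical power for a chosen endpoint varies with group size, can be illustrated by simulation: draw two groups under an assumed effect and count how often a t-test rejects. The effect size, variability, and group sizes below are invented for illustration.

    ```python
    # Simulation-based power estimate for a two-group comparison of one endpoint.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def power(n_per_group, effect=1.5, sd=3.0, alpha=0.05, n_sim=2000):
        hits = 0
        for _ in range(n_sim):
            a = rng.normal(0.0, sd, n_per_group)      # control, e.g. weight change (g)
            b = rng.normal(effect, sd, n_per_group)   # treated
            if stats.ttest_ind(a, b).pvalue < alpha:
                hits += 1
        return hits / n_sim

    for n in (5, 10, 20, 40, 80):
        print(f"n = {n:3d} per group -> power {power(n):.2f}")
    ```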

  15. Improving sensitivity of linear regression-based cell type-specific differential expression deconvolution with per-gene vs. global significance threshold.

    Science.gov (United States)

    Glass, Edmund R; Dozmorov, Mikhail G

    2016-10-06

    The goal of many human disease-oriented studies is to detect molecular mechanisms different between healthy controls and patients. Yet, commonly used gene expression measurements from blood samples suffer from variability of cell composition. This variability hinders the detection of differentially expressed genes and is often ignored. Combined with cell counts, heterogeneous gene expression may provide deeper insights into the gene expression differences on the cell type-specific level. Published computational methods use linear regression to estimate cell type-specific differential expression, and a global cutoff to judge significance, such as False Discovery Rate (FDR). Yet, they do not consider many artifacts hidden in high-dimensional gene expression data that may negatively affect linear regression. In this paper we quantify the parameter space affecting the performance of linear regression (sensitivity of cell type-specific differential expression detection) on a per-gene basis. We evaluated the effect of sample sizes, cell type-specific proportion variability, and mean squared error on sensitivity of cell type-specific differential expression detection using linear regression. Each parameter affected variability of cell type-specific expression estimates and, subsequently, the sensitivity of differential expression detection. We provide the R package, LRCDE, which performs linear regression-based cell type-specific differential expression (deconvolution) detection on a gene-by-gene basis. Accounting for variability around cell type-specific gene expression estimates, it computes per-gene t-statistics of differential detection, p-values, t-statistic-based sensitivity, group-specific mean squared error, and several gene-specific diagnostic metrics. The sensitivity of linear regression-based cell type-specific differential expression detection differed for each gene as a function of mean squared error, per group sample sizes, and variability of the proportions
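
    A stripped-down sketch of the deconvolution idea this record builds on (not the LRCDE package's actual interface): regress one gene's expression on measured cell proportions separately per group, then compare the per-cell-type coefficients with a t-like statistic. All simulated quantities below are invented.

    ```python
    # Cell type-specific differential expression by linear regression (sketch).
    # Expression of one gene is modeled as a proportion-weighted mixture of
    # cell type-specific expression levels, fit separately per group.
    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 30, 3                               # samples per group, cell types
    props_a = rng.dirichlet(np.ones(k), n)     # measured cell proportions, group A
    props_b = rng.dirichlet(np.ones(k), n)

    beta_a = np.array([5.0, 2.0, 1.0])         # "true" per-cell-type expression, group A
    beta_b = np.array([5.0, 4.0, 1.0])         # cell type 1 differentially expressed
    expr_a = props_a @ beta_a + rng.normal(0, 0.3, n)
    expr_b = props_b @ beta_b + rng.normal(0, 0.3, n)

    def fit(X, y):
        b, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        mse = res[0] / (len(y) - X.shape[1])
        se = np.sqrt(mse * np.diag(np.linalg.inv(X.T @ X)))
        return b, se

    ba, sa = fit(props_a, expr_a)
    bb, sb = fit(props_b, expr_b)
    t = (bb - ba) / np.sqrt(sa**2 + sb**2)     # per-cell-type t-like statistics
    for i in range(k):
        print(f"cell type {i}: beta_A {ba[i]:5.2f}, beta_B {bb[i]:5.2f}, t {t[i]:6.2f}")
    ```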

  16. Mobile physician reporting of clinically significant events-a novel way to improve handoff communication and supervision of resident on call activities.

    Science.gov (United States)

    Nabors, Christopher; Peterson, Stephen J; Aronow, Wilbert S; Sule, Sachin; Mumtaz, Arif; Shah, Tushar; Eskridge, Etta; Wold, Eric; Stallings, Gary W; Burak, Kathleen Kelly; Goldberg, Randy; Guo, Gary; Sekhri, Arunabh; Mathew, George; Khera, Sahil; Montoya, Jessica; Sharma, Mala; Paudel, Rajiv; Frishman, William H

    2014-12-01

    Reporting of clinically significant events represents an important mechanism by which patient safety problems may be identified and corrected. However, time pressure and cumbersome report entry procedures have discouraged the full participation of physicians. To improve the process, our internal medicine training program developed an easy-to-use mobile platform that combines the reporting process with patient sign-out. Between August 25, 2011, and January 25, 2012, our trainees entered clinically significant events into iPod touch/iPhone/iPad-based devices functioning in wireless synchrony with our desktop application. Events were collected into daily reports that were sent from the handoff system to program leaders and attending physicians to plan for rounds and to correct safety problems. Using the mobile module, residents entered 31 reportable events per month versus the 12 events per month that were reported via desktop during a previous 6-month study period. Advances in information technology now permit clinically significant events that take place during "off hours" to be identified and reported (via handoff) to next providers and to supervisors via collated reports. This information permits hospital leaders to correct safety issues quickly and effectively, while attending physicians are able to use information gleaned from the reports to optimize rounding plans and to provide additional oversight of trainee on-call patient management decisions.

  17. Improved estimation of the noncentrality parameter distribution from a large number of t-statistics, with applications to false discovery rate estimation in microarray data analysis.

    Science.gov (United States)

    Qu, Long; Nettleton, Dan; Dekkers, Jack C M

    2012-12-01

    Given a large number of t-statistics, we consider the problem of approximating the distribution of noncentrality parameters (NCPs) by a continuous density. This problem is closely related to the control of false discovery rates (FDR) in massive hypothesis testing applications, e.g., microarray gene expression analysis. Our methodology is similar to, but improves upon, the existing approach by Ruppert, Nettleton, and Hwang (2007, Biometrics, 63, 483-495). We provide parametric, nonparametric, and semiparametric estimators for the distribution of NCPs, as well as estimates of the FDR and local FDR. In the parametric situation, we assume that the NCPs follow a distribution that leads to an analytically available marginal distribution for the test statistics. In the nonparametric situation, we use convex combinations of basis density functions to estimate the density of the NCPs. A sequential quadratic programming procedure is developed to maximize the penalized likelihood. The smoothing parameter is selected with the approximate network information criterion. A semiparametric estimator is also developed to combine both parametric and nonparametric fits. Simulations show that, under a variety of situations, our density estimates are closer to the underlying truth and our FDR estimates are improved compared with alternative methods. Data-based simulations and the analyses of two microarray datasets are used to evaluate the performance in realistic situations. © 2012, The International Biometric Society.
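
    For orientation, the standard empirical ingredients behind such FDR machinery can be sketched in a few lines: estimate the null proportion pi0 from the p-value distribution and form a local FDR as pi0*f0(z)/f(z). This is not the NCP-mixture estimator of the paper, just a rough illustration on simulated statistics.

    ```python
    # Rough local-FDR sketch from a large set of test statistics.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    z = np.concatenate([rng.normal(0, 1, 9000),      # nulls
                        rng.normal(3, 1, 1000)])     # non-nulls

    p = 2 * stats.norm.sf(np.abs(z))
    pi0 = min(1.0, np.mean(p > 0.5) / 0.5)           # Storey-style pi0 estimate

    f = stats.gaussian_kde(z)                        # marginal density of all statistics
    grid = np.linspace(-4, 6, 11)
    lfdr = np.minimum(1.0, pi0 * stats.norm.pdf(grid) / f(grid))
    for g, l in zip(grid, lfdr):
        print(f"z = {g:5.1f}: local FDR ~ {l:.3f}")
    print(f"estimated pi0 = {pi0:.3f}")
    ```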

  18. An Unsupervised Method of Change Detection in Multi-Temporal PolSAR Data Using a Test Statistic and an Improved K&I Algorithm

    Directory of Open Access Journals (Sweden)

    Jinqi Zhao

    2017-12-01

    In recent years, multi-temporal imagery from spaceborne sensors has provided a fast and practical means for surveying and assessing changes in terrain surfaces. Owing to the all-weather imaging capability, polarimetric synthetic aperture radar (PolSAR) has become a key tool for change detection. Change detection methods include both unsupervised and supervised methods. Supervised change detection, which needs some human intervention, is generally ineffective and impractical. Due to this limitation, unsupervised methods are widely used in change detection. The traditional unsupervised methods only use a part of the polarization information, and the required thresholding algorithms are independent of the multi-temporal data, which results in the change detection map being ineffective and inaccurate. To solve these problems, a novel method of change detection using a test statistic based on the likelihood ratio test and the improved Kittler and Illingworth (K&I) minimum-error thresholding algorithm is introduced in this paper. The test statistic is used to generate the comparison image (CI) of the multi-temporal PolSAR images, and the improved K&I algorithm, using a generalized Gaussian model, simulates the distribution of the CI. As a result of these advantages, we can obtain the change detection map using an optimum threshold. The efficiency of the proposed method is demonstrated by the use of multi-temporal PolSAR images acquired by RADARSAT-2 over Wuhan, China. The experimental results show that the proposed method is effective and highly accurate.
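
    The K&I thresholding step can be illustrated with the classic Gaussian form of the criterion (the paper substitutes a generalized Gaussian model for the comparison image); the bimodal data below are invented.

    ```python
    # Kittler-Illingworth minimum-error thresholding on a histogram (Gaussian form).
    import numpy as np

    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(60, 10, 8000), rng.normal(140, 15, 2000)])
    hist, edges = np.histogram(data, bins=256, range=(0, 255))
    h = hist / hist.sum()
    x = 0.5 * (edges[:-1] + edges[1:])               # bin centres

    best_t, best_j = None, np.inf
    for t in range(1, 255):
        w1, w2 = h[:t].sum(), h[t:].sum()            # class weights
        if w1 < 1e-6 or w2 < 1e-6:
            continue
        m1 = (h[:t] * x[:t]).sum() / w1
        m2 = (h[t:] * x[t:]).sum() / w2
        v1 = (h[:t] * (x[:t] - m1) ** 2).sum() / w1  # class variances
        v2 = (h[t:] * (x[t:] - m2) ** 2).sum() / w2
        if v1 < 1e-6 or v2 < 1e-6:
            continue
        # K&I criterion: J(t) = 1 + 2(w1 ln s1 + w2 ln s2) - 2(w1 ln w1 + w2 ln w2)
        j = (1 + 2 * (w1 * np.log(np.sqrt(v1)) + w2 * np.log(np.sqrt(v2)))
               - 2 * (w1 * np.log(w1) + w2 * np.log(w2)))
        if j < best_j:
            best_t, best_j = x[t], j
    print(f"minimum-error threshold ~ {best_t:.1f}")
    ```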

  19. Scissor-type knife significantly improves self-completion rate of colorectal endoscopic submucosal dissection: Single-center prospective randomized trial.

    Science.gov (United States)

    Yamashina, Takeshi; Takeuchi, Yoji; Nagai, Kengo; Matsuura, Noriko; Ito, Takashi; Fujii, Mototsugu; Hanaoka, Noboru; Higashino, Koji; Uedo, Noriya; Ishihara, Ryu; Iishi, Hiroyasu

    2017-05-01

    Colorectal endoscopic submucosal dissection (C-ESD) is recognized as a difficult procedure. Recently, scissors-type knives were launched to reduce the difficulty of C-ESD. The aim of this study was to evaluate the efficacy and safety of the combined use of a scissors-type knife and a needle-type knife with a water-jet function (WJ needle-knife) for C-ESD compared with using the WJ needle-knife alone. This was a prospective randomized controlled trial in a referral center. Eighty-five patients with superficial colorectal neoplasms were enrolled and randomly assigned to undergo C-ESD using a WJ needle-knife alone (Flush group) or a scissor-type knife-supported WJ needle-knife (SB Jr group). Procedures were conducted by two supervised residents. The primary endpoint was the self-completion rate by the residents. The self-completion rate was 67% in the SB Jr group, which was significantly higher than that in the Flush group (39%, P = 0.01). Even after exclusion of four patients in the SB Jr group in whom C-ESD was completed using the WJ needle-knife alone, the self-completion rate was significantly higher (63% vs 39%; P = 0.03). Median procedure time among the self-completion cases did not differ significantly between the two groups (59 vs 51 min; P = 0.14). No fatal adverse events were observed in either group. In this single-center phase II trial, the scissor-type knife significantly improved residents' self-completion rate for C-ESD, with no increase in procedure time or adverse events. A multicenter trial would be warranted to confirm the validity of the present study. © 2016 Japan Gastroenterological Endoscopy Society.

  20. Call-to-balloon time dashboard in patients with ST-segment elevation myocardial infarction results in significant improvement in the logistic chain.

    Science.gov (United States)

    Hermans, Maaike P J; Velders, Matthijs A; Smeekes, Martin; Drexhage, Olivier S; Hautvast, Raymond W M; Ytsma, Timon; Schalij, Martin J; Umans, Victor A W M

    2017-08-04

    Timely reperfusion with primary percutaneous coronary intervention (pPCI) in ST-segment elevation myocardial infarction (STEMI) patients is associated with superior clinical outcomes. Aiming to reduce ischaemic time, an innovative system for home-to-hospital (H2H) time monitoring was implemented, which enabled real-time evaluation of ischaemic time intervals, regular feedback and improvements in the logistic chain. The objective of this study was to assess the results after implementation of the H2H dashboard for monitoring and evaluation of ischaemic time in STEMI patients. Ischaemic time in STEMI patients transported by emergency medical services (EMS) and treated with pPCI in the Noordwest Ziekenhuis, Alkmaar before (2008-2009; n=495) and after the implementation of the H2H dashboard (2011-2014; n=441) was compared. Median time intervals were significantly shorter in the H2H group (door-to-balloon time 32 [IQR 25-43] vs. 40 [IQR 28-55] minutes, p < 0.05), and the H2H dashboard was independently associated with shorter time delays. Real-time monitoring and feedback on time delay with the H2H dashboard improves the logistic chain in STEMI patients, resulting in shorter ischaemic time intervals.

  1. Improving significantly the failure strain and work hardening response of LPSO-strengthened Mg-Y-Zn-Al alloy via hot extrusion speed control

    Science.gov (United States)

    Tan, Xinghe; Chee, Winston; Chan, Jimmy; Kwok, Richard; Gupta, Manoj

    2017-07-01

    The effect of hot extrusion speed on the microstructure and mechanical properties of MgY1.06Zn0.76Al0.42 (at%) alloy strengthened by the novel long-period stacking ordered (LPSO) phase was systematically investigated. Increasing the extrusion speed accelerated dynamic recrystallization of α-Mg via particle-stimulated nucleation and grain growth in the alloy. The intensive recrystallization and grain-growth events weakened the conventional basal texture and Hall-Petch strengthening in the alloy, which led to a significant improvement in its failure strain from 4.9% to 19.6%. The critical strengthening contribution from the LPSO phase, known for attributing high strength to the alloy, was observed to be greatly undermined by the parallel competition from texture weakening and the adverse Hall-Petch effect when the alloy was extruded at higher speed. The absence of work hardening observed in the alloy extruded at lower speed was discussed in terms of its ultra-fine-grained microstructure, which promoted a steady-state defect density in which dislocation annihilation balances out the generation of new dislocations during plastic deformation. One approach to improving the work-hardening response of the alloy, to prevent unstable deformation and abrupt failure in service, is to increase the grain diameter in the alloy by judiciously increasing the extrusion speed.

  2. Functional abilities and cognitive decline in adult and aging intellectual disabilities. Psychometric validation of an Italian version of the Alzheimer's Functional Assessment Tool (AFAST): analysis of its clinical significance with linear statistics and artificial neural networks.

    Science.gov (United States)

    De Vreese, L P; Gomiero, T; Uberti, M; De Bastiani, E; Weger, E; Mantesso, U; Marangoni, A

    2015-04-01

    (a) A psychometric validation of an Italian version of the Alzheimer's Functional Assessment Tool scale (AFAST-I), designed for informant-based assessment of the degree of impairment and of assistance required in seven basic daily activities in adult/elderly people with intellectual disabilities (ID) and (suspected) dementia; (b) a pilot analysis of its clinical significance with traditional statistical procedures and with an artificial neural network. AFAST-I was administered to the professional caregivers of 61 adults/seniors with ID with a mean age (± SD) of 53.4 (± 7.7) years (36% with Down syndrome). Internal consistency (Cronbach's α coefficient), inter-/intra-rater reliabilities (intra-class coefficients, ICC) and concurrent, convergent and discriminant validity (Pearson's r coefficients) were computed. Clinical significance was probed by analysing the relationships between AFAST-I scores and the Sum of Cognitive Scores (SCS) and the Sum of Social Scores (SOS) of the Dementia Questionnaire for Persons with Intellectual Disabilities (DMR-I), after standardisation of their raw scores into equivalent scores (ES). An adaptive artificial system (AutoContractive Maps, AutoCM) was applied to all the variables recorded in the study sample, aimed at uncovering which variable occupies a central position and supports the entire network made up of the remaining variables interconnected among themselves with different weights. AFAST-I shows a high level of internal homogeneity, with a Cronbach's α coefficient of 0.92. Inter-rater and intra-rater reliabilities were also excellent, with ICC correlations of 0.96 and 0.93, respectively. The results of the analyses of the different AFAST-I validities all go in the expected direction: concurrent validity (r=-0.87 with ADL); convergent validity (r=0.63 with SCS; r=0.61 with SOS); discriminant validity (r=0.21 with the frequency of occurrence of dementia-related behavioural excesses of the Assessment for Adults with Developmental Disabilities).
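
    For reference, the internal-consistency figure quoted above is Cronbach's α = k/(k-1) · (1 - Σσ²_item/σ²_total). A minimal sketch on hypothetical item-response data, not the AFAST-I sample:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)      # per-item variances
        total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical ratings for 6 subjects on 7 daily-activity items (0-3
    # scale); illustrative only, not data from the study.
    scores = np.array([
        [0, 1, 0, 1, 0, 1, 0],
        [1, 1, 2, 1, 1, 2, 1],
        [2, 2, 2, 3, 2, 2, 2],
        [3, 3, 2, 3, 3, 3, 3],
        [1, 2, 1, 1, 2, 1, 1],
        [2, 3, 3, 2, 3, 2, 3],
    ])
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```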

  3. Forecasting experiments of a dynamical-statistical model of the sea surface temperature anomaly field based on the improved self-memorization principle

    Science.gov (United States)

    Hong, Mei; Chen, Xi; Zhang, Ren; Wang, Dong; Shen, Shuanghe; Singh, Vijay P.

    2018-04-01

    With the objective of tackling the problem of inaccurate long-term El Niño-Southern Oscillation (ENSO) forecasts, this paper develops a new dynamical-statistical forecast model of the sea surface temperature anomaly (SSTA) field. To avoid dependence on a single set of initial prediction values, a self-memorization principle is introduced to improve the dynamical reconstruction model, making the model more appropriate for describing chaotic systems such as ENSO events. The improved dynamical-statistical model of the SSTA field is used to predict SSTA in the equatorial eastern Pacific and during El Niño and La Niña events. The long-term step-by-step forecast results and cross-validated retroactive hindcast results for time series T1 and T2 are satisfactory, with a Pearson correlation coefficient of approximately 0.80 and a mean absolute percentage error (MAPE) of less than 15 %. The corresponding forecast SSTA field is accurate in that not only is the forecast shape similar to the actual field but the contour lines are also essentially the same. The model can also be used to forecast the ENSO index, with a temporal correlation coefficient of 0.8062 and a small MAPE of 19.55 %. The difference between forecast results in spring and those in autumn is small, indicating that the improved model can overcome the spring predictability barrier to some extent. Compared with six mature models published previously, the present model has an advantage in prediction precision and forecast length, and is a novel exploration of ENSO forecast methods.
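
    The skill scores quoted here (Pearson correlation and MAPE) are standard forecast-verification metrics. A minimal sketch of how they are computed for a forecast/observation pair, on synthetic series rather than the study's SSTA data:

    ```python
    import numpy as np

    def verify(obs, fcst):
        """Pearson correlation and mean absolute percentage error (MAPE, %)."""
        obs, fcst = np.asarray(obs, float), np.asarray(fcst, float)
        r = np.corrcoef(obs, fcst)[0, 1]
        mape = 100.0 * np.mean(np.abs((fcst - obs) / obs))
        return r, mape

    # Synthetic series with a non-zero baseline (MAPE is undefined where
    # the observation is zero); illustrative only.
    rng = np.random.default_rng(0)
    obs = 1.0 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, 48))
    fcst = obs + rng.normal(0, 0.1, obs.size)
    r, mape = verify(obs, fcst)
    print(f"r = {r:.2f}, MAPE = {mape:.1f} %")
    ```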

  4. MO-FG-204-03: Using Edge-Preserving Algorithm for Significantly Improved Image-Domain Material Decomposition in Dual Energy CT

    International Nuclear Information System (INIS)

    Zhao, W; Niu, T; Xing, L; Xiong, G; Elmore, K; Min, J; Zhu, J; Wang, L

    2015-01-01

    Purpose: To significantly improve dual energy CT (DECT) imaging by establishing a new theoretical framework for image-domain material decomposition with incorporation of edge-preserving techniques. Methods: The proposed algorithm, HYPR-NLM, combines the edge-preserving non-local mean filter (NLM) with the HYPR-LR (Local HighlY constrained backPRojection Reconstruction) framework. Image denoising within the HYPR-LR framework depends on the noise level of the composite image, which is the average of the different energy images; for DECT, the composite image is the average of the high- and low-energy images. To further reduce noise, one may increase the window size of the HYPR-LR filter, but this leads to resolution degradation. By incorporating NLM filtering into the HYPR-LR framework, HYPR-NLM reduces material decomposition noise using energy information redundancies as well as the non-local mean. We demonstrate the noise reduction and resolution preservation of the algorithm with both an iodine concentration numerical phantom and clinical patient data, comparing the HYPR-NLM algorithm to direct matrix inversion, HYPR-LR and iterative image-domain material decomposition (Iter-DECT). Results: The results show the iterative material decomposition method reduces noise to the lowest level and provides improved DECT images. HYPR-NLM significantly reduces noise while preserving the accuracy of quantitative measurement and resolution. For the iodine concentration numerical phantom, the averaged noise levels are about 2.0, 0.7, 0.2 and 0.4 for direct inversion, HYPR-LR, Iter-DECT and HYPR-NLM, respectively. For the patient data, the noise levels of the water images are about 0.36, 0.16, 0.12 and 0.13 for direct inversion, HYPR-LR, Iter-DECT and HYPR-NLM, respectively. Difference images of both HYPR-LR and Iter-DECT show an edge effect, while no significant edge effect is seen for HYPR-NLM, suggesting spatial resolution is well preserved. Conclusion: HYPR-NLM reduces noise while preserving quantitative accuracy and spatial resolution.
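
    The "direct matrix inversion" baseline referred to above solves, per pixel, a small linear system mapping the high- and low-energy attenuation values to two basis-material fractions. A minimal sketch of that baseline (not the HYPR-NLM algorithm itself), with a placeholder 2x2 mixing matrix rather than calibrated coefficients:

    ```python
    import numpy as np

    # Illustrative mixing matrix: columns are the (low-kVp, high-kVp)
    # attenuation per unit of each basis material (e.g. water, iodine).
    # Placeholder values, not calibration data from the study.
    A = np.array([[1.00, 4.50],
                  [1.00, 2.50]])
    A_inv = np.linalg.inv(A)

    def decompose(img_low, img_high):
        """Per-pixel direct matrix inversion: (low, high) -> material images."""
        stacked = np.stack([img_low.ravel(), img_high.ravel()])  # (2, n_pix)
        materials = A_inv @ stacked                              # (2, n_pix)
        return materials.reshape((2,) + img_low.shape)

    # Tiny synthetic example. Note how noise in the inputs is amplified by
    # the inversion, which is what HYPR-LR / NLM-style filtering mitigates.
    rng = np.random.default_rng(1)
    low = 1.20 + rng.normal(0, 0.05, (4, 4))
    high = 1.05 + rng.normal(0, 0.05, (4, 4))
    water, iodine = decompose(low, high)
    print(water.round(2))
    ```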

  5. Usage Statistics

    Science.gov (United States)

    MedlinePlus usage statistics (https://medlineplus.gov/usestatistics.html): quarterly user statistics reporting page views and unique visitors per quarter, beginning with Oct-Dec 1998.

  6. YH12852, a potent and highly selective 5-HT4 receptor agonist, significantly improves both upper and lower gastrointestinal motility in a guinea pig model of postoperative ileus.

    Science.gov (United States)

    Hussain, Z; Lee, Y J; Yang, H; Jeong, E J; Sim, J Y; Park, H

    2017-10-01

    Postoperative ileus (POI) is a transient gastrointestinal (GI) dysmotility that commonly develops after abdominal surgery. YH12852, a novel, potent and highly selective 5-hydroxytryptamine 4 (5-HT4) receptor agonist, has been shown to improve both upper and lower GI motility in various animal studies and may have applications for the treatment of POI. Here, we investigated the effects and mechanism of action of YH12852 in a guinea pig model of POI to explore its therapeutic potential. The guinea pig model of POI was created by laparotomy, evisceration, and gentle manipulation of the cecum for 60 seconds, followed by closure with sutures under anesthesia. Group 1 received an oral administration of vehicle or YH12852 (1, 3, 10 or 30 mg/kg) only, while POI Group 2 was intraperitoneally pretreated with vehicle or the 5-HT4 receptor antagonist GR113808 (10 mg/kg) prior to oral dosing of vehicle or YH12852 (3 or 10 mg/kg). Upper GI transit was evaluated by assessing the migration of a charcoal mixture in the small intestine, while lower GI transit was assessed via measurement of fecal pellet output (FPO). YH12852 significantly accelerated upper and lower GI transit at doses of 3, 10, and 30 mg/kg and reached its maximal effect at 10 mg/kg. These effects were significantly blocked by pretreatment with GR113808 10 mg/kg. Oral administration of YH12852 significantly accelerates and restores delayed upper and lower GI transit in a guinea pig model of POI. This drug may serve as a useful candidate for the treatment of postoperative ileus. © 2017 John Wiley & Sons Ltd.

  7. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  8. Frog Statistics

    Science.gov (United States)

    Whole Frog Project and Virtual Frog Dissection statistics: wwwstats web-server output, with duplicate or extraneous accesses excluded; the reported figures under-represent the total bytes requested.

  9. Successfully reducing newborn asphyxia in the labour unit in a large academic medical centre: a quality improvement project using statistical process control.

    Science.gov (United States)

    Hollesen, Rikke von Benzon; Johansen, Rie Laurine Rosenthal; Rørbye, Christina; Munk, Louise; Barker, Pierre; Kjaerbye-Thygesen, Anette

    2018-02-03

    A safe delivery is part of a good start in life, and a continuous focus on preventing harm during delivery is crucial, even in settings with a good safety record. In January 2013, the labour unit at Copenhagen University Hospital, Hvidovre, undertook a quality improvement (QI) project to prevent asphyxia and reduced the percentage of newborns with asphyxia by 48%. The change theory consisted of two primary elements: (1) the clinical content, including three clinical bundles of evidence-based care, a 'delivery bundle', an 'oxytocin bundle' and a 'vacuum extraction bundle'; (2) an implementation theory, including improving skills in interpretation of cardiotocography, use of QI methods and participation in a national learning network. The Model for Improvement and Deming's system of profound knowledge were used as a methodological framework. Data on compliance with the care bundles and on the number of deliveries between newborns with asphyxia (pH <7 and/or Apgar <7 after 5 min) were analysed using statistical process control. Compliance with all three clinical care bundles improved to 95% or more, and the percentages of newborns with pH <7 and Apgar <7 after 5 min were reduced by 48% and 31%, respectively. In general, the QI approach strengthened multidisciplinary teamwork, systematised workflow and structured communication around the deliveries. Changes included making a standard memo in the medical record, the use of a bedside whiteboard, bedside handovers, shared decisions with a peer when using an oxytocin infusion and the use of a checklist before vacuum extractions. This QI project illustrates how aspects of patient safety, such as the prevention of asphyxia, can be improved using QI methods to more reliably implement best practice, even in high-performing systems. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
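
    "Deliveries between newborns with asphyxia" is the classic rare-event measure tracked on a g-chart in statistical process control. A minimal sketch of g-chart limits, using the common geometric-distribution approximation and synthetic counts rather than the study's data:

    ```python
    import numpy as np

    def g_chart_limits(counts):
        """Centre line and upper control limit for a g-chart.

        counts: number of deliveries between consecutive asphyxia events.
        Uses the common geometric approximation
        UCL = g_bar + 3 * sqrt(g_bar * (g_bar + 1)).
        """
        g_bar = np.mean(counts)
        ucl = g_bar + 3.0 * np.sqrt(g_bar * (g_bar + 1.0))
        return g_bar, ucl

    # Synthetic 'deliveries between events' counts, illustrative only; a
    # sustained run of points above the centre line signals improvement
    # (adverse events are occurring further apart).
    counts = [40, 55, 38, 62, 90, 110, 95, 130]
    cl, ucl = g_chart_limits(counts)
    print(f"centre line = {cl:.1f}, UCL = {ucl:.1f}")
    ```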

  10. In vivo topical application of acetyl aspartic acid increases fibrillin-1 and collagen IV deposition leading to a significant improvement of skin firmness.

    Science.gov (United States)

    Gillbro, J M; Merinville, E; Cattley, K; Al-Bader, T; Hagforsen, E; Nilsson, M; Mavon, A

    2015-10-01

    Acetyl aspartic acid (A-A-A) was discovered through gene array analysis with corresponding Cmap analysis. We found that A-A-A increased keratinocyte regeneration, inhibited dermal matrix metalloprotease (MMP) expression and relieved fibroblast stiffness through reduction of the fibroblast stiffness marker F-actin. Dermal absorption studies showed successful delivery to both the epidermal and dermal regions, and an in-use trial demonstrated that 1% A-A-A was well tolerated. In this study, the aim was to investigate whether A-A-A could stimulate the synthesis of extracellular matrix supporting proteins in vivo and thereby improve the viscoelastic properties of human skin, by conducting a dual histological and biophysical clinical study. Two separate double-blind vehicle-controlled in vivo studies were conducted using a 1% A-A-A containing oil-in-water (o/w) emulsion. In the histological study, 16 female volunteers (>55 years of age) exhibiting photodamaged skin on their forearm were included, investigating the effect of a 12-day treatment of A-A-A on collagen IV (COLIV) and fibrillin-1. In a subsequent pilot study, 0.1% retinol was used for comparison to A-A-A (1%). The biomechanical properties of the skin were assessed in a panel of 16 women (>45 years of age) using the standard Cutometer MPA580 after topical application of the test products for 28 days. The use of multiple suction enabled the assessment of F4, an area parameter specifically representing skin firmness. Twelve-day topical application of 1% A-A-A significantly increased COLIV and fibrillin by 13% and 6%, respectively, compared to vehicle. 1% A-A-A and 0.1% retinol were found to significantly reduce F4 after 28 days of treatment, by 15.8% and 14.7%, respectively, in the pilot Cutometer study. No significant difference was found between retinol and A-A-A. However, only A-A-A exhibited a significant effect vs. vehicle on skin firmness, indicating the incremental benefit of A-A-A as a skin-firming ingredient.

  11. Low- and high-volume of intensive endurance training significantly improves maximal oxygen uptake after 10-weeks of training in healthy men.

    Directory of Open Access Journals (Sweden)

    Arnt Erik Tjønna

    Regular exercise training improves maximal oxygen uptake (VO2max), but the optimal intensity and volume necessary to obtain maximal benefit remain to be defined. A growing body of evidence suggests that exercise training with low volume but high intensity may be a time-efficient means to achieve health benefits. In the present study, we measured changes in VO2max and traditional cardiovascular risk factors after a 10-wk training protocol that involved three weekly high-intensity interval sessions. One group followed a protocol which consisted of 4×4 min at 90% of maximal heart rate (HRmax) interspersed with 3 min active recovery at 70% HRmax (4-AIT); the other group performed a single-bout protocol that consisted of 1×4 min at 90% HRmax (1-AIT). Twenty-six inactive but otherwise healthy overweight men (BMI: 25-30, age: 35-45 y) were randomized to either 1-AIT (n = 11) or 4-AIT (n = 13). After training, VO2max increased by 10% (∼5.0 mL·kg⁻¹·min⁻¹) and 13% (∼6.5 mL·kg⁻¹·min⁻¹) after 1-AIT and 4-AIT, respectively (group difference, p = 0.08). Oxygen cost during running at a sub-maximal workload was reduced by 14% and 13% after 1-AIT and 4-AIT, respectively. Systolic blood pressure decreased by 7.1 and 2.6 mmHg after 1-AIT and 4-AIT respectively, while diastolic pressure decreased by 7.7 and 6.1 mmHg (group difference, p = 0.84). Both groups had a similar ∼5% decrease in fasting glucose. Body fat, total cholesterol, LDL-cholesterol, and ox-LDL cholesterol were significantly reduced only after 4-AIT. Our data suggest that a single bout of AIT performed three times per week may be a time-efficient strategy to improve VO2max and reduce blood pressure and fasting glucose in previously inactive but otherwise healthy middle-aged individuals. The 1-AIT type of exercise training may be readily implemented as part of activities of daily living and could easily be translated into programs designed to improve public health.

  12. Significant Improvement of Puncture Accuracy and Fluoroscopy Reduction in Percutaneous Transforaminal Endoscopic Discectomy With Novel Lumbar Location System: Preliminary Report of Prospective Hello Study.

    Science.gov (United States)

    Fan, Guoxin; Guan, Xiaofei; Zhang, Hailong; Wu, Xinbo; Gu, Xin; Gu, Guangfei; Fan, Yunshan; He, Shisheng

    2015-12-01

    Prospective nonrandomized control study. The study aimed to investigate the utility of the HE's Lumbar LOcation (HELLO) system in improving puncture accuracy and reducing fluoroscopy in percutaneous transforaminal endoscopic discectomy (PTED). Percutaneous transforaminal endoscopic discectomy is one of the most popular minimally invasive spine surgeries and depends heavily on repeated fluoroscopy; increased fluoroscopy induces higher radiation exposure to surgeons and patients. Accurate puncture in PTED can be achieved by accurate preoperative location and a definite trajectory. The HELLO system mainly consists of a self-made surface locator and a puncture-assisted device: the surface locator is used to identify the exact puncture target and the puncture-assisted device to optimize the puncture trajectory. Patients who had single L4/5 or L5/S1 lumbar intervertebral disc herniation and underwent PTED were included in the study. Patients receiving the HELLO system were assigned to Group A, and those undergoing the conventional method were assigned to Group B. The primary endpoints were puncture times and fluoroscopic times; the secondary endpoints were location time and operation time. A total of 62 patients who received PTED were included in this study. The average age was 45.35 ± 8.70 years in Group A and 46.61 ± 7.84 years in Group B (P = 0.552). There were no significant differences in gender, body mass index, conservative time, and surgical segment between the 2 groups (P > 0.05). The puncture times were 1.19 ± 0.48 in Group A and 6.03 ± 1.87 in Group B, a significant difference. The advantage of the HELLO system is accurate preoperative location and a definite trajectory. This preliminary report indicates that the HELLO system significantly improves the puncture accuracy of PTED and reduces fluoroscopic times, preoperative location time, and operation time. (ChiCTR-ICR-15006730).

  13. Prospective trial of angiography and embolization for all grade III to V blunt splenic injuries: nonoperative management success rate is significantly improved.

    Science.gov (United States)

    Miller, Preston R; Chang, Michael C; Hoth, J Jason; Mowery, Nathan T; Hildreth, Amy N; Martin, R Shayn; Holmes, James H; Meredith, J Wayne; Requarth, Jay A

    2014-04-01

    Nonoperative management (NOM) of blunt splenic injury is well accepted. Substantial failure rates in higher injury grades remain common, with one large study reporting rates of 19.6%, 33.3%, and 75% for grades III, IV, and V, respectively. Retrospective data show angiography and embolization can increase salvage rates in these severe injuries. We developed a protocol requiring referral of all blunt splenic injuries, grades III to V, without indication for immediate operation for angiography and embolization. We hypothesized that angiography and embolization of high-grade blunt splenic injury would reduce NOM failure rates in this population. This was a prospective study at our Level I trauma center as part of a performance-improvement project. Demographics, injury characteristics, and outcomes were compared with historic controls. The protocol required all stable patients with grade III to V splenic injuries be referred for angiography and embolization. In historic controls, referral was based on surgeon preference. From January 1, 2010 to December 31, 2012, there were 168 patients with grades III to V spleen injuries admitted; NOM was undertaken in 113 (67%) patients. The protocol was followed in 97 patients, with a failure rate of 5%. Failure rate in the 16 protocol deviations was 25% (p = 0.02). Historic controls from January 1, 2007 to December 31, 2009 were compared with the protocol group. One hundred and fifty-three patients with grade III to V injuries were admitted during this period, 80 (52%) patients underwent attempted NOM. Failure rate was significantly higher than for the protocol group (15%, p = 0.04). Use of a protocol requiring angiography and embolization for all high-grade spleen injuries slated for NOM leads to a significantly decreased failure rate. We recommend angiography and embolization as an adjunct to NOM for all grade III to V splenic injuries. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  14. Significantly improving the yield of recombinant proteins in Bacillus subtilis by a novel powerful mutagenesis tool (ARTP): Alkaline α-amylase as a case study.

    Science.gov (United States)

    Ma, Yingfang; Yang, Haiquan; Chen, Xianzhong; Sun, Bo; Du, Guocheng; Zhou, Zhemin; Song, Jiangning; Fan, You; Shen, Wei

    2015-10-01

    In this study, atmospheric and room temperature plasma (ARTP), a promising mutation breeding technique, was successfully applied to generate Bacillus subtilis mutants that yielded large quantities of recombinant protein. A high-throughput screening platform was implemented to select the mutants with the highest yield of recombinant alkaline α-amylase (AMY), including the preferred mutant B. subtilis WB600 mut-12#. The yield and productivity of recombinant AMY in B. subtilis WB600 mut-12# increased by 35.0% and 8.8%, respectively, and its extracellular protein concentration increased by 37.9%. B. subtilis WB600 mut-12# exhibited good genetic stability. Cells from B. subtilis WB600 mut-12# became shorter and wider than those from the wild-type. This study is the first to report a novel powerful mutagenesis tool (ARTP) that significantly improves the yield of recombinant proteins in B. subtilis, and ARTP may therefore play an important role in achieving high expression levels of proteins in recombinant microbial hosts. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Cross accumulative roll bonding—A novel mechanical technique for significant improvement of stir-cast Al/Al2O3 nanocomposite properties

    International Nuclear Information System (INIS)

    Ardakani, Mohammad Reza Kamali; Amirkhanlou, Sajjad; Khorsand, Shohreh

    2014-01-01

    Lightweight metal-matrix nanocomposites (MMNCs—metal matrix with nanosized ceramic particles) can be of significance for automobile, aerospace, and numerous other applications. There are some problems in obtaining suitable mechanical properties of MMNCs, including weak bonding between reinforcement and matrix, non-uniformity of reinforcement nanoparticles and high porosity content. In this study, an aluminum/alumina nanocomposite was fabricated by the stir-casting method. Subsequently, the cross accumulative roll bonding (CARB) process was used as an effective method for refinement of the microstructure and improvement of the mechanical properties. The microstructural evolution and the mechanical properties of the nanocomposites during various CARB cycles were examined by the Archimedes method, X-ray diffractometry, scanning electron microscopy and tensile testing. The results showed that the microstructure of the nanocomposite after eight cycles of CARB had an excellent distribution of alumina nanoparticles in the aluminum matrix without any remarkable porosity. The X-ray diffraction results showed that the crystallite size of the nanocomposite was 71 nm after eight cycles of the CARB technique. Mechanical testing also indicated that the ultimate tensile strength and the elongation of the nanocomposite increased as the number of CARB cycles increased. After eight CARB cycles, the ultimate tensile strength and elongation values reached 344 MPa and 6.4%, which were 3.13 and 3.05 times greater than those of the as-cast nanocomposite, respectively.

  16. Physiologically-based, predictive analytics using the heart-rate-to-Systolic-Ratio significantly improves the timeliness and accuracy of sepsis prediction compared to SIRS.

    Science.gov (United States)

    Danner, Omar K; Hendren, Sandra; Santiago, Ethel; Nye, Brittany; Abraham, Prasad

    2017-04-01

    Enhancing the efficiency of diagnosis and treatment of severe sepsis by using physiologically-based, predictive analytical strategies has not been fully explored. We hypothesized that assessment of the heart-rate-to-systolic ratio significantly increases the timeliness and accuracy of sepsis prediction after emergency department (ED) presentation. We evaluated the records of 53,313 ED patients from a large, urban teaching hospital between January and June 2015. The HR-to-systolic ratio was compared to SIRS criteria for sepsis prediction. There were 884 patients with discharge diagnoses of sepsis, severe sepsis, and/or septic shock. Variations in three presenting variables, heart rate, systolic BP and temperature, were determined to be primary early predictors of sepsis, with 74% (654/884) accuracy compared to 34% (304/884) using SIRS criteria (p < 0.0001) in confirmed septic patients. Physiologically-based predictive analytics improved the accuracy and expediency of sepsis identification via detection of variations in the HR-to-systolic ratio. This approach may lead to earlier sepsis workup and life-saving interventions. Copyright © 2017 Elsevier Inc. All rights reserved.
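
    The predictor itself is simply heart rate divided by systolic blood pressure, closely related to the classic shock index. A minimal sketch of a threshold rule; the 0.9 cut-off is a placeholder assumption, since the abstract does not report the threshold used in the study:

    ```python
    def hr_to_systolic_ratio(heart_rate: float, systolic_bp: float) -> float:
        """Heart-rate-to-systolic-BP ratio (akin to the shock index)."""
        return heart_rate / systolic_bp

    def flag_possible_sepsis(heart_rate: float, systolic_bp: float,
                             threshold: float = 0.9) -> bool:
        """Flag a patient for sepsis work-up when the ratio exceeds a cut-off.

        The threshold is a hypothetical illustration, not a value from the
        study; in practice it would be tuned against outcome data.
        """
        return hr_to_systolic_ratio(heart_rate, systolic_bp) >= threshold

    # Hypothetical triage vitals (heart rate in bpm, systolic BP in mmHg).
    for hr, sbp in [(118, 102), (84, 128)]:
        print(hr, sbp, "->", flag_possible_sepsis(hr, sbp))
    ```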

  17. Polyethylene imine/graphene oxide layer-by-layer surface functionalization for significantly improved limit of detection and binding kinetics of immunoassays on acrylate surfaces.

    Science.gov (United States)

    Miyazaki, Celina M; Mishra, Rohit; Kinahan, David J; Ferreira, Marystela; Ducrée, Jens

    2017-10-01

    Antibody immobilization on polymeric substrates is a key manufacturing step for microfluidic devices that implement sample-to-answer automation of immunoassays. In this work, a simple and versatile method to bio-functionalize poly(methylmethacrylate) (PMMA), a common material of such "Lab-on-a-Chip" systems, is proposed: using the Layer-by-Layer (LbL) technique, we assemble nanostructured thin films of poly(ethylene imine) (PEI) and graphene oxide (GO). The wettability of PMMA surfaces was significantly augmented by the surface treatment with the (PEI/GO)5 film, with an 81% reduction of the contact angle, while the surface roughness increased by 600%, thus clearly enhancing wettability and antibody binding capacity. When applied to enzyme-linked immunosorbent assays (ELISAs), the limit of detection was notably improved from 340 pg mL⁻¹ on commercial-grade polystyrene (PS) and 230 pg mL⁻¹ on plain PMMA surfaces to 130 pg mL⁻¹ on (PEI/GO)5-treated PMMA. Furthermore, the accelerated antibody adsorption kinetics on the LbL films of GO allowed incubation times to be substantially shortened, e.g. from 2 h on conventional surfaces down to 15 min on treated surfaces for anti-rat IgG adsorption. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  19. Strategies for improving the Voxel-based statistical analysis for animal PET studies: assessment of cerebral glucose metabolism in cat deafness model

    International Nuclear Information System (INIS)

    Kim, Jin Su; Lee, Jae Sung; Park, Min Hyun; Kang, Hye Jin; Im, Ki Chun; Moon, Dae Hyuk; Lim, Sang Moo; Oh, Seung Ha; Lee, Dong Soo

    2007-01-01

    In imaging studies of the human brain, voxel-based statistical analysis methods are widely used; however, since these methods were originally developed for the analysis of human brain data, they are not optimal for animal brain data. The aim of this study was to optimize the procedures for 3D voxel-based statistical analysis of cat FDG PET brain images. A microPET Focus 120 scanner was used. Eight cats underwent FDG PET scans twice, before and after induction of deafness. Only the brain and adjacent regions were extracted from each data set by manual masking. Each animal's PET images at the normal and deaf states were realigned to each other to remove the confounding effects of differing spatial normalization parameters on the results of the statistical analyses. The distance between sampling points on the reference image and the kernel size of the Gaussian filter applied to the images before estimating the realignment parameters were adjusted to 0.5 mm and 2 mm, respectively. Both data sets were then spatially normalized onto a study-specific cat brain template. Spatially normalized PET data were smoothed and a voxel-based paired t-test was performed. Cerebral glucose metabolism decreased significantly after the loss of hearing capability in the parietal lobes, postcentral gyri, STG, MTG, ITG, and IC of both hemispheres and in the left SC (FDR corrected P < 0.05, k=50). Cerebral glucose metabolism in deaf cats was significantly higher than in controls in the right cingulate (FDR corrected P < 0.05, k=50). The ROI analysis also showed significant reduction of glucose metabolism in the same areas as the SPM analysis, except for some regions (P < 0.05). The procedure for voxel-based analysis of cat brain PET data was thus optimized, and the result was confirmed by ROI analysis. The results obtained demonstrate the high localization accuracy and specificity of the developed method, which was found to be useful for examining cerebral glucose metabolism in a cat cortical deafness model.
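
    The statistical core of such a pipeline is a paired t-test at every voxel of the smoothed, spatially normalized volumes. A minimal sketch on synthetic volumes (the realignment, masking and template-normalization steps, and the FDR correction, are omitted):

    ```python
    import numpy as np
    from scipy import ndimage, stats

    def voxelwise_paired_t(cond_a, cond_b, fwhm_vox=2.0):
        """Smooth each volume with a Gaussian kernel, then run a paired
        t-test at every voxel.

        cond_a, cond_b: arrays of shape (n_subjects, x, y, z) holding each
        subject's spatially normalized volumes in the two conditions.
        """
        sigma = fwhm_vox / 2.355  # convert FWHM to Gaussian sigma
        smooth = lambda vols: np.stack(
            [ndimage.gaussian_filter(v, sigma) for v in vols])
        t, p = stats.ttest_rel(smooth(cond_a), smooth(cond_b), axis=0)
        return t, p

    # Synthetic stand-ins for 8 subjects scanned in two conditions;
    # illustrative only, not the study's PET data.
    rng = np.random.default_rng(2)
    normal = rng.normal(1.0, 0.1, (8, 16, 16, 16))
    deaf = normal - rng.normal(0.05, 0.02, normal.shape)
    t, p = voxelwise_paired_t(normal, deaf)
    print("voxels with uncorrected p < 0.05:", int((p < 0.05).sum()))
    ```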

  20. Strategies for improving the Voxel-based statistical analysis for animal PET studies: assessment of cerebral glucose metabolism in cat deafness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Su; Lee, Jae Sung; Park, Min Hyun; Kang, Hye Jin; Im, Ki Chun; Moon, Dae Hyuk; Lim, Sang Moo; Oh, Seung Ha; Lee, Dong Soo [Seoul National Univ. College of Medicine, Seoul (Korea, Republic of)

    2007-07-01

    In imaging studies of the human brain, voxel-based statistical analysis methods are widely used