WorldWideScience

Sample records for nonparametric residue analysis

  1. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  2. Nonparametric Bayesian inference for mean residual life functions in survival analysis.

    Science.gov (United States)

    Poynor, Valerie; Kottas, Athanasios

    2018-01-19

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life (MRL) function, which provides the expected remaining lifetime given that a subject has survived (i.e. is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the MRL function characterizes the survival distribution. We develop general Bayesian nonparametric inference for MRL functions built from a Dirichlet process mixture model for the associated survival distribution. The resulting model for the MRL function admits a representation as a mixture of the kernel MRL functions with time-dependent mixture weights. This model structure allows for a wide range of shapes for the MRL function. Particular emphasis is placed on the selection of the mixture kernel, taken to be a gamma distribution, to obtain desirable properties for the MRL function arising from the mixture model. The inference method is illustrated with a data set of two experimental groups and a data set involving right censoring. The supplementary material available at Biostatistics online provides further results on empirical performance of the model, using simulated data examples. © The Author 2018. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
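
    The record above concerns Bayesian nonparametric inference for the MRL function; as a much simpler point of reference, the sketch below computes the plain empirical mean residual life m(t) = E[X - t | X > t] for uncensored data. The toy exponential sample and all names are illustrative, not from the paper.

```python
import numpy as np

def empirical_mrl(times, grid):
    """Empirical mean residual life m(t) = E[X - t | X > t] for uncensored data."""
    times = np.asarray(times, dtype=float)
    mrl = np.full(len(grid), np.nan)
    for i, t in enumerate(grid):
        at_risk = times[times > t]
        if at_risk.size:                      # undefined beyond the largest observation
            mrl[i] = np.mean(at_risk - t)
    return mrl

# illustrative check: exponential lifetimes with mean 2 have constant MRL equal to 2
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=5000)
grid = np.linspace(0.0, 4.0, 9)
print(np.round(empirical_mrl(x, grid), 2))
```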

  3. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  4. Bayesian nonparametric inference on quantile residual life function: Application to breast cancer data.

    Science.gov (United States)

    Park, Taeyoung; Jeong, Jong-Hyeon; Lee, Jae Won

    2012-08-15

    There is often an interest in estimating a residual life function as a summary measure of survival data. For ease in presentation of the potential therapeutic effect of a new drug, investigators may summarize survival data in terms of the remaining life years of patients. Under heavy right censoring, however, some reasonably high quantiles (e.g., median) of a residual lifetime distribution cannot be always estimated via a popular nonparametric approach on the basis of the Kaplan-Meier estimator. To overcome the difficulties in dealing with heavily censored survival data, this paper develops a Bayesian nonparametric approach that takes advantage of a fully model-based but highly flexible probabilistic framework. We use a Dirichlet process mixture of Weibull distributions to avoid strong parametric assumptions on the unknown failure time distribution, making it possible to estimate any quantile residual life function under heavy censoring. Posterior computation through Markov chain Monte Carlo is straightforward and efficient because of conjugacy properties and partial collapse. We illustrate the proposed methods by using both simulated data and heavily censored survival data from a recent breast cancer clinical trial conducted by the National Surgical Adjuvant Breast and Bowel Project. Copyright © 2012 John Wiley & Sons, Ltd.
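
    The following sketch is not the paper's Dirichlet process mixture approach; it only illustrates, with a hand-rolled Kaplan-Meier estimator and an artificial heavily censored sample, why a median residual life can be inestimable when the survival curve never drops low enough. All data and names are assumptions for illustration.

```python
import numpy as np

def kaplan_meier(time, event):
    """Return the distinct event times and the Kaplan-Meier survival estimate S(t)."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    uniq = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in uniq:
        n_risk = np.sum(time >= t)                    # subjects still at risk just before t
        d = np.sum((time == t) & (event == 1))        # events at t
        s *= 1.0 - d / n_risk
        surv.append(s)
    return uniq, np.array(surv)

def median_residual_life(time, event, t0):
    """Median remaining life at t0: smallest t with S(t) <= 0.5 * S(t0), if it exists."""
    times, surv = kaplan_meier(time, event)
    s_t0 = surv[times <= t0][-1] if np.any(times <= t0) else 1.0
    below = times[surv <= 0.5 * s_t0]
    return below[0] - t0 if below.size else float("nan")   # nan: not estimable

# heavy right censoring: the survival curve never reaches half of its level at t0 = 1
time  = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
event = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
print(median_residual_life(time, event, t0=1.0))            # nan -> quantile not estimable
```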

  5. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression ... and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences within a nonparametric panel data regression framework. The fourth paper analyses the technical efficiency of dairy farms with environmental output using nonparametric kernel regression in a semiparametric stochastic frontier analysis. The results provided in this PhD thesis show that nonparametric ...

  6. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    Science.gov (United States)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
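
    A minimal sketch of the functional Nadaraya-Watson idea described above: daily curves are compared with an L2 semi-metric and weighted with a quadratic kernel supported on [0, 1). The bandwidth is a fixed assumed value here (the study cross-validates it), and the sinusoidal toy data are purely illustrative.

```python
import numpy as np

def functional_nw_forecast(curves, responses, query, h):
    """Nadaraya-Watson estimator on curves: weight past days by an L2 semi-metric.

    curves    : (n, 24) array of past daily profiles
    responses : (n,)    next-day summaries (e.g. next-day mean temperature)
    query     : (24,)   today's profile
    h         : bandwidth (fixed here; the study selects it by cross-validation)
    """
    d = np.sqrt(np.sum((curves - query) ** 2, axis=1))   # L2 semi-metric between curves
    u = d / h
    w = np.where(u < 1.0, 1.0 - u ** 2, 0.0)             # quadratic kernel on [0, 1)
    if w.sum() == 0.0:                                    # no neighbour inside the bandwidth
        w = np.ones_like(w)
    return np.sum(w * responses) / np.sum(w)

# illustrative data: sinusoidal daily temperature curves plus noise
rng = np.random.default_rng(1)
hours = np.arange(24)
base = 25 + 5 * np.sin(2 * np.pi * (hours - 14) / 24)
curves = base + rng.normal(0, 1, size=(200, 24))
responses = curves.mean(axis=1) + rng.normal(0, 0.2, size=200)
today = base + rng.normal(0, 1, size=24)
print(round(float(functional_nw_forecast(curves, responses, today, h=10.0)), 2))
```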

  7. Nonparametric analysis of blocked ordered categories data: some examples revisited

    Directory of Open Access Journals (Sweden)

    O. Thas

    2006-08-01

    Full Text Available Nonparametric analysis for general block designs can be given by using the Cochran-Mantel-Haenszel (CMH) statistics. We demonstrate this with four examples and note that several well-known nonparametric statistics are special cases of CMH statistics.
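
    The record uses Cochran-Mantel-Haenszel statistics; the sketch below implements only the classic CMH chi-square for K strata of 2x2 tables (general association, no continuity correction), not the full ordered-categories machinery. The counts are made up for illustration.

```python
import numpy as np
from scipy.stats import chi2

def cmh_2x2xk(tables):
    """Cochran-Mantel-Haenszel chi-square (no continuity correction) for K 2x2 strata.

    tables : (K, 2, 2) array of counts [[a, b], [c, d]] per stratum.
    Returns (statistic, p_value) for the null of no association in every stratum.
    """
    tables = np.asarray(tables, dtype=float)
    a = tables[:, 0, 0]
    n = tables.sum(axis=(1, 2))
    row1 = tables[:, 0, :].sum(axis=1)
    col1 = tables[:, :, 0].sum(axis=1)
    expected = row1 * col1 / n
    variance = row1 * (n - row1) * col1 * (n - col1) / (n ** 2 * (n - 1))
    stat = (a.sum() - expected.sum()) ** 2 / variance.sum()
    return stat, chi2.sf(stat, df=1)

# two blocks of a treatment-vs-control experiment with a binary response
tables = [[[12, 8], [5, 15]],
          [[9, 11], [4, 16]]]
stat, p = cmh_2x2xk(tables)
print(f"CMH chi-square = {stat:.3f}, p = {p:.4f}")
```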

  8. Single versus mixture Weibull distributions for nonparametric satellite reliability

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Long recognized as a critical design attribute for space systems, satellite reliability has not yet received the proper attention as limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture Weibull distribution provides significant accuracy in capturing all the failure trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
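
    A rough sketch of the single-versus-mixture Weibull comparison on synthetic, uncensored failure times, fitting the two-component mixture by direct numerical maximum likelihood. The study's MLE on censored satellite data is more involved; the starting values, bounds and toy data here are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(2)
# synthetic failure times from two Weibull sub-populations (infant mortality + wear-out)
t = np.concatenate([weibull_min.rvs(0.7, scale=2.0, size=300, random_state=rng),
                    weibull_min.rvs(3.0, scale=8.0, size=700, random_state=rng)])

def negloglik_mix(params):
    """Negative log-likelihood of a two-component Weibull mixture (uncensored data)."""
    w, k1, s1, k2, s2 = params
    pdf = (w * weibull_min.pdf(t, k1, scale=s1)
           + (1.0 - w) * weibull_min.pdf(t, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

# direct likelihood maximisation; EM or multiple starting points would be more robust
res = minimize(negloglik_mix,
               x0=[0.3, 1.0, 1.0, 2.0, 6.0],
               bounds=[(0.01, 0.99), (0.05, 10), (0.05, 50), (0.05, 10), (0.05, 50)],
               method="L-BFGS-B")
print("mixture fit (weight, shape1, scale1, shape2, scale2):", np.round(res.x, 2))

# single-Weibull benchmark via scipy's built-in MLE (location fixed at zero)
k, _, s = weibull_min.fit(t, floc=0)
print(f"single Weibull: shape = {k:.2f}, scale = {s:.2f}")
```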

  9. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb ... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used ... by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true ...

  10. Non-parametric analysis of production efficiency of poultry egg ...

    African Journals Online (AJOL)

    Non-parametric analysis of production efficiency of poultry egg farmers in Delta ... analysis of factors affecting the output of poultry farmers showed that stock ... should be put in place for farmers to learn the best farm practices carried out on the ...

  11. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method to parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test while maintaining the type I error probability for all conditions except for Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives, and the nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating the one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
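
    One common variant of a pooled-resampling bootstrap test of equal means is sketched below (both bootstrap samples are drawn from the pooled data so that the null hypothesis holds); the authors' procedure may differ in detail, and the skewed toy samples are illustrative only.

```python
import numpy as np
from scipy.stats import ttest_ind

def pooled_bootstrap_ttest(x, y, n_boot=5000, seed=0):
    """Bootstrap test of equal means: resample both groups from the pooled sample."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    t_obs = ttest_ind(x, y, equal_var=False).statistic
    pool = np.concatenate([x, y])                      # pooling enforces the null hypothesis
    count = 0
    for _ in range(n_boot):
        xb = rng.choice(pool, size=x.size, replace=True)
        yb = rng.choice(pool, size=y.size, replace=True)
        if abs(ttest_ind(xb, yb, equal_var=False).statistic) >= abs(t_obs):
            count += 1
    return (count + 1) / (n_boot + 1)

# small, skewed samples where normal-theory tests are doubtful
rng = np.random.default_rng(3)
a = rng.lognormal(mean=0.0, sigma=1.0, size=8)
b = rng.lognormal(mean=0.8, sigma=1.0, size=9)
print("bootstrap p-value:", round(pooled_bootstrap_ttest(a, b), 3))
```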

  12. Glaucoma Monitoring in a Clinical Setting: Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  13. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-Miller, M

    2013-01-01

    Contents: Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency ...

  14. Weak Disposability in Nonparametric Production Analysis with Undesirable Outputs

    NARCIS (Netherlands)

    Kuosmanen, T.K.

    2005-01-01

    Environmental Economics and Natural Resources Group, Wageningen University, The Netherlands. Weak disposability of outputs means that firms can abate harmful emissions by decreasing the activity level. Modeling weak disposability in nonparametric production analysis has caused some confusion.

  15. The relationship between multilevel models and non-parametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity.

    Science.gov (United States)

    Rights, Jason D; Sterba, Sonya K

    2016-11-01

    Multilevel data structures are common in the social sciences. Often, such nested data are analysed with multilevel models (MLMs) in which heterogeneity between clusters is modelled by continuously distributed random intercepts and/or slopes. Alternatively, the non-parametric multilevel regression mixture model (NPMM) can accommodate the same nested data structures through discrete latent class variation. The purpose of this article is to delineate analytic relationships between NPMM and MLM parameters that are useful for understanding the indirect interpretation of the NPMM as a non-parametric approximation of the MLM, with relaxed distributional assumptions. We define how seven standard and non-standard MLM specifications can be indirectly approximated by particular NPMM specifications. We provide formulas showing how the NPMM can serve as an approximation of the MLM in terms of intraclass correlation, random coefficient means and (co)variances, heteroscedasticity of residuals at level 1, and heteroscedasticity of residuals at level 2. Further, we discuss how these relationships can be useful in practice. The specific relationships are illustrated with simulated graphical demonstrations, and direct and indirect interpretations of NPMM classes are contrasted. We provide an R function to aid in implementing and visualizing an indirect interpretation of NPMM classes. An empirical example is presented and future directions are discussed. © 2016 The British Psychological Society.

  16. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a...

  17. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural ... currencies. Results show the nonparametric model generally dominates the others when evaluating in-sample. However, the semiparametric model is best for out-of-sample analysis.

  18. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture, despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the ...

  19. Introduction to nonparametric statistics for the biological sciences using R

    CERN Document Server

    MacFarland, Thomas W

    2016-01-01

    This book contains a rich set of tools for nonparametric analyses, and the purpose of this supplemental text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences: To introduce when nonparametric approaches to data analysis are appropriate To introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test To introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses a...

  20. Nonparametric statistics with applications to science and engineering

    CERN Document Server

    Kvam, Paul H

    2007-01-01

    A thorough and definitive book that fully addresses traditional and modern-day topics of nonparametric statistics This book presents a practical approach to nonparametric statistical analysis and provides comprehensive coverage of both established and newly developed methods. With the use of MATLAB, the authors present information on theorems and rank tests in an applied fashion, with an emphasis on modern methods in regression and curve fitting, bootstrap confidence intervals, splines, wavelets, empirical likelihood, and goodness-of-fit testing. Nonparametric Statistics with Applications to Science and Engineering begins with succinct coverage of basic results for order statistics, methods of categorical data analysis, nonparametric regression, and curve fitting methods. The authors then focus on nonparametric procedures that are becoming more relevant to engineering researchers and practitioners. The important fundamental materials needed to effectively learn and apply the discussed methods are also provide...

  1. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    Science.gov (United States)

    Scripts for computing nonparametric regression analyses: an overview of using scripts to infer environmental conditions from biological observations and to statistically estimate species-environment relationships.

  2. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb ... results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used ...

  3. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference. (Eugenia Stoimenova, Journal of Applied Statistics, June 2012) ... one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. ... a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students. (Biometrics, 67, September 2011) This excellently presente...

  4. Bayesian nonparametric estimation of continuous monotone functions with applications to dose-response analysis.

    Science.gov (United States)

    Bornkamp, Björn; Ickstadt, Katja

    2009-03-01

    In this article, we consider monotone nonparametric regression in a Bayesian framework. The monotone function is modeled as a mixture of shifted and scaled parametric probability distribution functions, and a general random probability measure is assumed as the prior for the mixing distribution. We investigate the choice of the underlying parametric distribution function and find that the two-sided power distribution function is well suited both from a computational and mathematical point of view. The model is motivated by traditional nonlinear models for dose-response analysis, and provides possibilities to elicit informative prior distributions on different aspects of the curve. The method is compared with other recent approaches to monotone nonparametric regression in a simulation study and is illustrated on a data set from dose-response analysis.

  5. Driving Style Analysis Using Primitive Driving Patterns With Bayesian Nonparametric Approaches

    OpenAIRE

    Wang, Wenshuo; Xi, Junqiang; Zhao, Ding

    2017-01-01

    Analysis and recognition of driving styles are profoundly important to intelligent transportation and vehicle calibration. This paper presents a novel driving style analysis framework using the primitive driving patterns learned from naturalistic driving data. In order to achieve this, first, a Bayesian nonparametric learning method based on a hidden semi-Markov model (HSMM) is introduced to extract primitive driving patterns from time series driving data without prior knowledge of the number...

  6. Decision support using nonparametric statistics

    CERN Document Server

    Beatty, Warren

    2018-01-01

    This concise volume covers the nonparametric statistics topics that are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.

  7. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for the assessment of agricultural efficiency and applies them to Lithuanian family farms. In particular, we focus on three objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend ... to the Multi-Directional Efficiency Analysis approach. When the proposed models were employed to analyse empirical data on Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input ... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques ...

  8. Comparative Study of Parametric and Non-parametric Approaches in Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Katebi, S.D.; Blanke, M.; Katebi, M.R.

    This report describes a comparative study between two approaches to fault detection and isolation in dynamic systems. The first approach uses a parametric model of the system. The main components of such techniques are residual and signature generation for processing and analysis. The second approach is non-parametric in the sense that the signature analysis is only dependent on the frequency or time domain information extracted directly from the input-output signals. Based on these approaches, two different fault monitoring schemes are developed where the feature extraction and fault decision ...

  9. A Structural Labor Supply Model with Nonparametric Preferences

    NARCIS (Netherlands)

    van Soest, A.H.O.; Das, J.W.M.; Gong, X.

    2000-01-01

    Nonparametric techniques are usually seen as a statistical device for data description and exploration, and not as a tool for estimating models with a richer economic structure, which are often required for policy analysis. This paper presents an example where nonparametric flexibility can be attained ...

  10. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    ... the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  11. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required

  12. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...

  13. Data analysis with small samples and non-normal data nonparametrics and other strategies

    CERN Document Server

    Siebert, Carl F

    2017-01-01

    Written in everyday language for non-statisticians, this book provides all the information needed to successfully conduct nonparametric analyses. This ideal reference book provides step-by-step instructions to lead the reader through each analysis, screenshots of the software and output, and case scenarios to illustrate all the analytic techniques.

  14. Scale-Free Nonparametric Factor Analysis: A User-Friendly Introduction with Concrete Heuristic Examples.

    Science.gov (United States)

    Mittag, Kathleen Cage

    Most researchers using factor analysis extract factors from a matrix of Pearson product-moment correlation coefficients. A method is presented for extracting factors in a non-parametric way, by extracting factors from a matrix of Spearman rho (rank correlation) coefficients. It is possible to factor analyze a matrix of association such that…
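
    A small sketch of the idea described above: factor a Spearman rank-correlation matrix instead of a Pearson matrix, here by simple principal-component extraction with a Kaiser cut-off and no rotation. The synthetic two-factor data and all names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
# synthetic data: two latent factors driving six observed variables, plus noise
f = rng.normal(size=(500, 2))
true_loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                          [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
x = f @ true_loadings.T + 0.4 * rng.normal(size=(500, 6))

rho, _ = spearmanr(x)                      # 6x6 Spearman rank-correlation matrix
eigval, eigvec = np.linalg.eigh(rho)       # eigh returns ascending eigenvalues
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

n_factors = int(np.sum(eigval > 1.0))      # Kaiser criterion, for illustration only
loadings = eigvec[:, :n_factors] * np.sqrt(eigval[:n_factors])
print("retained factors:", n_factors)
print(np.round(loadings, 2))
```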

  15. A nonparametric mixture model for cure rate estimation.

    Science.gov (United States)

    Peng, Y; Dear, K B

    2000-03-01

    Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.

  16. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  17. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

    Full Text Available We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  18. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test ...

  19. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans ...

  20. Parametric and nonparametric Granger causality testing: Linkages between international stock markets

    Science.gov (United States)

    De Gooijer, Jan G.; Sivarajasingham, Selliah

    2008-04-01

    This study investigates long-term linear and nonlinear causal linkages among eleven stock markets, six industrialized markets and five emerging markets of South-East Asia. We cover the period 1987-2006, taking into account the onset of the Asian financial crisis of 1997. We first apply a test for the presence of general nonlinearity in vector time series. Substantial differences exist between the pre- and post-crisis period in terms of the total number of significant nonlinear relationships. We then examine both periods, using a new nonparametric test for Granger noncausality and the conventional parametric Granger noncausality test. One major finding is that the Asian stock markets have become more internationally integrated after the Asian financial crisis. An exception is the Sri Lankan market with almost no significant long-term linear and nonlinear causal linkages with other markets. To ensure that any causality is strictly nonlinear in nature, we also examine the nonlinear causal relationships of VAR filtered residuals and VAR filtered squared residuals for the post-crisis sample. We find quite a few remaining significant bi- and uni-directional causal nonlinear relationships in these series. Finally, after filtering the VAR-residuals with GARCH-BEKK models, we show that the nonparametric test statistics are substantially smaller in both magnitude and statistical significance than those before filtering. This indicates that nonlinear causality can, to a large extent, be explained by simple volatility effects.
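
    The nonparametric (Diks-Panchenko-type) noncausality test used in the study is not reproduced here; the sketch below implements only the conventional linear Granger F-test by ordinary least squares, with the lag order fixed at an assumed value and simulated series in which x leads y.

```python
import numpy as np
from scipy.stats import f as f_dist

def granger_f_test(y, x, p=2):
    """Linear Granger test: do p lags of x improve an AR(p) regression for y?"""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((len(Y), 1))
    X_r = np.hstack([ones, lags_y])              # restricted model: own lags only
    X_u = np.hstack([ones, lags_y, lags_x])      # unrestricted model: plus lags of x
    rss_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - X_u @ np.linalg.lstsq(X_u, Y, rcond=None)[0]) ** 2)
    df1, df2 = p, len(Y) - X_u.shape[1]
    F = ((rss_r - rss_u) / df1) / (rss_u / df2)
    return F, f_dist.sf(F, df1, df2)

# simulated series in which x leads y by one period, so x should Granger-cause y
rng = np.random.default_rng(5)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.3 * y[t - 1] + 0.6 * x[t - 1] + rng.normal(scale=0.5)
F, pval = granger_f_test(y, x, p=2)
print(f"F = {F:.2f}, p = {pval:.4f}")
```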

  1. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) between June 11–16 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  2. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  3. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    Science.gov (United States)

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  4. Comparative analysis of automotive paints by laser induced breakdown spectroscopy and nonparametric permutation tests

    International Nuclear Information System (INIS)

    McIntee, Erin; Viglino, Emilie; Rinke, Caitlin; Kumor, Stephanie; Ni, Liqiang; Sigman, Michael E.

    2010-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been investigated for the discrimination of automobile paint samples. Paint samples from automobiles of different makes, models, and years were collected and separated into sets based on the color, presence or absence of effect pigments and the number of paint layers. Twelve LIBS spectra were obtained for each paint sample, each an average of five single-shot 'drill down' spectra from consecutive laser ablations in the same spot on the sample. Analyses by a nonparametric permutation test and a parametric Wald test were performed to determine the extent of discrimination within each set of paint samples. The discrimination power and Type I error were assessed for each data analysis method. Conversion of the spectral intensity to a log-scale (base 10) resulted in a higher overall discrimination power while observing the same significance level. Working on the log-scale, the nonparametric permutation tests gave an overall 89.83% discrimination power with a size of Type I error being 4.44% at the nominal significance level of 5%. White paint samples, as a group, were the most difficult to differentiate with the power being only 86.56% followed by 95.83% for black paint samples. Parametric analysis of the data set produced lower discrimination (85.17%) with 3.33% Type I errors, which is not recommended on both theoretical and practical grounds. The nonparametric testing method is applicable across many analytical comparisons, with the specific application described here being the pairwise comparison of automotive paint samples.
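
    A minimal sketch of a two-sample permutation test on replicate spectra, using the Euclidean distance between group mean log-intensity spectra as the test statistic; the study's pairwise comparison procedure and Wald test are not reproduced, and the simulated spectra are illustrative assumptions.

```python
import numpy as np

def permutation_test_spectra(A, B, n_perm=5000, seed=0):
    """Two-sample permutation test: are two paint samples' replicate spectra distinguishable?

    A, B : (n_a, m) and (n_b, m) arrays of replicate spectra (log10 intensities).
    Statistic: Euclidean distance between the group mean spectra.
    """
    rng = np.random.default_rng(seed)
    A, B = np.asarray(A, float), np.asarray(B, float)
    stat_obs = np.linalg.norm(A.mean(axis=0) - B.mean(axis=0))
    pooled = np.vstack([A, B])
    n_a = A.shape[0]
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(pooled.shape[0])        # shuffle the sample labels
        Ap, Bp = pooled[idx[:n_a]], pooled[idx[n_a:]]
        if np.linalg.norm(Ap.mean(axis=0) - Bp.mean(axis=0)) >= stat_obs:
            count += 1
    return (count + 1) / (n_perm + 1)

# two simulated paint samples, 12 replicate spectra each, 200 spectral channels
rng = np.random.default_rng(6)
base = rng.uniform(2, 5, size=200)                    # shared log-scale emission profile
sample1 = base + rng.normal(0, 0.05, size=(12, 200))
sample2 = base + 0.03 + rng.normal(0, 0.05, size=(12, 200))   # slight compositional offset
print("permutation p-value:", permutation_test_spectra(sample1, sample2))
```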

  5. NParCov3: A SAS/IML Macro for Nonparametric Randomization-Based Analysis of Covariance

    Directory of Open Access Journals (Sweden)

    Richard C. Zink

    2012-07-01

    Full Text Available Analysis of covariance serves two important purposes in a randomized clinical trial. First, there is a reduction of variance for the treatment effect which provides more powerful statistical tests and more precise confidence intervals. Second, it provides estimates of the treatment effect which are adjusted for random imbalances of covariates between the treatment groups. The nonparametric analysis of covariance method of Koch, Tangen, Jung, and Amara (1998) defines a very general methodology using weighted least-squares to generate covariate-adjusted treatment effects with minimal assumptions. This methodology is general in its applicability to a variety of outcomes, whether continuous, binary, ordinal, incidence density or time-to-event. Further, its use has been illustrated in many clinical trial settings, such as multi-center, dose-response and non-inferiority trials. NParCov3 is a SAS/IML macro written to conduct the nonparametric randomization-based covariance analyses of Koch et al. (1998). The software can analyze a variety of outcomes and can account for stratification. Data from multiple clinical trials will be used for illustration.

  6. Discrete non-parametric kernel estimation for global sensitivity analysis

    International Nuclear Information System (INIS)

    Senga Kiessé, Tristan; Ventura, Anne

    2016-01-01

    This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently only the continuous kernel approach has been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. Now the discrete kernel estimation is known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of sensitivity indices is also presented with its asymptotic convergence rate. Some simulations on a test function and a real case study from agriculture have shown that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderate or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.

  7. Application of nonparametric statistic method for DNBR limit calculation

    International Nuclear Information System (INIS)

    Dong Bo; Kuang Bo; Zhu Xuenong

    2013-01-01

    Background: Nonparametric statistical methods are a class of statistical inference methods that do not depend on a particular distribution; they calculate tolerance limits at a given probability level and confidence through sampling. The DNBR margin is an important parameter in NPP design, which represents the safety level of the NPP. Purpose and Methods: This paper uses a nonparametric statistical method based on the Wilks formula and the VIPER-01 subchannel analysis code to calculate the DNBR design limits (DL) of a 300 MW NPP (Nuclear Power Plant) during a complete loss of flow accident, and compares them with the DL of DNBR obtained by means of ITDP to quantify the DNBR margin. Results: The results indicate that this method gains 2.96% more DNBR margin than that obtained by the ITDP methodology. Conclusions: Because of the reduced conservatism in the analysis process, the nonparametric statistical method can provide a greater DNBR margin, and the increase in DNBR margin benefits the upgrading of the core refuelling scheme. (authors)
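
    The Wilks formula mentioned above fixes the number of code runs needed for a one-sided nonparametric tolerance limit. The sketch below computes the smallest sample size N such that the order-th largest output bounds the true coverage quantile with the required confidence, recovering the familiar 95/95 values (59 for first order, 93 for second order); the function name and defaults are illustrative.

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95, order=1):
    """Smallest N so that the order-th largest of N random outputs exceeds the
    `coverage` quantile with probability at least `confidence` (one-sided limit)."""
    n = order
    while True:
        n += 1
        # P(fewer than `order` of the N outputs exceed the coverage quantile)
        tail = sum(math.comb(n, k) * (1 - coverage) ** k * coverage ** (n - k)
                   for k in range(order))
        if 1.0 - tail >= confidence:
            return n

print(wilks_sample_size(order=1))   # 59 -> classic 95/95 first-order value (use the maximum)
print(wilks_sample_size(order=2))   # 93 -> second-order value (use the 2nd largest output)
```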

  8. Non-Parametric Analysis of Rating Transition and Default Data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move b...

  9. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray study due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  10. Residual lifetime prediction for lithium-ion battery based on functional principal component analysis and Bayesian approach

    International Nuclear Information System (INIS)

    Cheng, Yujie; Lu, Chen; Li, Tieying; Tao, Laifa

    2015-01-01

    Existing methods for predicting lithium-ion (Li-ion) battery residual lifetime mostly depend on a priori knowledge on aging mechanism, the use of chemical or physical formulation and analytical battery models. This dependence is usually difficult to determine in practice, which restricts the application of these methods. In this study, we propose a new prediction method for Li-ion battery residual lifetime evaluation based on FPCA (functional principal component analysis) and Bayesian approach. The proposed method utilizes FPCA to construct a nonparametric degradation model for Li-ion battery, based on which the residual lifetime and the corresponding confidence interval can be evaluated. Furthermore, an empirical Bayes approach is utilized to achieve real-time updating of the degradation model and concurrently determine residual lifetime distribution. Based on Bayesian updating, a more accurate prediction result and a more precise confidence interval are obtained. Experiments are implemented based on data provided by the NASA Ames Prognostics Center of Excellence. Results confirm that the proposed prediction method performs well in real-time battery residual lifetime prediction. - Highlights: • Capacity is considered functional and FPCA is utilized to extract more information. • No features required which avoids drawbacks induced by feature extraction. • A good combination of both population and individual information. • Avoiding complex aging mechanism and accurate analytical models of batteries. • Easily applicable to different batteries for life prediction and RLD calculation.
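
    A bare-bones illustration of the FPCA ingredient of the approach above: capacity-fade curves on a common cycle grid are decomposed by SVD into a mean curve, eigenfunctions and per-cell scores. The Bayesian updating of the scores is not shown, and the synthetic fade curves are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
cycles = np.arange(200)                               # common cycle grid
# synthetic capacity-fade curves: linear fade plus a cell-specific acceleration term
n_cells = 30
slope = rng.normal(1.0, 0.15, n_cells)
accel = rng.normal(0.0, 0.3, n_cells)
curves = (1.0 - 0.002 * slope[:, None] * cycles
              - 1e-5 * accel[:, None] * cycles ** 2
              + rng.normal(0, 0.003, (n_cells, len(cycles))))

# functional PCA on the discretised curves: SVD of the centred data matrix
mean_curve = curves.mean(axis=0)
centred = curves - mean_curve
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
eigenfunctions = Vt                                   # rows: principal modes of fade
scores = U * S                                        # per-cell FPC scores
explained = S ** 2 / np.sum(S ** 2)
print("variance explained by first two FPCs:", np.round(explained[:2], 3))

# a cell's curve is approximated by the mean plus a few score-weighted eigenfunctions
recon = mean_curve + scores[0, :2] @ eigenfunctions[:2]
print("max reconstruction error, cell 0:", float(np.max(np.abs(recon - curves[0]))))
```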

  11. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  12. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  13. Nonparametric Change Point Diagnosis Method of Concrete Dam Crack Behavior Abnormality

    Directory of Open Access Journals (Sweden)

    Zhanchao Li

    2013-01-01

    Full Text Available The diagnosis of abnormal concrete dam crack behavior has long been a difficult and actively studied problem in the safety monitoring of hydraulic structures. Based on how crack behavior abnormality manifests in parametric and nonparametric statistical models, the internal relation between concrete dam crack behavior abnormality and statistical change point theory is analyzed in depth, from the model structure instability of the parametric statistical model and the change in the sequence distribution law of the nonparametric statistical model. On this basis, through reduction of the change point problem, establishment of a basic nonparametric change point model, and asymptotic analysis of the test method for the basic change point problem, a nonparametric change point diagnosis method for concrete dam crack behavior abnormality is developed, taking into account that in practice crack behavior may exhibit multiple abnormality points. The method is applied to an actual project, demonstrating its effectiveness and scientific reasonableness. The method has a complete theoretical basis and strong practicality, with broad application prospects in actual projects.
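
    The record's diagnosis method is specific to dam crack monitoring; as a generic illustration of nonparametric change-point detection on a univariate monitoring series, the sketch below implements Pettitt's rank-based test with its usual approximate p-value. The shifted toy series is illustrative only.

```python
import numpy as np

def pettitt_test(x):
    """Pettitt's nonparametric change-point test for a single shift in location.

    Returns the most likely change-point index and an approximate p-value.
    """
    x = np.asarray(x, float)
    n = len(x)
    # U_t = number of pairs (i <= t, j > t) with x_j > x_i minus pairs with x_j < x_i
    signs = np.sign(x[None, :] - x[:, None])          # signs[i, j] = sign(x_j - x_i)
    U = np.array([signs[:t + 1, t + 1:].sum() for t in range(n - 1)])
    K = np.max(np.abs(U))
    t_hat = int(np.argmax(np.abs(U)))
    p = 2.0 * np.exp(-6.0 * K ** 2 / (n ** 3 + n ** 2))
    return t_hat, min(p, 1.0)

# crack-opening-like series with an abrupt shift after observation 60
rng = np.random.default_rng(8)
series = np.concatenate([rng.normal(0.30, 0.02, 60), rng.normal(0.36, 0.02, 40)])
t_hat, p = pettitt_test(series)
print(f"change point after index {t_hat}, p ~ {p:.2e}")
```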

  14. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved and real applications are illustrated using examples. Theories and exercises are provided. The incorrect use of many tests in most statistical software packages is highlighted and discussed.

  15. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    Summary. A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically disperse locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlight the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  16. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: using an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models ...

  17. Genomic outlier profile analysis: mixture models, null hypotheses, and nonparametric estimation.

    Science.gov (United States)

    Ghosh, Debashis; Chinnaiyan, Arul M

    2009-01-01

    In most analyses of large-scale genomic data sets, differential expression analysis is typically assessed by testing for differences in the mean of the distributions between 2 groups. A recent finding by Tomlins and others (2005) is of a different type of pattern of differential expression in which a fraction of samples in one group have overexpression relative to samples in the other group. In this work, we describe a general mixture model framework for the assessment of this type of expression, called outlier profile analysis. We start by considering the single-gene situation and establishing results on identifiability. We propose 2 nonparametric estimation procedures that have natural links to familiar multiple testing procedures. We then develop multivariate extensions of this methodology to handle genome-wide measurements. The proposed methodologies are compared using simulation studies as well as data from a prostate cancer gene expression study.

  18. Nonparametric Identification and Estimation of Finite Mixture Models of Dynamic Discrete Choices

    OpenAIRE

    Hiroyuki Kasahara; Katsumi Shimotsu

    2006-01-01

    In dynamic discrete choice analysis, controlling for unobserved heterogeneity is an important issue, and finite mixture models provide flexible ways to account for unobserved heterogeneity. This paper studies nonparametric identifiability of type probabilities and type-specific component distributions in finite mixture models of dynamic discrete choices. We derive sufficient conditions for nonparametric identification for various finite mixture models of dynamic discrete choices used in appli...

  19. Theory of nonparametric tests

    CERN Document Server

    Dickhaus, Thorsten

    2018-01-01

    This textbook provides a self-contained presentation of the main concepts and methods of nonparametric statistical testing, with a particular focus on the theoretical foundations of goodness-of-fit tests, rank tests, resampling tests, and projection tests. The substitution principle is employed as a unified approach to the nonparametric test problems discussed. In addition to mathematical theory, it also includes numerous examples and computer implementations. The book is intended for advanced undergraduate, graduate, and postdoc students as well as young researchers. Readers should be familiar with the basic concepts of mathematical statistics typically covered in introductory statistics courses.

  20. Nonparametric Transfer Function Models

    Science.gov (United States)

    Liu, Jun M.; Chen, Rong; Yao, Qiwei

    2009-01-01

    In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584

  1. Trend Analysis of Pahang River Using Non-Parametric Analysis: Mann Kendalls Trend Test

    International Nuclear Information System (INIS)

    Nur Hishaam Sulaiman; Mohd Khairul Amri Kamarudin; Mohd Khairul Amri Kamarudin; Ahmad Dasuki Mustafa; Muhammad Azizi Amran; Fazureen Azaman; Ismail Zainal Abidin; Norsyuhada Hairoma

    2015-01-01

    Flood is common in Pahang, especially during the northeast monsoon season from November to February. Three river cross stations, Lubuk Paku, Sg. Yap and Temerloh, were selected as the study area. The stream flow and water level data were gathered from DID records. The data set for this study was analysed using a non-parametric analysis, the Mann-Kendall Trend Test. The results obtained from the stream flow and water level analysis indicate significantly positive trends for Lubuk Paku (0.001) and Sg. Yap (<0.0001) from 1972-2011, with p-values < 0.05. The Temerloh (0.178) data from 1963-2011 showed no trend for the stream flow parameter but a negative trend for the water level parameter. Hydrological patterns and trends are strongly affected by external factors such as the northeast monsoon season, which occurs in the South China Sea and affects Pahang from November to March. Other factors, such as development and management of the areas, may also have affected the data and results. Hydrological patterns are important indicators of river trends such as stream flow and water level, and can be used by local authorities for flood mitigation. (author)
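    As a rough illustration of the mechanics behind this record (not the study's code), the sketch below computes the classical Mann-Kendall S statistic and its normal approximation for a short, made-up stream-flow series; it omits the tie and autocorrelation corrections that real hydrological data often require.

```python
# Illustrative sketch only: the classical Mann-Kendall trend test for a
# univariate series such as annual stream flow (no tie/autocorrelation corrections).
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, standardized Z, and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    # Variance of S under the null hypothesis of no trend (no ties assumed)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected standardization
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    return s, z, p

# Hypothetical annual mean stream-flow values (made-up numbers, not DID data)
flow = [102, 110, 98, 120, 125, 118, 130, 128, 140, 138]
print(mann_kendall(flow))
```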

  2. Nonparametric functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yang, Jie; Wu, Rongling; Casella, George

    2009-03-01

    Functional mapping is a useful tool for mapping quantitative trait loci (QTL) that control dynamic traits. It incorporates mathematical aspects of biological processes into the mixture model-based likelihood setting for QTL mapping, thus increasing the power of QTL detection and the precision of parameter estimation. However, in many situations there is no obvious functional form and, in such cases, this strategy will not be optimal. Here we propose to use nonparametric function estimation, typically implemented with B-splines, to estimate the underlying functional form of phenotypic trajectories, and then construct a nonparametric test to find evidence of existing QTL. Using the representation of a nonparametric regression as a mixed model, the final test statistic is a likelihood ratio test. We consider two types of genetic maps: dense maps and general maps, and the power of nonparametric functional mapping is investigated through simulation studies and demonstrated by examples.

  3. A Bayesian approach to the analysis of quantal bioassay studies using nonparametric mixture models.

    Science.gov (United States)

    Fronczyk, Kassandra; Kottas, Athanasios

    2014-03-01

    We develop a Bayesian nonparametric mixture modeling framework for quantal bioassay settings. The approach is built upon modeling dose-dependent response distributions. We adopt a structured nonparametric prior mixture model, which induces a monotonicity restriction for the dose-response curve. Particular emphasis is placed on the key risk assessment goal of calibration for the dose level that corresponds to a specified response. The proposed methodology yields flexible inference for the dose-response relationship as well as for other inferential objectives, as illustrated with two data sets from the literature. © 2013, The International Biometric Society.

  4. Performances of non-parametric statistics in sensitivity analysis and parameter ranking

    International Nuclear Information System (INIS)

    Saltelli, A.

    1987-01-01

    Twelve parametric and non-parametric sensitivity analysis techniques are compared in the case of non-linear model responses. The test models used are taken from the long-term risk analysis for the disposal of high level radioactive waste in a geological formation. They describe the transport of radionuclides through a set of engineered and natural barriers from the repository to the biosphere and to man. The output data from these models are the dose rates affecting the maximum exposed individual of a critical group at a given point in time. All the techniques are applied to the output from the same Monte Carlo simulations, where a modified version of the Latin Hypercube method is used for the sample selection. Hypothesis testing is systematically applied to quantify the degree of confidence in the results given by the various sensitivity estimators. The estimators are ranked according to their robustness and stability, on the basis of two test cases. The conclusion is that no estimator can be considered the best from all points of view, and the use of more than one estimator in sensitivity analysis is recommended.
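    Among non-parametric estimators of this kind, rank-based measures such as the Spearman correlation are the simplest. The sketch below is illustrative only: plain uniform sampling stands in for a Latin Hypercube design, the model is a toy function, and the p-value is used as a crude confidence measure for the ranking.

```python
# Minimal sketch: a rank-based (non-parametric) sensitivity estimator applied
# to Monte Carlo output, with inputs ranked by the magnitude of the correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 500, 3
X = rng.uniform(size=(n, k))                        # sampled input parameters
y = X[:, 0] ** 2 + 0.1 * X[:, 1] + rng.normal(scale=0.05, size=n)  # toy model output

results = []
for j in range(k):
    rho, p = stats.spearmanr(X[:, j], y)            # rank correlation, robust to nonlinearity
    results.append((j, rho, p))

# Rank the input parameters by the magnitude of their correlation with the output
for j, rho, p in sorted(results, key=lambda t: -abs(t[1])):
    print(f"input {j}: Spearman rho = {rho:+.3f}, p = {p:.3g}")
```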

  5. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.

  6. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

    Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data that include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet Process Mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.

  7. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated … in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be treated with caution, given the complexity of human longevity…

  8. A survey of residual analysis and a new test of residual trend.

    Science.gov (United States)

    McDowell, J J; Calvin, Olivia L; Klapes, Bryan

    2016-05-01

    A survey of residual analysis in behavior-analytic research reveals that existing methods are problematic in one way or another. A new test for residual trends is proposed that avoids the problematic features of the existing methods. It entails fitting cubic polynomials to sets of residuals and comparing their effect sizes to those that would be expected if the sets of residuals were random. To this end, sampling distributions of effect sizes for fits of a cubic polynomial to random data were obtained by generating sets of random standardized residuals of various sizes, n. A cubic polynomial was then fitted to each set of residuals and its effect size was calculated. This yielded a sampling distribution of effect sizes for each n. To test for a residual trend in experimental data, the median effect size of cubic-polynomial fits to sets of experimental residuals can be compared to the median of the corresponding sampling distribution of effect sizes for random residuals using a sign test. An example from the literature, which entailed comparing mathematical and computational models of continuous choice, is used to illustrate the utility of the test. © 2016 Society for the Experimental Analysis of Behavior.
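    The procedure described in this record is concrete enough to sketch. The code below is my reading, not the authors' published implementation: it approximates the null sampling distribution of effect sizes (R^2) for cubic fits to random standardized residuals, then compares hypothetical experimental residual sets against its median with a sign test.

```python
# Rough sketch of the cubic-residual-trend test described above; all data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def cubic_r2(resid):
    """Effect size (R^2) of a cubic polynomial fitted to a residual series."""
    x = np.arange(len(resid))
    fitted = np.polyval(np.polyfit(x, resid, 3), x)
    ss_res = np.sum((resid - fitted) ** 2)
    ss_tot = np.sum((resid - resid.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def null_median_r2(n, n_sim=2000):
    """Median R^2 of cubic fits to random standardized residuals of length n."""
    return np.median([cubic_r2(rng.standard_normal(n)) for _ in range(n_sim)])

# Hypothetical experiment: ten sets of residuals of length 12, with a slight trend
residual_sets = [rng.standard_normal(12) + 0.05 * np.arange(12) for _ in range(10)]
null_med = null_median_r2(12)
above = sum(cubic_r2(r) > null_med for r in residual_sets)
# Sign test: with no residual trend, each set exceeds the null median with prob. 1/2
p_value = stats.binomtest(above, n=len(residual_sets), p=0.5).pvalue
print(f"{above}/{len(residual_sets)} sets above the null median R^2, sign test p = {p_value:.3f}")
```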

  9. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one׳s hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Parametric vs. Nonparametric Regression Modelling within Clinical Decision Support

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2017-01-01

    Roč. 5, č. 1 (2017), s. 21-27 ISSN 1805-8698 R&D Projects: GA ČR GA17-01251S Institutional support: RVO:67985807 Keywords : decision support systems * decision rules * statistical analysis * nonparametric regression Subject RIV: IN - Informatics, Computer Science OBOR OECD: Statistics and probability

  11. On Cooper's Nonparametric Test.

    Science.gov (United States)

    Schmeidler, James

    1978-01-01

    The basic assumption of Cooper's nonparametric test for trend (EJ 125 069) is questioned. It is contended that the proper assumption alters the distribution of the statistic and reduces its usefulness. (JKS)

  12. CATDAT - A program for parametric and nonparametric categorical data analysis user's manual, Version 1.0

    International Nuclear Information System (INIS)

    Peterson, James R.; Haas, Timothy C.; Lee, Danny C.

    2000-01-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical responses data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network

  13. Nonparametric Bayesian density estimation on manifolds with applications to planar shapes.

    Science.gov (United States)

    Bhattacharya, Abhishek; Dunson, David B

    2010-12-01

    Statistical analysis on landmark-based shape spaces has diverse applications in morphometrics, medical diagnostics, machine vision and other areas. These shape spaces are non-Euclidean quotient manifolds. To conduct nonparametric inferences, one may define notions of centre and spread on this manifold and work with their estimates. However, it is useful to consider full likelihood-based methods, which allow nonparametric estimation of the probability density. This article proposes a broad class of mixture models constructed using suitable kernels on a general compact metric space and then on the planar shape space in particular. Following a Bayesian approach with a nonparametric prior on the mixing distribution, conditions are obtained under which the Kullback-Leibler property holds, implying large support and weak posterior consistency. Gibbs sampling methods are developed for posterior computation, and the methods are applied to problems in density estimation and classification with shape-based predictors. Simulation studies show improved estimation performance relative to existing approaches.

  14. Nonparametric e-Mixture Estimation.

    Science.gov (United States)

    Takano, Ken; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2016-12-01

    This study considers the common situation in data analysis when there are few observations of the distribution of interest or the target distribution, while abundant observations are available from auxiliary distributions. In this situation, it is natural to compensate for the lack of data from the target distribution by using data sets from these auxiliary distributions, in other words, approximating the target distribution in a subspace spanned by a set of auxiliary distributions. Mixture modeling is one of the simplest ways to integrate information from the target and auxiliary distributions in order to express the target distribution as accurately as possible. There are two typical mixtures in the context of information geometry: the m- and e-mixtures. The m-mixture is applied in a variety of research fields because of the presence of the well-known expectation-maximization algorithm for parameter estimation, whereas the e-mixture is rarely used because of its difficulty of estimation, particularly for nonparametric models. The e-mixture, however, is a well-tempered distribution that satisfies the principle of maximum entropy. To model a target distribution with scarce observations accurately, this letter proposes a novel framework for a nonparametric modeling of the e-mixture and a geometrically inspired estimation algorithm. As numerical examples of the proposed framework, a transfer learning setup is considered. The experimental results show that this framework works well for three types of synthetic data sets, as well as an EEG real-world data set.

  15. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical survival right-censored data of about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on the one hand and to compare two basic surgery techniques in the context of risk of mortality on the other hand. The colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic operating techniques are used for the colectomy: either traditional (open) or minimally invasive (laparoscopic). The basic question arising at the colectomy operation is which type of operation to choose to guarantee a longer overall survival time. Two non-parametric approaches have been used to quantify the probability of mortality with uncertainties; in fact, the complement of the probability to one, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from both the Kaplan–Meier estimator of the survival function in connection with Greenwood's formula and the Nelson–Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function as well. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of a predictive survival function instead of the population survival function. The traditional log-rank test on the one hand and the nonparametric predictive comparison of two groups of lifetime data on the other hand have been compared to evaluate the risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion, that the minimally invasive operating technique guarantees the patient a significantly longer survival time in comparison with the traditional operating technique.
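    The first of the two approaches rests on textbook estimators. The sketch below is illustrative only, with made-up survival times rather than the Ostrava data: it computes the Kaplan–Meier product-limit estimate together with Greenwood standard errors for right-censored observations.

```python
# Illustrative sketch: Kaplan-Meier estimator with Greenwood's variance formula.
import numpy as np

def kaplan_meier(times, events):
    """Return event times, survival estimates, and Greenwood standard errors."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)          # 1 = event observed, 0 = censored
    order = np.argsort(times)
    times, events = times[order], events[order]

    surv, se, s, green = [], [], 1.0, 0.0
    uniq = np.unique(times[events == 1])
    for t in uniq:
        n_risk = np.sum(times >= t)                 # patients still at risk at t
        d = np.sum((times == t) & (events == 1))    # events occurring at t
        s *= 1.0 - d / n_risk                       # product-limit update
        green += d / (n_risk * (n_risk - d))        # Greenwood accumulation
        surv.append(s)
        se.append(s * np.sqrt(green))
    return uniq, np.array(surv), np.array(se)

# Hypothetical follow-up times (days) and event indicators
t = [5, 8, 12, 12, 20, 25, 30, 42]
e = [1, 1, 1, 0, 1, 0, 1, 0]
for ti, si, sei in zip(*kaplan_meier(t, e)):
    print(f"t = {ti:5.1f}   S(t) = {si:.3f}   (SE {sei:.3f})")
```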

  16. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  17. Scalable Bayesian nonparametric measures for exploring pairwise dependence via Dirichlet Process Mixtures.

    Science.gov (United States)

    Filippi, Sarah; Holmes, Chris C; Nieto-Barajas, Luis E

    2016-11-16

    In this article we propose novel Bayesian nonparametric methods using Dirichlet Process Mixture (DPM) models for detecting pairwise dependence between random variables while accounting for uncertainty in the form of the underlying distributions. A key criterion is that the procedures should scale to large data sets. In this regard we find that the formal calculation of the Bayes factor for a dependent-vs.-independent DPM joint probability measure is not feasible computationally. To address this we present Bayesian diagnostic measures for characterising evidence against a "null model" of pairwise independence. In simulation studies, as well as for a real data analysis, we show that our approach provides a useful tool for the exploratory nonparametric Bayesian analysis of large multivariate data sets.

  18. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  19. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    Gugushvili, S.; van der Meulen, F.; Spreij, P.

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context…

  20. A Bayesian nonparametric estimation of distributions and quantiles

    International Nuclear Information System (INIS)

    Poern, K.

    1988-11-01

    The report describes a Bayesian, nonparametric method for the estimation of a distribution function and its quantiles. The method, presupposing random sampling, is nonparametric, so the user has to specify a prior distribution on a space of distributions (and not on a parameter space). In the current application, where the method is used to estimate the uncertainty of a parametric calculational model, the Dirichlet prior distribution is to a large extent determined by the first batch of Monte Carlo realizations. In this case the results of the estimation technique are very similar to the conventional empirical distribution function. The resulting posterior distribution is also Dirichlet, and thus facilitates the determination of probability (confidence) intervals at any given point in the space of interest. Another advantage is that the posterior distribution of a specified quantile can also be derived and utilized to determine a probability interval for that quantile. The method was devised for use in the PROPER code package for uncertainty and sensitivity analysis. (orig.)
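    A hedged sketch of one way to realize this idea: with a Dirichlet-process prior DP(alpha, G0), the posterior over the distribution function is again a Dirichlet process, so posterior draws of F on a grid can be simulated from Dirichlet-distributed bin probabilities. The data, the concentration alpha, and the choice G0 = Uniform(0, 1) below are all illustrative assumptions, not the PROPER package's settings.

```python
# Sketch: finite-dimensional Dirichlet posterior over CDF values on a grid.
import numpy as np

rng = np.random.default_rng(2)
data = rng.beta(2, 5, size=40)            # stand-in for Monte Carlo realizations
alpha = 5.0                               # prior concentration parameter
grid = np.linspace(0.0, 1.0, 21)          # evaluation grid on [0, 1]

# Posterior mass per grid bin: alpha * G0-mass plus the observed counts
bins = np.diff(grid)
prior_mass = alpha * bins                 # uniform G0 => mass proportional to bin width
data_mass = np.histogram(data, bins=grid)[0]
post_mass = prior_mass + data_mass

# Draw posterior CDFs and summarize pointwise 90% probability intervals
draws = rng.dirichlet(post_mass, size=2000).cumsum(axis=1)
lo, med, hi = np.percentile(draws, [5, 50, 95], axis=0)
for x, l, m, h in zip(grid[1:], lo, med, hi):
    print(f"F({x:.2f}) ~ {m:.3f}  [{l:.3f}, {h:.3f}]")
```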

  1. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  2. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.

  3. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrarily to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  4. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Background: With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with normal functioning allograft. Results: The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that were differentially expressed in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist-diagnosed class labels. Conclusion: We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been…

  5. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  6. NONPARAMETRIC FIXED EFFECT PANEL DATA MODELS: RELATIONSHIP BETWEEN AIR POLLUTION AND INCOME FOR TURKEY

    Directory of Open Access Journals (Sweden)

    Rabia Ece OMAY

    2013-06-01

    In this study, the relationship between gross domestic product (GDP) per capita and sulfur dioxide (SO2) and particulate matter (PM10) per capita is modeled for Turkey. Nonparametric fixed effect panel data analysis is used for the modeling. The panel data cover 12 territories, at the first level of the Nomenclature of Territorial Units for Statistics (NUTS), for the period 1990-2001. In modeling the relationship between GDP and SO2 and PM10 for Turkey, the non-parametric models have given good results.

  7. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an already existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test makes it possible to approximate the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.

  8. Non-parametric smoothing of experimental data

    International Nuclear Information System (INIS)

    Kuketayev, A.T.; Pen'kov, F.M.

    2007-01-01

    Rapid processing of experimental data samples in nuclear physics often requires differentiation in order to find extrema. Therefore, even at the preliminary stage of data analysis, a range of noise reduction methods are used to smooth experimental data. There are many non-parametric smoothing techniques: interval averages, moving averages, exponential smoothing, etc. Nevertheless, it is more common to use a priori information about the behavior of the experimental curve in order to construct smoothing schemes based on least squares techniques. The latter methodology's advantage is that the area under the curve can be preserved, which is equivalent to conservation of the total counting rate. The disadvantages of this approach include the lack of a priori information. For example, very often the sums of peaks unresolved by a detector are replaced with one peak during the processing of data, introducing uncontrolled errors in the determination of the physical quantities. The problem can be handled only by experienced personnel, whose skills must greatly exceed the difficulty of the task. We propose a set of non-parametric techniques which allows the use of any additional information on the nature of the experimental dependence. The method is based on the construction of a functional which includes both the experimental data and the a priori information. The minimum of this functional is reached on a non-parametric smoothed curve. Euler (Lagrange) differential equations are constructed for these curves; then their solutions are obtained analytically or numerically. The proposed approach allows for automated processing of nuclear physics data, eliminating the need for highly skilled laboratory personnel. The proposed approach also makes it possible to obtain smoothing curves within a given confidence interval, e.g. according to the χ2 distribution. This approach is applicable when constructing smooth solutions of ill-posed problems, in particular when solving…
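    As one concrete instance of this functional-minimization idea (not the authors' method), the sketch below applies Whittaker-type smoothing, which minimizes a data-fidelity term plus a squared second-difference roughness penalty; a convenient side effect is that the total count (area under the curve) is preserved. The spectrum is synthetic.

```python
# Sketch: Whittaker smoothing, minimizing ||y - z||^2 + lam * ||D2 z||^2 over z.
import numpy as np

def whittaker_smooth(y, lam=100.0):
    """Solve (I + lam * D2^T D2) z = y for the smoothed curve z."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)             # second-difference operator
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 200)
counts = 100 * np.exp(-0.5 * ((x - 5) / 0.8) ** 2) + rng.normal(0, 5, 200)
smoothed = whittaker_smooth(counts, lam=50.0)
print("raw total count     :", round(counts.sum(), 1))
print("smoothed total count:", round(smoothed.sum(), 1))   # equal up to round-off
```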

  9. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo; Genton, Marc G.

    2013-01-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric

  10. Simple nonparametric checks for model data fit in CAT

    NARCIS (Netherlands)

    Meijer, R.R.

    2005-01-01

    In this paper, the usefulness of several nonparametric checks is discussed in a computerized adaptive testing (CAT) context. Although there is no tradition of nonparametric scalability in CAT, it can be argued that scalability checks can be useful to investigate, for example, the quality of item

  11. Surface Estimation, Variable Selection, and the Nonparametric Oracle Property.

    Science.gov (United States)

    Storlie, Curtis B; Bondell, Howard D; Reich, Brian J; Zhang, Hao Helen

    2011-04-01

    Variable selection for multivariate nonparametric regression is an important, yet challenging, problem due, in part, to the infinite dimensionality of the function space. An ideal selection procedure should be automatic, stable, easy to use, and have desirable asymptotic properties. In particular, we define a selection procedure to be nonparametric oracle (np-oracle) if it consistently selects the correct subset of predictors and at the same time estimates the smooth surface at the optimal nonparametric rate, as the sample size goes to infinity. In this paper, we propose a model selection procedure for nonparametric models, and explore the conditions under which the new method enjoys the aforementioned properties. Developed in the framework of smoothing spline ANOVA, our estimator is obtained via solving a regularization problem with a novel adaptive penalty on the sum of functional component norms. Theoretical properties of the new estimator are established. Additionally, numerous simulated and real examples further demonstrate that the new approach substantially outperforms other existing methods in the finite sample setting.

  12. Does Private Tutoring Work? The Effectiveness of Private Tutoring: A Nonparametric Bounds Analysis

    Science.gov (United States)

    Hof, Stefanie

    2014-01-01

    Private tutoring has become popular throughout the world. However, evidence for the effect of private tutoring on students' academic outcome is inconclusive; therefore, this paper presents an alternative framework: a nonparametric bounds method. The present examination uses, for the first time, a large representative data-set in a European setting…

  13. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

    The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection o

  14. Residual stress concerns in containment analysis

    International Nuclear Information System (INIS)

    Costantini, F.; Kulak, R. F.; Pfeiffer, P. A.

    1997-01-01

    The manufacturing of steel containment vessels starts with the forming of flat plates into curved plates. A steel containment structure is made by welding individual plates together to form the sections that make up the complex shaped vessels. The metal forming and welding process leaves residual stresses in the vessel walls. Generally, the effect of metal forming residual stresses can be reduced or virtually eliminated by thermally stress relieving the vessel. In large containment vessels this may not be practical and thus the residual stresses due to manufacturing may become important. The residual stresses could possibly affect the response of the vessel to internal pressurization. When the level of residual stresses is significant it will affect the vessel's response, for instance the yielding pressure and possibly the failure pressure. The paper will address the effect of metal forming residual stresses on the response of a generic pressure vessel to internal pressurization. A scoping analysis investigated the effect of residual forming stresses on the response of an internally pressurized vessel. A simple model was developed to gain understanding of the mechanics of the problem. Residual stresses due to the welding process were not considered in this investigation.

  15. A nonparametric spatial scan statistic for continuous data.

    Science.gov (United States)

    Jung, Inkyung; Cho, Ho Jin

    2015-10-20

    Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compared the performance of the method with parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
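    A loose sketch in the spirit of the rank-based scan described in this record (not the authors' implementation): scan circular windows over site locations, score each window with a Wilcoxon rank-sum statistic, and calibrate the maximum by permutation. The locations, values, radii, and planted cluster below are synthetic.

```python
# Sketch: a Wilcoxon-rank-sum-based spatial scan with a permutation p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
coords = rng.uniform(size=(100, 2))                          # site locations
values = rng.standard_normal(100)
values[(coords[:, 0] < 0.3) & (coords[:, 1] < 0.3)] += 1.5   # planted cluster

def max_scan_stat(vals):
    best = 0.0
    for center in coords:                                    # candidate window centers
        d = np.linalg.norm(coords - center, axis=1)
        for r in (0.1, 0.2, 0.3):                            # candidate window radii
            inside = d <= r
            if 1 < inside.sum() < len(vals) - 1:
                z = abs(stats.ranksums(vals[inside], vals[~inside]).statistic)
                best = max(best, z)
    return best

observed = max_scan_stat(values)
null = [max_scan_stat(rng.permutation(values)) for _ in range(99)]
p = (1 + sum(n >= observed for n in null)) / 100.0
print(f"max |Z| = {observed:.2f}, Monte Carlo p = {p:.2f}")
```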

  16. A nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, T.; Scheike, T. H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  17. Estimating technical efficiency in the hospital sector with panel data: a comparison of parametric and non-parametric techniques.

    Science.gov (United States)

    Siciliani, Luigi

    2006-01-01

    Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. Highest correlations are found in the efficiency scores between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.
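    As a pointer to how one of the non-parametric techniques compared here works, the sketch below computes an input-oriented DEA-CRS (CCR) efficiency score from a linear program; the hospital inputs and outputs are invented for illustration and are not the Italian sample.

```python
# Sketch: input-oriented DEA under constant returns to scale via linear programming.
import numpy as np
from scipy.optimize import linprog

# inputs: (staff, beds); output: (cases treated) for five hypothetical hospitals
X = np.array([[120, 300], [150, 280], [90, 200], [200, 450], [110, 260]], float)
Y = np.array([[4000], [4200], [2500], [6100], [3300]], float)
n = len(X)

def dea_crs(k):
    """Efficiency of unit k; decision variables are [theta, lambda_1, ..., lambda_n]."""
    c = np.r_[1.0, np.zeros(n)]                           # minimize theta
    # inputs:  X^T lambda <= theta * x_k   ->   -theta * x_k + X^T lambda <= 0
    A_in = np.c_[-X[k][:, None], X.T]
    b_in = np.zeros(X.shape[1])
    # outputs: Y^T lambda >= y_k           ->   -Y^T lambda <= -y_k
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for k in range(n):
    print(f"hospital {k}: CRS efficiency = {dea_crs(k):.3f}")
```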

  18. Nonparametric Mixture Models for Supervised Image Parcellation.

    Science.gov (United States)

    Sabuncu, Mert R; Yeo, B T Thomas; Van Leemput, Koen; Fischl, Bruce; Golland, Polina

    2009-09-01

    We present a nonparametric, probabilistic mixture model for the supervised parcellation of images. The proposed model yields segmentation algorithms conceptually similar to the recently developed label fusion methods, which register a new image with each training image separately. Segmentation is achieved via the fusion of transferred manual labels. We show that in our framework various settings of a model parameter yield algorithms that use image intensity information differently in determining the weight of a training subject during fusion. One particular setting computes a single, global weight per training subject, whereas another setting uses locally varying weights when fusing the training data. The proposed nonparametric parcellation approach capitalizes on recently developed fast and robust pairwise image alignment tools. The use of multiple registrations allows the algorithm to be robust to occasional registration failures. We report experiments on 39 volumetric brain MRI scans with expert manual labels for the white matter, cerebral cortex, ventricles and subcortical structures. The results demonstrate that the proposed nonparametric segmentation framework yields significantly better segmentation than state-of-the-art algorithms.

  19. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
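    A minimal sketch of the kind of one-sample statistic described in this record (my interpretation, not the authors' exact weighted test): compare the observed number of events with the number expected under a reference rate, standardized over the accumulated follow-up. The follow-up times, event counts, and reference rate are hypothetical.

```python
# Sketch: standardized distance between observed and expected recurrent-event counts.
import numpy as np
from scipy import stats

follow_up = np.array([1.0, 2.5, 3.0, 1.5, 2.0, 4.0, 2.2, 3.5])   # person-years
n_events = np.array([2, 5, 4, 1, 3, 7, 2, 6])                    # recurrent events per subject
ref_rate = 1.2                                  # reference events per person-year

observed = n_events.sum()
expected = ref_rate * follow_up.sum()
# Standardized distance; under a Poisson reference model Var(observed) = expected
z = (observed - expected) / np.sqrt(expected)
p = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"O = {observed}, E = {expected:.1f}, Z = {z:.2f}, p = {p:.3f}")
```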

  20. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
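    An illustrative sketch of a "leave-out-one-cycle" cross-validation score, in a simplified reading of the approach above: each cycle is predicted from the phase-wise means of the remaining cycles, and the integer period with the lowest prediction error is selected. The sinusoidal series is synthetic.

```python
# Sketch: leave-out-one-cycle CV for integer period estimation.
import numpy as np

rng = np.random.default_rng(5)
true_period = 7
n = 14 * true_period
y = np.sin(2 * np.pi * np.arange(n) / true_period) + 0.3 * rng.standard_normal(n)

def cv_score(y, p):
    """Mean squared prediction error when each cycle is left out in turn."""
    n_cycles = len(y) // p
    yc = y[: n_cycles * p].reshape(n_cycles, p)
    err = 0.0
    for c in range(n_cycles):
        others = np.delete(yc, c, axis=0).mean(axis=0)   # phase means without cycle c
        err += np.sum((yc[c] - others) ** 2)
    return err / (n_cycles * p)

scores = {p: cv_score(y, p) for p in range(2, 31)}
print("estimated period:", min(scores, key=scores.get))
```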

  1. Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Jiang, Jiancheng; Tong, Xin

    We propose a high dimensional classification method that involves nonparametric feature augmentation. Knowing that marginal density ratios are the most powerful univariate classifiers, we use the ratio estimates to transform the original feature measurements. Subsequently, penalized logistic regression is invoked, taking as input the newly transformed or augmented features. This procedure trains models equipped with local complexity and global simplicity, thereby avoiding the curse of dimensionality while creating a flexible nonlinear decision boundary. The resulting method is called Feature Augmentation via Nonparametrics and Selection (FANS). We motivate FANS by generalizing the Naive Bayes model, writing the log ratio of joint densities as a linear combination of those of marginal densities. It is related to generalized additive models, but has better interpretability and computability. Risk bounds are developed for FANS. In numerical analysis, FANS is compared with competing methods, so as to provide a guideline on its best application domain. Real data analysis demonstrates that FANS performs very competitively on benchmark email spam and gene expression data sets. Moreover, FANS is implemented by an extremely fast algorithm through parallel computing.
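    A hedged sketch of the FANS idea as I read it: estimate class-conditional marginal densities per feature, replace each feature by its estimated log density ratio, then fit a penalized logistic regression on the augmented features. The scikit-learn calls are standard; the bandwidth, penalty, and toy data are arbitrary illustrative choices rather than the authors' settings.

```python
# Sketch: marginal density-ratio feature augmentation followed by L1 logistic regression.
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n, p = 400, 10
X = rng.standard_normal((n, p))
y = (X[:, 0] ** 2 + X[:, 1] > 1).astype(int)        # nonlinear toy signal

def augment(X_train, y_train, X_eval, bandwidth=0.5):
    """Replace each feature by its estimated log marginal density ratio."""
    Z = np.empty_like(X_eval)
    for j in range(X_train.shape[1]):
        kd1 = KernelDensity(bandwidth=bandwidth).fit(X_train[y_train == 1, j:j + 1])
        kd0 = KernelDensity(bandwidth=bandwidth).fit(X_train[y_train == 0, j:j + 1])
        Z[:, j] = (kd1.score_samples(X_eval[:, j:j + 1])
                   - kd0.score_samples(X_eval[:, j:j + 1]))
    return Z

train, test = np.arange(300), np.arange(300, 400)
Z_train = augment(X[train], y[train], X[train])
Z_test = augment(X[train], y[train], X[test])
clf = LogisticRegression(penalty="l1", solver="liblinear").fit(Z_train, y[train])
print("held-out accuracy:", clf.score(Z_test, y[test]))
```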

  2. A non-parametric consistency test of the ΛCDM model with Planck CMB data

    Energy Technology Data Exchange (ETDEWEB)

    Aghamousa, Amir; Shafieloo, Arman [Korea Astronomy and Space Science Institute, Daejeon 305-348 (Korea, Republic of); Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr [School of Physics, The University of New South Wales, Sydney NSW 2052 (Australia)

    2017-09-01

    Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP-reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.
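    A rough sketch of the consistency-test idea, as a schematic illustration rather than the pipeline used for the Planck spectra: regress the residuals of the data with respect to a model's best fit using a Gaussian process, and check whether the reconstructed function deviates from zero beyond its uncertainty band.

```python
# Sketch: GP regression of residuals as a model-independent consistency check.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(7)
x = np.linspace(0, 10, 80)[:, None]
residuals = 0.1 * rng.standard_normal(80)        # data minus best-fit model (toy case)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(0.01))
gp.fit(x, residuals)
mean, std = gp.predict(x, return_std=True)

# Fraction of the reconstruction lying outside a 2-sigma band around zero;
# a large fraction would hint at unmodeled structure in the residuals.
outside = np.mean(np.abs(mean) > 2 * std)
print(f"fraction of the reconstruction outside 2 sigma of zero: {outside:.2f}")
```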

  3. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.

  4. Nonparametric Monitoring for Geotechnical Structures Subject to Long-Term Environmental Change

    Directory of Open Access Journals (Sweden)

    Hae-Bum Yun

    2011-01-01

    A nonparametric, data-driven methodology of monitoring for geotechnical structures subject to long-term environmental change is discussed. Avoiding physical assumptions or excessive simplification of the monitored structures, the nonparametric monitoring methodology presented in this paper provides reliable performance-related information particularly when the collection of sensor data is limited. For the validation of the nonparametric methodology, a field case study was performed using a full-scale retaining wall, which had been monitored for three years using three tilt gauges. Using the very limited sensor data, it is demonstrated that important performance-related information, such as drainage performance and sensor damage, could be disentangled from significant daily, seasonal and multiyear environmental variations. Extensive literature review on recent developments of parametric and nonparametric data processing techniques for geotechnical applications is also presented.

  5. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    This paper applies parametric and non-parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as output. The data cover 307 (out of 553) ...

  6. portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    mahsa ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is deciding which stocks to invest in and selecting an optimal portfolio. This process is based on assessing risk and expected return. In the portfolio selection problem, if asset returns are normally distributed, variance and standard deviation are used as the risk measure. In practice, however, expected returns on assets are not necessarily normal and can depart markedly from normality. This paper introduces conditional value at risk (CVaR) as a risk measure in a nonparametric framework, derives the optimal portfolio for a given expected return, and compares this method with the linear programming method. The data used in this study consist of monthly returns of 15 companies selected from the top 50 companies on the Tehran Stock Exchange during the winter of 1392, with return data covering April 1388 to June 1393. The results show the superiority of the nonparametric method over the linear programming method; the nonparametric method is also much faster.
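
    As a rough illustration of the risk measure involved, and not of the paper's data or exact procedure, the sketch below minimizes the empirical (nonparametric) CVaR of a portfolio subject to a target expected return. The return matrix, confidence level, and target are placeholders; the Rockafellar-Uryasev linear-programming formulation is the usual alternative to the direct numerical minimization used here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_assets = 15
returns = rng.normal(0.01, 0.05, size=(60, n_assets))   # hypothetical monthly returns

alpha = 0.95        # CVaR confidence level
target = 0.010      # required expected portfolio return

def cvar(weights):
    """Empirical (nonparametric) conditional value at risk of the portfolio."""
    losses = -returns @ weights
    var = np.quantile(losses, alpha)                 # empirical value at risk
    return losses[losses >= var].mean()              # average loss beyond the VaR

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
               {"type": "ineq", "fun": lambda w: returns.mean(axis=0) @ w - target}]
result = minimize(cvar, np.full(n_assets, 1.0 / n_assets),
                  bounds=[(0.0, 1.0)] * n_assets,
                  constraints=constraints, method="SLSQP")
print("weights:", np.round(result.x, 3))
print("portfolio CVaR:", round(cvar(result.x), 4))
```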

  7. Robustifying Bayesian nonparametric mixtures for count data.

    Science.gov (United States)

    Canale, Antonio; Prünster, Igor

    2017-03-01

    Our motivating application stems from surveys of natural populations and is characterized by large spatial heterogeneity in the counts, which makes parametric approaches to modeling local animal abundance too restrictive. We adopt a Bayesian nonparametric approach based on mixture models and innovate with respect to the popular Dirichlet process mixture of Poisson kernels by increasing the model flexibility at the level of both the kernel and the nonparametric mixing measure. This allows us to derive accurate and robust estimates of the distribution of local animal abundance and of the corresponding clusters. The application and a simulation study for different scenarios also yield some general methodological implications. Adding flexibility solely at the level of the mixing measure does not improve inferences, since its impact is severely limited by the rigidity of the Poisson kernel, with considerable consequences in terms of bias. However, once a kernel more flexible than the Poisson is chosen, inferences can be robustified by choosing a prior more general than the Dirichlet process. Therefore, to improve the performance of Bayesian nonparametric mixtures for count data one has to enrich the model simultaneously at both levels, the kernel and the mixing measure. © 2016, The International Biometric Society.

  8. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among...

  9. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Takamizawa, Hisashi, E-mail: takamizawa.hisashi@jaea.go.jp; Itoh, Hiroto, E-mail: ito.hiroto@jaea.go.jp; Nishiyama, Yutaka, E-mail: nishiyama.yutaka93@jaea.go.jp

    2016-10-15

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.
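
    The "infinite summation of normal distributions" behind such a BNP analysis can be sketched with a truncated stick-breaking construction. The snippet below is purely illustrative: the concentration parameter, truncation level, component priors, and DBTT-shift units are assumptions, and no posterior sampling over the surveillance database is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

def stick_breaking(concentration, truncation):
    """Truncated stick-breaking weights of a Dirichlet process."""
    betas = rng.beta(1.0, concentration, size=truncation)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remaining

K = 50                                    # truncation level of the "infinite" sum
weights = stick_breaking(concentration=2.0, truncation=K)
means = rng.normal(50.0, 40.0, size=K)    # cluster means (illustrative DBTT shifts)
sigmas = np.full(K, 10.0)                 # common within-cluster spread

def mixture_density(x):
    """Density of the (truncated) infinite mixture of normals."""
    x = np.atleast_1d(x)[:, None]
    comps = np.exp(-0.5 * ((x - means) / sigmas) ** 2) / (sigmas * np.sqrt(2.0 * np.pi))
    return comps @ weights

print("weights sum to", weights.sum())    # close to 1 for a large truncation level
print(mixture_density([0.0, 50.0, 150.0]))
```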

  10. Mapping allostery through computational glycine scanning and correlation analysis of residue-residue contacts.

    Science.gov (United States)

    Johnson, Quentin R; Lindsay, Richard J; Nellas, Ricky B; Fernandez, Elias J; Shen, Tongye

    2015-02-24

    Understanding allosteric mechanisms is essential for the physical control of molecular switches and downstream cellular responses. However, it is difficult to decode essential allosteric motions in a high-throughput scheme. A general two-pronged approach to performing automatic data reduction of simulation trajectories is presented here. The first step involves coarse-graining and identifying the most dynamic residue-residue contacts. The second step is performing principal component analysis of these contacts and extracting the large-scale collective motions expressed via these residue-residue contacts. We demonstrated the method using a protein complex of nuclear receptors. Using atomistic modeling and simulation, we examined the protein complex and a set of 18 glycine point mutations of residues that constitute the binding pocket of the ligand effector. The important motions that are responsible for the allostery are reported. In contrast to conventional induced-fit and lock-and-key binding mechanisms, a novel "frustrated-fit" binding mechanism of RXR for allosteric control was revealed.
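
    The two-step data reduction can be mimicked in a few lines: derive residue-residue contact time series from a trajectory, keep the most dynamic contacts, and run principal component analysis on them. In the sketch below the trajectory is a random stand-in, and the 8 Å contact cutoff and the 75th-percentile variance filter are arbitrary choices rather than the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in trajectory: distances (in angstroms) between residue pairs, frames x pairs.
distances = rng.normal(8.0, 2.0, size=(1000, 300))

# Step 1: coarse-grain to residue-residue contacts and keep the most dynamic ones.
contacts = (distances < 8.0).astype(float)              # 1 if the pair is in contact
variance = contacts.var(axis=0)
dynamic = contacts[:, variance > np.quantile(variance, 0.75)]

# Step 2: principal component analysis of the dynamic contact time series.
centered = dynamic - dynamic.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
collective_motion = centered @ vt[:2].T                  # projection onto PC1 and PC2

print("variance explained by the first two PCs:", np.round(explained[:2], 3))
```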

  11. Network structure exploration via Bayesian nonparametric models

    International Nuclear Information System (INIS)

    Chen, Y; Wang, X L; Xiang, X; Tang, B Z; Bu, J Z

    2015-01-01

    Complex networks provide a powerful mathematical representation of complex systems in nature and society. To understand complex networks, it is crucial to explore their internal structures, also called structural regularities. The task of network structure exploration is to determine how many groups there are in a complex network and how to group the nodes of the network. Most existing structure exploration methods need to specify either a group number or a certain type of structure when they are applied to a network. In the real world, however, both the group number and the type of structure that a network has are usually unknown in advance. To explore structural regularities in complex networks automatically, without any prior knowledge of the group number or the type of structure, we use Bayesian nonparametric theory to extend a probabilistic mixture model that can handle networks with any type of structure but requires a prespecified group number. The result is a novel Bayesian nonparametric model, called the Bayesian nonparametric mixture (BNPM) model. Experiments conducted on a large number of networks with different structures show that the BNPM model is able to explore structural regularities in networks automatically with stable, state-of-the-art performance. (paper)

  12. Bioprocess iterative batch-to-batch optimization based on hybrid parametric/nonparametric models.

    Science.gov (United States)

    Teixeira, Ana P; Clemente, João J; Cunha, António E; Carrondo, Manuel J T; Oliveira, Rui

    2006-01-01

    This paper presents a novel method for iterative batch-to-batch dynamic optimization of bioprocesses. The relationship between process performance and control inputs is established by means of hybrid grey-box models combining parametric and nonparametric structures. The bioreactor dynamics are defined by material balance equations, whereas the cell population subsystem is represented by an adjustable mixture of nonparametric and parametric models. Thus optimizations are possible without detailed mechanistic knowledge concerning the biological system. A clustering technique is used to supervise the reliability of the nonparametric subsystem during the optimization. Whenever the nonparametric outputs are unreliable, the objective function is penalized. The technique was evaluated with three simulation case studies. The overall results suggest that the convergence to the optimal process performance may be achieved after a small number of batches. The model unreliability risk constraint along with sampling scheduling are crucial to minimize the experimental effort required to attain a given process performance. In general terms, it may be concluded that the proposed method broadens the application of the hybrid parametric/nonparametric modeling technique to "newer" processes with higher potential for optimization.
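
    A stripped-down version of the hybrid idea, parametric material balances closed by a data-driven kinetic term, is sketched below. The interpolated growth-rate curve, yield coefficient, and initial conditions are invented for illustration; the paper's actual nonparametric subsystem, clustering-based reliability check, and batch-to-batch optimization loop are not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

# Nonparametric part: a specific growth rate "learned" from (hypothetical) batch data,
# represented by an interpolator instead of a mechanistic rate expression.
s_grid = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])        # substrate conc. (g/L)
mu_grid = np.array([0.0, 0.20, 0.32, 0.40, 0.45, 0.46])   # observed growth rate (1/h)
mu_hat = interp1d(s_grid, mu_grid, kind="cubic", fill_value="extrapolate")

yield_xs = 0.5    # parametric biomass-on-substrate yield from the material balance

def balances(t, y):
    """Parametric material balances closed with the nonparametric rate term."""
    biomass, substrate = y
    mu = float(mu_hat(max(substrate, 0.0)))
    return [mu * biomass, -mu * biomass / yield_xs]

solution = solve_ivp(balances, (0.0, 24.0), [0.1, 10.0])
print("final biomass and substrate:", np.round(solution.y[:, -1], 3))
```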

  13. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100

  14. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun

    2017-01-01

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100

  15. Nonparametric methods for volatility density estimation

    NARCIS (Netherlands)

    Es, van Bert; Spreij, P.J.C.; Zanten, van J.H.

    2009-01-01

    Stochastic volatility modelling of financial processes has become increasingly popular. The proposed models usually contain a stationary volatility process. We will motivate and review several nonparametric methods for estimation of the density of the volatility process. Both models based on

  16. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    capture the behavior of observed phenomena. Higher-order polynomial and finite-dimensional spline basis models allow for more complicated responses as the... flexibility as these are nonparametric (not constrained to any particular functional form). These should be useful in identifying nonstandard behavior via... the deviance Δ = −2 log(L_reduced / L_full) is defined in terms of the likelihood function L. For normal error, L_full = 1, and based on Eq. A-2, we have log

  17. Speaker Linking and Applications using Non-Parametric Hashing Methods

    Science.gov (United States)

    2016-09-08

    nonparametric estimate of a multivariate density function," The Annals of Mathematical Statistics, vol. 36, no. 3, pp. 1049–1051, 1965. [9] E. A. Patrick... Speaker Linking and Applications using Non-Parametric Hashing Methods, Douglas Sturim and William M. Campbell, MIT Lincoln Laboratory, Lexington, MA... with many approaches [1, 2]. For this paper, we focus on using i-vectors [2], but the methods apply to any embedding. For the task of speaker QBE and

  18. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Science.gov (United States)

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

    Market impact cost is the most significant portion of implicit transaction costs, and reducing it can lower the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models, namely neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of input variables. We collected a large amount of real single-transaction data for the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, on four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.
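
    A bare-bones version of such a model comparison is sketched below with scikit-learn. The synthetic features (relative order size, volatility, inverse turnover), the square-root-law target used to generate fake impact costs, and the model settings are all assumptions standing in for the Bloomberg transaction data and tuned models of the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
# Hypothetical per-trade features: relative order size, volatility, inverse turnover.
X = rng.lognormal(mean=0.0, sigma=0.5, size=(800, 3))
# Synthetic "impact" target with a square-root-law flavour plus noise (in bps).
y = 30.0 * np.sqrt(X[:, 0]) * X[:, 1] + rng.normal(0.0, 2.0, size=800)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
models = {"Gaussian process": GaussianProcessRegressor(alpha=1.0, normalize_y=True),
          "Support vector regression": SVR(C=10.0, epsilon=0.5)}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "MAE:", round(mean_absolute_error(y_test, model.predict(X_test)), 3))
```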

  19. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei; Carroll, Raymond J.; Maity, Arnab

    2011-01-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work

  20. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, Bayesian nonparametric (BNP) approaches have important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. BNP approaches play an ever-expanding role in biostatistical inference, from proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like the arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  1. Spectral Envelopes and Additive + Residual Analysis/Synthesis

    Science.gov (United States)

    Rodet, Xavier; Schwarz, Diemo

    The subject of this chapter is the estimation, representation, modification, and use of spectral envelopes in the context of sinusoidal-additive-plus-residual analysis/synthesis. A spectral envelope is an amplitude-vs-frequency function, which may be obtained from the envelope of a short-time spectrum (Rodet et al., 1987; Schwarz, 1998). [Precise definitions of such an envelope and short-time spectrum (STS) are given in Section 2.] The additive-plus-residual analysis/synthesis method is based on a representation of signals in terms of a sum of time-varying sinusoids and of a non-sinusoidal residual signal [e.g., see Serra (1989), Laroche et al. (1993), McAulay and Quatieri (1995), and Ding and Qian (1997)]. Many musical sound signals may be described as a combination of a nearly periodic waveform and colored noise. The nearly periodic part of the signal can be viewed as a sum of sinusoidal components, called partials, with time-varying frequency and amplitude. Such sinusoidal components are easily observed on a spectral analysis display (Fig. 5.1) as obtained, for instance, from a discrete Fourier transform.
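
    One common way to obtain an amplitude-vs-frequency envelope from a short-time spectrum is cepstral smoothing; the sketch below is a generic illustration of that idea rather than the estimation methods discussed in the chapter, and the synthetic frame, sample rate, and cepstral order are arbitrary choices.

```python
import numpy as np

def spectral_envelope(frame, sample_rate, n_cepstral=40):
    """Amplitude-vs-frequency envelope of one analysis frame, estimated by
    low-order cepstral smoothing of the short-time log-magnitude spectrum."""
    windowed = frame * np.hanning(len(frame))
    log_mag = np.log(np.abs(np.fft.rfft(windowed)) + 1e-12)
    cepstrum = np.fft.irfft(log_mag)
    cepstrum[n_cepstral:-n_cepstral] = 0.0            # keep only slow spectral variation
    envelope = np.exp(np.fft.rfft(cepstrum).real)
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return freqs, envelope

# Hypothetical quasi-periodic frame: a few sinusoidal partials plus a noisy residual.
sr, n = 16000, 1024
t = np.arange(n) / sr
partials = [(1.0, 220.0), (0.5, 440.0), (0.25, 660.0)]
frame = sum(a * np.sin(2.0 * np.pi * f * t) for a, f in partials)
frame += 0.01 * np.random.default_rng(4).normal(size=n)

freqs, env = spectral_envelope(frame, sr)
print("envelope evaluated at", len(freqs), "frequencies up to", freqs[-1], "Hz")
```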

  2. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    Science.gov (United States)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.

  3. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are point...

  4. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

    Full Text Available Market impact cost is the most significant portion of implicit transaction costs, and reducing it can lower the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models, namely neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of input variables. We collected a large amount of real single-transaction data for the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, on four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.

  5. Tank 12H residuals sample analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Shine, E. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Diprete, D. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hay, M. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-06-11

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 12H final characterization samples to determine the residual tank inventory prior to grouting. Eleven Tank 12H floor and mound residual material samples and three cooling coil scrape samples were collected and delivered to SRNL between May and August of 2014.

  6. Single molecule force spectroscopy at high data acquisition: A Bayesian nonparametric analysis

    Science.gov (United States)

    Sgouralis, Ioannis; Whitmore, Miles; Lapidus, Lisa; Comstock, Matthew J.; Pressé, Steve

    2018-03-01

    Bayesian nonparametrics (BNPs) are poised to have a deep impact in the analysis of single molecule data as they provide posterior probabilities over entire models consistent with the supplied data, not just model parameters of one preferred model. Thus they provide an elegant and rigorous solution to the difficult problem encountered when selecting an appropriate candidate model. Nevertheless, BNPs' flexibility to learn models and their associated parameters from experimental data is a double-edged sword. Most importantly, BNPs are prone to increasing the complexity of the estimated models due to artifactual features present in time traces. Thus, because of experimental challenges unique to single molecule methods, naive application of available BNP tools is not possible. Here we consider traces with time correlations and, as a specific example, we deal with force spectroscopy traces collected at high acquisition rates. While high acquisition rates are required in order to capture dwells in short-lived molecular states, in this setup, a slow response of the optical trap instrumentation (i.e., trapped beads, ambient fluid, and tethering handles) distorts the molecular signals introducing time correlations into the data that may be misinterpreted as true states by naive BNPs. Our adaptation of BNP tools explicitly takes into consideration these response dynamics, in addition to drift and noise, and makes unsupervised time series analysis of correlated single molecule force spectroscopy measurements possible, even at acquisition rates similar to or below the trap's response times.

  7. Residual gas analysis

    International Nuclear Information System (INIS)

    Berecz, I.

    1982-01-01

    Determination of the residual gas composition in vacuum systems by a special mass spectrometric method was presented. The quadrupole mass spectrometer (QMS) and its application in thin film technology was discussed. Results, partial pressure versus time curves as well as the line spectra of the residual gases in case of the vaporization of a Ti-Pd-Au alloy were demonstrated together with the possible construction schemes of QMS residual gas analysers. (Sz.J.)

  8. QA/QC in pesticide residue analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Agrochemicals Unit, Agency's Laboratories, Seibersdorf (Austria)]

    2002-07-01

    This paper outlines problems encountered in pesticide residue analysis in a regulatory laboratory, relating to: the availability of reference materials, as over 1000 pesticide active ingredients are currently in use and over 400 crops represent a large part of a healthy diet; analysis time; the availability of samples in sufficient numbers; and uncertainties of the procedures.

  9. QA/QC in pesticide residue analysis

    International Nuclear Information System (INIS)

    Ambrus, A.

    2002-01-01

    This paper outlines problems encountered in pesticide residue analysis in a regulatory laboratory, relating to: the availability of reference materials, as over 1000 pesticide active ingredients are currently in use and over 400 crops represent a large part of a healthy diet; analysis time; the availability of samples in sufficient numbers; and uncertainties of the procedures.

  10. Multi-sample nonparametric treatments comparison in medical ...

    African Journals Online (AJOL)

    Multi-sample nonparametric treatments comparison in medical follow-up study with unequal observation processes through simulation and bladder tumour case study. P. L. Tan, N.A. Ibrahim, M.B. Adam, J. Arasan ...

  11. Nonparametric regression using the concept of minimum energy

    International Nuclear Information System (INIS)

    Williams, Mike

    2011-01-01

    It has recently been shown that an unbinned distance-based statistic, the energy, can be used to construct an extremely powerful nonparametric multivariate two sample goodness-of-fit test. An extension to this method that makes it possible to perform nonparametric regression using multiple multivariate data sets is presented in this paper. The technique, which is based on the concept of minimizing the energy of the system, permits determination of parameters of interest without the need for parametric expressions of the parent distributions of the data sets. The application and performance of this new method is discussed in the context of some simple example analyses.
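
    The energy statistic that the regression method builds on is straightforward to compute directly. The sketch below evaluates the two-sample energy distance together with a small permutation test on made-up Gaussian samples; it illustrates the statistic itself, not the minimum-energy parameter-fitting procedure described in the paper.

```python
import numpy as np

def energy_distance(x, y):
    """Energy distance between two multivariate samples (rows = observations)."""
    def mean_dist(a, b):
        diff = a[:, None, :] - b[None, :, :]
        return np.sqrt((diff ** 2).sum(axis=-1)).mean()
    return 2.0 * mean_dist(x, y) - mean_dist(x, x) - mean_dist(y, y)

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, size=(200, 3))
y = rng.normal(0.3, 1.0, size=(200, 3))      # shifted mean in every coordinate

observed = energy_distance(x, y)

# Permutation test: shuffle pooled labels to approximate the null distribution.
pooled = np.vstack([x, y])
null = np.empty(200)
for i in range(len(null)):
    perm = rng.permutation(len(pooled))
    null[i] = energy_distance(pooled[perm[:200]], pooled[perm[200:]])
p_value = np.mean(null >= observed)
print("energy distance:", round(observed, 4), "permutation p-value:", p_value)
```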

  12. Residual stress analysis in thick uranium films

    International Nuclear Information System (INIS)

    Hodge, A.M.; Foreman, R.J.; Gallegos, G.F.

    2005-01-01

    Residual stress analysis was performed on thick, 1-25 μm, depleted uranium (DU) films deposited on an Al substrate by magnetron sputtering. Two distinct characterization techniques were used to measure substrate curvature before and after deposition. Stress evaluation was performed using the Benabdi/Roche equation, which is based on beam theory of a bi-layer material. The residual stress evolution was studied as a function of coating thickness and applied negative bias voltage (0, -200, -300 V). The stresses developed were always compressive; however, increasing the coating thickness and applying a bias voltage presented a trend towards more tensile stresses and thus an overall reduction of residual stresses

  13. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  14. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  15. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis; namely our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes...... of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both...... instantaneous and convolutive mixing, and the inferred temporal patterns. Spatial maps are seen to capture smooth and localized stimuli-related components, and often identifiable noise components. The implementation is freely available as a GUI/SPM plugin, and we recommend using GPICA as an additional tool when...

  16. Essays on nonparametric econometrics of stochastic volatility

    NARCIS (Netherlands)

    Zu, Y.

    2012-01-01

    Volatility is a concept that describes the variation of financial returns. Measuring and modelling volatility dynamics is an important aspect of financial econometrics. This thesis is concerned with nonparametric approaches to volatility measurement and volatility model validation.

  17. Bayesian Nonparametric Hidden Markov Models with application to the analysis of copy-number-variation in mammalian genomes.

    Science.gov (United States)

    Yau, C; Papaspiliopoulos, O; Roberts, G O; Holmes, C

    2011-01-01

    We consider the development of Bayesian Nonparametric methods for product partition models such as Hidden Markov Models and change point models. Our approach uses a Mixture of Dirichlet Process (MDP) model for the unknown sampling distribution (likelihood) for the observations arising in each state and a computationally efficient data augmentation scheme to aid inference. The method uses novel MCMC methodology which combines recent retrospective sampling methods with the use of slice sampler variables. The methodology is computationally efficient, both in terms of MCMC mixing properties, and robustness to the length of the time series being investigated. Moreover, the method is easy to implement requiring little or no user-interaction. We apply our methodology to the analysis of genomic copy number variation.

  18. Non-parametric Tuning of PID Controllers A Modified Relay-Feedback-Test Approach

    CERN Document Server

    Boiko, Igor

    2013-01-01

    The relay feedback test (RFT) has become a popular and efficient tool used in process identification and automatic controller tuning. Non-parametric Tuning of PID Controllers couples new modifications of the classical RFT with application-specific optimal tuning rules to form a non-parametric method of test-and-tuning. Test and tuning are coordinated through a set of common parameters so that a PID controller can obtain the desired gain or phase margins in a system exactly, even with unknown process dynamics. The concept of process-specific optimal tuning rules in the nonparametric setup, with corresponding tuning rules for flow, level, pressure, and temperature control loops, is presented in the text. Common problems of tuning accuracy based on parametric and non-parametric approaches are addressed. In addition, the text treats the parametric approach to tuning based on the modified RFT approach and the exact model of oscillations in the system under test using the locus of a perturbed relay system (LPRS) meth...

  19. Nonparametric Bayes Modeling of Multivariate Categorical Data.

    Science.gov (United States)

    Dunson, David B; Xing, Chuanhua

    2012-01-01

    Modeling of multivariate unordered categorical (nominal) data is a challenging problem, particularly in high dimensions and cases in which one wishes to avoid strong assumptions about the dependence structure. Commonly used approaches rely on the incorporation of latent Gaussian random variables or parametric latent class models. The goal of this article is to develop a nonparametric Bayes approach, which defines a prior with full support on the space of distributions for multiple unordered categorical variables. This support condition ensures that we are not restricting the dependence structure a priori. We show this can be accomplished through a Dirichlet process mixture of product multinomial distributions, which is also a convenient form for posterior computation. Methods for nonparametric testing of violations of independence are proposed, and the methods are applied to model positional dependence within transcription factor binding motifs.

  20. Geostatistical radar-raingauge combination with nonparametric correlograms: methodological considerations and application in Switzerland

    Science.gov (United States)

    Schiemann, R.; Erdin, R.; Willi, M.; Frei, C.; Berenguer, M.; Sempere-Torres, D.

    2011-05-01

    Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse realtime network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation
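
    The relationship exploited above, that a nonparametric correlogram combined with the sample variance yields a semivariogram estimate, gamma(h) = s^2 (1 - rho(h)), can be illustrated on any spatially complete gridded field. The sketch below computes an isotropic lag-wise correlogram from a synthetic smoothed field; the field, smoothing kernel, and maximum lag are placeholders, and no Kriging step is shown.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(6)
# Synthetic "spatially complete" field: white noise smoothed to induce correlation.
field = convolve2d(rng.normal(size=(60, 60)), np.ones((5, 5)) / 25.0,
                   mode="same", boundary="symm")

def correlogram(field, max_lag):
    """Isotropic nonparametric correlogram of a complete gridded field."""
    f = field - field.mean()
    var = (f ** 2).mean()
    lags = np.arange(1, max_lag + 1)
    rho = np.empty(max_lag)
    for k in lags:
        cov_x = (f[:, :-k] * f[:, k:]).mean()     # east-west pairs at lag k
        cov_y = (f[:-k, :] * f[k:, :]).mean()     # north-south pairs at lag k
        rho[k - 1] = 0.5 * (cov_x + cov_y) / var
    return lags, rho

lags, rho = correlogram(field, max_lag=15)
semivariogram = field.var() * (1.0 - rho)          # gamma(h) = s^2 * (1 - rho(h))
print(np.round(np.column_stack((lags, rho, semivariogram)), 3)[:5])
```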

  1. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  2. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  3. Analysis of residual stresses in a long hollow cylinder

    International Nuclear Information System (INIS)

    Tokovyy, Yuriy V.; Ma, Chien-Ching

    2011-01-01

    This paper presents an analytical method for solving the axisymmetric stress problem for a long hollow cylinder subjected to locally-distributed residual (incompatible) strains. This method is based on direct integration of the equilibrium and compatibility equations, which thereby have been reduced to the set of two governing equations for two key functions with corresponding boundary and integral conditions. The governing equations were solved by making use of the Fourier integral transformation. Application of the method is illustrated with an analysis of the welding residual stresses in a butt-welded thick-walled pipe. Highlights: A solution to the axisymmetric stress problem for a hollow cylinder is constructed. The cylinder is subjected to a field of locally-distributed residual strains. The method is based on direct integration of the equilibrium equations. An application of our solution to analysis of welding residual stresses is considered.

  4. Stochastic semi-nonparametric frontier estimation of electricity distribution networks: Application of the StoNED method in the Finnish regulatory model

    International Nuclear Information System (INIS)

    Kuosmanen, Timo

    2012-01-01

    Electricity distribution network is a prime example of a natural local monopoly. In many countries, electricity distribution is regulated by the government. Many regulators apply frontier estimation techniques such as data envelopment analysis (DEA) or stochastic frontier analysis (SFA) as an integral part of their regulatory framework. While more advanced methods that combine a nonparametric frontier with a stochastic error term are known in the literature, in practice regulators continue to apply simplistic methods. This paper reports the main results of the project commissioned by the Finnish regulator for further development of the cost frontier estimation in their regulatory framework. The key objectives of the project were to integrate a stochastic SFA-style noise term into the nonparametric, axiomatic DEA-style cost frontier, and to take the heterogeneity of firms and their operating environments better into account. To achieve these objectives, a new method called stochastic nonparametric envelopment of data (StoNED) was examined. Based on the insights and experiences gained in the empirical analysis using the real data of the regulated networks, the Finnish regulator adopted the StoNED method for use from 2012 onwards.

  5. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    Science.gov (United States)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results support the fact that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.

  6. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  7. CATDAT : A Program for Parametric and Nonparametric Categorical Data Analysis : User's Manual Version 1.0, 1998-1999 Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-12-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical response data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.

  8. Nonparametric instrumental regression with non-convex constraints

    International Nuclear Information System (INIS)

    Grasmair, M; Scherzer, O; Vanhems, A

    2013-01-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition. (paper)

  9. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  10. Residual Displacements' Progressive Analysis of the Multisupported Beam

    Directory of Open Access Journals (Sweden)

    Liudas Liepa

    2014-12-01

    Full Text Available This paper focuses on the shakedown behaviour of ideally elasto-plastic beam systems under variable repeated load. The mathematical models of the analysis problems are created using numerical methods, extremum energy principles and mathematical programming. It is shown that during the shakedown process the residual displacements vary non-monotonically. By solving the analysis problem, where the load locus is progressively expanded, it is possible to determine the upper and lower bounds of the residual displacements. The suggested methods are illustrated by solving a multi-supported beam example problem. The results are obtained under the assumption of small displacements.

  11. Nonparametric conditional predictive regions for time series

    NARCIS (Netherlands)

    de Gooijer, J.G.; Zerom Godefay, D.

    2000-01-01

    Several nonparametric predictors based on the Nadaraya-Watson kernel regression estimator have been proposed in the literature. They include the conditional mean, the conditional median, and the conditional mode. In this paper, we consider three types of predictive regions for these predictors — the

  12. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability (IS) and intradaily variability (IV), used to describe the rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by varying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) across the time intervals. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to detect a significant difference in the fragmentation patterns of stroke patients using the IVm variable, whereas no difference was identified with IV60. Rhythmic synchronization of activity and rest was significantly higher in young subjects than in adults with Parkinson's disease when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep-wake cycle fragmentation and synchronization.
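
    The two indices are simple to compute at the classical hourly resolution. The sketch below uses the standard definitions, IS as the variance of the average 24-h profile relative to the total variance and IV as the mean squared successive difference relative to the total variance, on a synthetic activity series; the data and the hourly binning are illustrative assumptions, and the paper's multi-interval variants (IVm, ISm) are not reproduced.

```python
import numpy as np

def interdaily_stability(x, period=24):
    """IS: variance of the average daily profile relative to the total variance."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % period                  # use only whole days
    days = x[:n].reshape(-1, period)
    profile = days.mean(axis=0)
    return ((profile - x[:n].mean()) ** 2).mean() / x[:n].var()

def intradaily_variability(x):
    """IV: mean squared successive difference relative to the total variance."""
    x = np.asarray(x, dtype=float)
    return np.mean(np.diff(x) ** 2) / x.var()

# Hypothetical hourly activity counts over ten days with a clear rest-activity rhythm.
rng = np.random.default_rng(7)
hours = np.arange(240)
activity = np.clip(50.0 * np.sin(2.0 * np.pi * hours / 24.0) + 60.0
                   + rng.normal(0.0, 20.0, size=240), 0.0, None)

print("IS:", round(interdaily_stability(activity), 3),
      "IV:", round(intradaily_variability(activity), 3))
```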

  13. A multi-instrument non-parametric reconstruction of the electron pressure profile in the galaxy cluster CLJ1226.9+3332

    Science.gov (United States)

    Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-04-01

    Context. In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions will provide insights into the thermodynamic state of the intracluster medium. Aim. We seek to recover the non-parametric pressure profiles of the high redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions to the cluster outskirts: 0.05 R500 generation of SZ instruments such as NIKA2 and MUSTANG2.

  14. Screen Wars, Star Wars, and Sequels: Nonparametric Reanalysis of Movie Profitability

    OpenAIRE

    W. D. Walls

    2012-01-01

    In this paper we use nonparametric statistical tools to quantify motion-picture profit. We quantify the unconditional distribution of profit, the distribution of profit conditional on stars and sequels, and we also model the conditional expectation of movie profits using a non-parametric data-driven regression model. The flexibility of the non-parametric approach accommodates the full range of possible relationships among the variables without prior specification of a functional form, thereb...

  15. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on the introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere

  16. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  17. Nonparametric estimation in models for unobservable heterogeneity

    OpenAIRE

    Hohmann, Daniel

    2014-01-01

    Nonparametric models which allow for data with unobservable heterogeneity are studied. The first publication introduces new estimators and their asymptotic properties for conditional mixture models. The second publication considers estimation of a function from noisy observations of its Radon transform in a Gaussian white noise model.

  18. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.; Lombard, F.

    2012-01-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal

  19. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  20. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    Full Text Available We review a nonparametric version of Amari’s information geometry in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.

  1. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  2. Proximate and Ultimate Analysis of Fuel Pellets from Oil Palm Residues

    African Journals Online (AJOL)

    HOD

    Keywords: Oil Palm Residues, Fuel Pellets, Proximate Analysis, Ultimate Analysis.

  3. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  4. Bayesian nonparametric modeling for comparison of single-neuron firing intensities.

    Science.gov (United States)

    Kottas, Athanasios; Behseta, Sam

    2010-03-01

    We propose a fully inferential model-based approach to the problem of comparing the firing patterns of a neuron recorded under two distinct experimental conditions. The methodology is based on nonhomogeneous Poisson process models for the firing times of each condition with flexible nonparametric mixture prior models for the corresponding intensity functions. We demonstrate posterior inferences from a global analysis, which may be used to compare the two conditions over the entire experimental time window, as well as from a pointwise analysis at selected time points to detect local deviations of firing patterns from one condition to another. We apply our method on two neurons recorded from the primary motor cortex area of a monkey's brain while performing a sequence of reaching tasks.

  5. Multivariate nonparametric regression and visualization with R and applications to finance

    CERN Document Server

    Klemelä, Jussi

    2014-01-01

    A modern approach to statistical learning and its applications through visualization methods With a unique and innovative presentation, Multivariate Nonparametric Regression and Visualization provides readers with the core statistical concepts to obtain complete and accurate predictions when given a set of data. Focusing on nonparametric methods to adapt to the multiple types of data generatingmechanisms, the book begins with an overview of classification and regression. The book then introduces and examines various tested and proven visualization techniques for learning samples and functio

  6. Robust variable selection method for nonparametric differential equation models with application to nonlinear dynamic gene regulatory network analysis.

    Science.gov (United States)

    Lu, Tao

    2016-01-01

    The gene regulation network (GRN) captures the interactions between genes, and GRN analysis looks for models that describe gene expression behavior. These models have many applications; for instance, by characterizing the gene expression mechanisms that cause certain disorders, it would be possible to target those genes to block the progress of the disease. Many biological processes are driven by nonlinear dynamic GRNs. In this article, we propose a nonparametric ordinary differential equation (ODE) model for the nonlinear dynamic GRN. Specifically, we address the following questions simultaneously: (i) extract information from noisy time course gene expression data; (ii) model the nonlinear ODE through a nonparametric smoothing function; (iii) identify the important regulatory gene(s) through a group smoothly clipped absolute deviation (SCAD) approach; (iv) test the robustness of the model against possible shortening of experimental duration. We illustrate the usefulness of the model and associated statistical methods through a simulation and a real application example.

  7. Bayesian nonparametric system reliability using sets of priors

    NARCIS (Netherlands)

    Walter, G.M.; Aslett, L.J.M.; Coolen, F.P.A.

    2016-01-01

    An imprecise Bayesian nonparametric approach to system reliability with multiple types of components is developed. This allows modelling partial or imperfect prior knowledge on component failure distributions in a flexible way through bounds on the functioning probability. Given component level test

  8. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes on the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non-parametric...... and non-supervised approach, based on the Fisher-Jenks optimal classification algorithm, is used to identify multi-scale meteorological droughts on the basis of empirical cumulative distributions of 1, 3, 6, and 12-monthly precipitation totals. As input data for the classifier, we use the gridded GPCC...... for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009...

  9. Research Progress on Pesticide Residue Analysis Techniques in Agro-products

    Directory of Open Access Journals (Sweden)

    HE Ze-ying

    2016-07-01

    Full Text Available There are constant occurrences of acute pesticide poisoning among consumers, as well as pesticide residue violations in agro-product import/export trading. Pesticide residue analysis is an important way to protect food safety and the interests of import/export enterprises. There has been rapid development in pesticide residue analysis techniques in recent years. In this review, the research progress of the past five years is discussed with respect to sample preparation and instrumental determination. The application, modification and development of the QuEChERS method in sample preparation and the application of tandem mass spectrometry and high resolution mass spectrometry are reviewed, and the implications for the future of the field are discussed.

  10. Nonparametric test of consistency between cosmological models and multiband CMB measurements

    Energy Technology Data Exchange (ETDEWEB)

    Aghamousa, Amir [Asia Pacific Center for Theoretical Physics, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Shafieloo, Arman, E-mail: amir@apctp.org, E-mail: shafieloo@kasi.re.kr [Korea Astronomy and Space Science Institute, Daejeon 305-348 (Korea, Republic of)

    2015-06-01

    We present a novel nonparametric approach to test the consistency of cosmological models with multiband CMB data. In our analysis we calibrate the REACT (Risk Estimation and Adaptation after Coordinate Transformation) confidence levels associated with distances in function space (confidence distances) based on Monte Carlo simulations in order to test the consistency of an assumed cosmological model with observation. To show the applicability of our algorithm, we confront Planck 2013 temperature data with the concordance model of cosmology, considering two different Planck spectra combinations. In order to have an accurate quantitative statistical measure to compare the data with the theoretical expectations, we calibrate REACT confidence distances and perform a bias control using many realizations of the data. Our results using Planck 2013 temperature data put the best fit ΛCDM model at 95% (∼ 2σ) confidence distance from the center of the nonparametric confidence set, while repeating the analysis excluding the Planck 217 × 217 GHz spectrum data, the best fit ΛCDM model shifts to 70% (∼ 1σ) confidence distance. The most prominent features in the data deviating from the best fit ΛCDM model seem to be at low multipoles 18 < ℓ < 26 at greater than 2σ, ℓ ∼ 750 at ∼1 to 2σ, and ℓ ∼ 1800 at greater than 2σ level. Excluding the 217×217 GHz spectrum, the feature at ℓ ∼ 1800 becomes substantially less significant, at ∼1 to 2σ confidence level. Results of our analysis based on the new approach we propose in this work are in agreement with other analyses done using alternative methods.

  11. Teaching Nonparametric Statistics Using Student Instrumental Values.

    Science.gov (United States)

    Anderson, Jonathan W.; Diddams, Margaret

    Nonparametric statistics are often difficult to teach in introduction to statistics courses because of the lack of real-world examples. This study demonstrated how teachers can use differences in the rankings and ratings of undergraduate and graduate values to discuss: (1) ipsative and normative scaling; (2) uses of the Mann-Whitney U-test; and…

  12. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei

    2011-07-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work was originally motivated by a unique testing problem in genetic epidemiology (Chatterjee, et al., 2006) that involved a typical generalized linear model but with an additional term reminiscent of the Tukey one-degree-of-freedom formulation, and their interest was in testing for main effects of the genetic variables, while gaining statistical power by allowing for a possible interaction between genes and the environment. Later work (Maity, et al., 2009) involved the possibility of modeling the environmental variable nonparametrically, but they focused on whether there was a parametric main effect for the genetic variables. In this paper, we consider the complementary problem, where the interest is in testing for the main effect of the nonparametrically modeled environmental variable. We derive a generalized likelihood ratio test for this hypothesis, show how to implement it, and provide evidence that our method can improve statistical power when compared to standard partially linear models with main effects only. We use the method for the primary purpose of analyzing data from a case-control study of colorectal adenoma.

  13. Smooth semi-nonparametric (SNP) estimation of the cumulative incidence function.

    Science.gov (United States)

    Duc, Anh Nguyen; Wolbers, Marcel

    2017-08-15

    This paper presents a novel approach to estimation of the cumulative incidence function in the presence of competing risks. The underlying statistical model is specified via a mixture factorization of the joint distribution of the event type and the time to the event. The time to event distributions conditional on the event type are modeled using smooth semi-nonparametric densities. One strength of this approach is that it can handle arbitrary censoring and truncation while relying on mild parametric assumptions. A stepwise forward algorithm for model estimation and adaptive selection of smooth semi-nonparametric polynomial degrees is presented, implemented in the statistical software R, evaluated in a sequence of simulation studies, and applied to data from a clinical trial in cryptococcal meningitis. The simulations demonstrate that the proposed method frequently outperforms both parametric and nonparametric alternatives. They also support the use of 'ad hoc' asymptotic inference to derive confidence intervals. An extension to regression modeling is also presented, and its potential and challenges are discussed. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  14. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. In the ESReDA 20th seminar, a new nonparametric method was proposed. The major point of that paper is how to use censored data efficiently. Generally, there are three kinds of approaches to estimate a reliability function in a nonparametric way, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have some limits, so we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by the process of differentiation. It is well known that the three methods generally used to estimate a reliability function nonparametrically have maximum likelihood estimators that exist uniquely. Therefore, the MLE of the new method is derived in this study. The procedure to calculate the MLE is similar to that of the PL-estimator. The difference between the two is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL-estimator it does not.
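
    For reference, the Product-Limit (Kaplan-Meier) estimator mentioned above can be sketched in a few lines; this is the standard estimator, not the modified weighting scheme the record proposes, and the data values are invented.

```python
import numpy as np

def product_limit(times, events):
    """Product-Limit (Kaplan-Meier) estimate of the reliability function.

    times  : observed times (failure or censoring)
    events : 1 if a failure was observed, 0 if the time is right-censored
    Returns the distinct failure times and S(t) just after each of them.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)             # items still under observation just before t
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk              # multiply conditional survival probabilities
        surv.append((t, s))
    return surv

# toy data: failure times with two right-censored observations
print(product_limit([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]))
```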

  15. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process. We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.
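
    A minimal sketch of the empirical copula on which such tests are built is given below; the crude exchange-symmetry check is purely illustrative and is not the calibrated test statistic of the record. The simulated data and grid are assumptions.

```python
import numpy as np

def empirical_copula(u, v, x, y):
    """Empirical copula C_n(u, v) from paired observations (x_i, y_i)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    # pseudo-observations: scaled ranks in (0, 1]
    rx = np.argsort(np.argsort(x)) + 1
    ry = np.argsort(np.argsort(y)) + 1
    return np.mean((rx / n <= u) & (ry / n <= v))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.6 * x + 0.8 * rng.normal(size=500)

# crude symmetry diagnostic: compare C_n(u, v) with C_n(v, u) on a grid
grid = np.linspace(0.1, 0.9, 9)
asym = max(abs(empirical_copula(a, b, x, y) - empirical_copula(b, a, x, y))
           for a in grid for b in grid)
print("max |C_n(u,v) - C_n(v,u)| on grid:", asym)
```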

  16. A new powerful non-parametric two-stage approach for testing multiple phenotypes in family-based association studies

    NARCIS (Netherlands)

    Lange, C; Lyon, H; DeMeo, D; Raby, B; Silverman, EK; Weiss, ST

    2003-01-01

    We introduce a new powerful nonparametric testing strategy for family-based association studies in which multiple quantitative traits are recorded and the phenotype with the strongest genetic component is not known prior to the analysis. In the first stage, using a population-based test based on the

  17. Residual Stress Analysis Based on Acoustic and Optical Methods

    Directory of Open Access Journals (Sweden)

    Sanichiro Yoshida

    2016-02-01

    Full Text Available Co-application of acoustoelasticity and optical interferometry to residual stress analysis is discussed. The underlying idea is to combine the advantages of both methods. Acoustoelasticity is capable of evaluating a residual stress absolutely, but it is a single-point measurement. Optical interferometry is able to measure deformation yielding two-dimensional, full-field data, but it is not suitable for absolute evaluation of residual stresses. By theoretically relating the deformation data to residual stresses, and calibrating them with the absolute residual stress evaluated at a reference point, it is possible to measure residual stresses quantitatively, nondestructively and two-dimensionally. The feasibility of the idea has been tested with a butt-jointed dissimilar plate specimen. A steel plate 18.5 mm wide, 50 mm long and 3.37 mm thick is braze-jointed to a cemented carbide plate of the same dimensions along the 18.5 mm side. Acoustoelasticity evaluates the elastic modulus at reference points via acoustic velocity measurement. A tensile load is applied to the specimen at a constant pulling rate in a stress range substantially lower than the yield stress. Optical interferometry measures the resulting acceleration field. Based on the theory of harmonic oscillation, the acceleration field is correlated qualitatively to compressive and tensile residual stresses. The acoustic and optical results show reasonable agreement in the compressive and tensile residual stresses, indicating the feasibility of the idea.

  18. The nonparametric bootstrap for the current status model

    NARCIS (Netherlands)

    Groeneboom, P.; Hendrickx, K.

    2017-01-01

    It has been proved that direct bootstrapping of the nonparametric maximum likelihood estimator (MLE) of the distribution function in the current status model leads to inconsistent confidence intervals. We show that bootstrapping of functionals of the MLE can however be used to produce valid

  19. Development of residual stress analysis procedure for fitness-for-service assessment of welded structure

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Jin, Tae Eun; Dong, P.; Prager, M.

    2003-01-01

    In this study, a state-of-the-art review of existing residual stress analysis techniques and representative solutions is presented in order to develop a residual stress analysis procedure for Fitness-For-Service (FFS) assessment of welded structures. Critical issues associated with existing residual stress solutions and their treatment in performing FFS are discussed. It should be recognized that detailed residual stress evolution is an extremely complicated phenomenon that typically involves material-specific thermomechanical/metallurgical response, welding process physics, and structural interactions within a component being welded. As a result, computational procedures can vary significantly, from highly complicated numerical techniques intended only to elucidate a small part of the process physics to cost-effective procedures that are deemed adequate for capturing some of the important features in a final residual stress distribution. A residual stress analysis procedure for FFS purposes belongs to the latter category. With this in mind, both residual stress analysis techniques and their adequacy for FFS are assessed based on both literature data and analyses performed in this investigation.

  20. Does the high–tech industry consistently reduce CO{sub 2} emissions? Results from nonparametric additive regression model

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Bin [School of Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Research Center of Applied Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Lin, Boqiang, E-mail: bqlin@xmu.edu.cn [Collaborative Innovation Center for Energy Economics and Energy Policy, China Institute for Studies in Energy Policy, Xiamen University, Xiamen, Fujian 361005 (China)

    2017-03-15

    China is currently the world's largest carbon dioxide (CO{sub 2}) emitter. Moreover, total energy consumption and CO{sub 2} emissions in China will continue to increase due to the rapid growth of industrialization and urbanization. Therefore, vigorously developing the high–tech industry becomes an inevitable choice to reduce CO{sub 2} emissions now or in the future. However, ignoring the existing nonlinear links between economic variables, most scholars use traditional linear models to explore the impact of the high–tech industry on CO{sub 2} emissions from an aggregate perspective. Few studies have focused on nonlinear relationships and regional differences in China. Based on panel data for 1998–2014, this study uses the nonparametric additive regression model to explore the nonlinear effect of the high–tech industry from a regional perspective. The estimated results show that the residual sums of squares (SSR) of the nonparametric additive regression model in the eastern, central and western regions are 0.693, 0.054 and 0.085 respectively, which are much smaller than those of the traditional linear regression model (3.158, 4.227 and 7.196). This verifies that the nonparametric additive regression model has a better fitting effect. Specifically, the high–tech industry produces an inverted “U–shaped” nonlinear impact on CO{sub 2} emissions in the eastern region, but a positive “U–shaped” nonlinear effect in the central and western regions. Therefore, the nonlinear impact of the high–tech industry on CO{sub 2} emissions in the three regions should be given adequate attention in developing effective abatement policies. - Highlights: • The nonlinear effect of the high–tech industry on CO{sub 2} emissions was investigated. • The high–tech industry yields an inverted “U–shaped” effect in the eastern region. • The high–tech industry has a positive “U–shaped” nonlinear effect in other regions. • The linear impact
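
    To illustrate the kind of residual-sum-of-squares comparison reported above, the sketch below contrasts a linear fit with a simple nonparametric (LOWESS) fit on synthetic data; it is not the authors' additive model, panel data, or regional specification, and all values are invented.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.1 * x**2 + rng.normal(scale=0.3, size=200)   # nonlinear truth

# linear fit
beta = np.polyfit(x, y, deg=1)
ssr_linear = np.sum((y - np.polyval(beta, x)) ** 2)

# nonparametric fit (LOWESS as a stand-in for an additive smoother)
fit = lowess(y, x, frac=0.2, return_sorted=False)
ssr_nonpar = np.sum((y - fit) ** 2)

print(f"SSR linear: {ssr_linear:.2f}, SSR nonparametric: {ssr_nonpar:.2f}")
```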

  1. Nonparametric Regression Estimation for Multivariate Null Recurrent Processes

    Directory of Open Access Journals (Sweden)

    Biqing Cai

    2015-04-01

    Full Text Available This paper discusses nonparametric kernel regression with the regressor being a d-dimensional β-null recurrent process in the presence of conditional heteroscedasticity. We show that the mean function estimator is consistent with convergence rate √(n(T)h^d), where n(T) is the number of regenerations for a β-null recurrent process, and the limiting distribution (with proper normalization) is normal. Furthermore, we show that the two-step estimator for the volatility function is consistent. The finite sample performance of the estimate is quite reasonable when the leave-one-out cross validation method is used for bandwidth selection. We apply the proposed method to study the relationship of the Federal funds rate with 3-month and 5-year T-bill rates and discover the existence of nonlinearity in the relationship. Furthermore, the in-sample and out-of-sample performance of the nonparametric model is far better than that of the linear model.
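
    A minimal Nadaraya-Watson kernel regression with leave-one-out cross-validation for the bandwidth, echoing the bandwidth selection mentioned above, is sketched below; it ignores the null-recurrence asymptotics that are the paper's actual contribution, and the data are simulated.

```python
import numpy as np

def nw_fit(x_train, y_train, x_eval, h):
    """Nadaraya-Watson estimate with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def loo_cv_bandwidth(x, y, grid):
    """Pick the bandwidth minimising leave-one-out squared prediction error."""
    best_h, best_err = None, np.inf
    for h in grid:
        err = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            pred = nw_fit(x[mask], y[mask], x[i:i + 1], h)[0]
            err += (y[i] - pred) ** 2
        if err < best_err:
            best_h, best_err = h, err
    return best_h

rng = np.random.default_rng(2)
x = rng.uniform(-2, 2, 150)
y = np.tanh(2 * x) + rng.normal(scale=0.2, size=150)
h = loo_cv_bandwidth(x, y, np.linspace(0.05, 1.0, 20))
print("selected bandwidth:", h)
```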

  2. A nonparametric empirical Bayes framework for large-scale multiple testing.

    Science.gov (United States)

    Martin, Ryan; Tokdar, Surya T

    2012-07-01

    We propose a flexible and identifiable version of the 2-groups model, motivated by hierarchical Bayes considerations, that features an empirical null and a semiparametric mixture model for the nonnull cases. We use a computationally efficient predictive recursion (PR) marginal likelihood procedure to estimate the model parameters, even the nonparametric mixing distribution. This leads to a nonparametric empirical Bayes testing procedure, which we call PRtest, based on thresholding the estimated local false discovery rates. Simulations and real data examples demonstrate that, compared to existing approaches, PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.
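
    A generic local false discovery rate sketch is given below: a theoretical null density, a kernel estimate of the marginal mixture density, and thresholding. It illustrates only the thresholding step of such procedures and is not the predictive recursion estimator of the record; the null proportion, cutoff, and simulated z-values are assumptions.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(3)
z = np.concatenate([rng.normal(0, 1, 9000),      # null cases
                    rng.normal(3, 1, 1000)])     # non-null cases

pi0 = 0.9                                        # assumed null proportion
f = gaussian_kde(z)                              # estimate of the marginal mixture density
lfdr = pi0 * norm.pdf(z) / f(z)                  # local false discovery rate
discoveries = np.sum(lfdr < 0.2)                 # commonly used cutoff; illustrative
print("cases flagged as non-null:", discoveries)
```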

  3. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric...

  4. Residual stress analysis of drive shafts after induction hardening

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Guilherme Vieira Braga; Rocha, Alexandre da Silva; Nunes, Rafael Menezes, E-mail: lemos_gl@yahoo.com.br [Universidade Federal do Rio Grande do Sul (UFRS), Porto Algre, RS (Brazil); Hirsch, Thomas Karl [Stiftung Institut für Werkstofftechnik (IWT), Bremen (Germany)

    2014-08-15

    Typically, for automotive shafts, shape distortion manifests itself in most cases after induction hardening through an effect known as bending. The distortion results in increased costs, especially due to machining parts in the hardened state to achieve the final tolerances. In the present study, residual stress measurements were carried out on automotive drive shafts made of DIN 38B3 steel. The samples were selected from an industrial manufacturing line on the basis of their different distortion properties. One tested shaft was straightened, because of its considerable dimensional variation, and the other was not. Firstly, the residual stress measurements were carried out using a portable diffractometer, in order to avoid cutting the shafts and to evaluate the original state of the stresses; afterwards, a more detailed analysis was performed with a conventional stationary diffractometer. The obtained results presented an overview of the surface residual stress profiles after induction hardening and displayed the influence of the straightening process on the redistribution of residual stresses. They also indicated that the effects of straightening on the residual stresses cannot be neglected. (author)

  5. Residual stresses analysis of friction stir welding using one-way FSI simulation

    International Nuclear Information System (INIS)

    Kang, Sung Wook; Jang, Beom Seon; Song, Ha Cheol

    2015-01-01

    When certain mechanisms, such as plastic deformations and temperature gradients, occur and are released in a structure, stresses remain because of the shape of the structure and external constraints. These stresses are referred to as residual stresses. The base material locally expands during heating in the welding process. When the welding is completed and the part has cooled to room temperature, residual stresses close to the yield strength level remain. In the case of friction stir welding, the maximum temperature is 80% to 90% of the melting point of the materials. Thus, the residual stresses in the welding process are smaller than those in other fusion welding processes, and they have often not been considered. However, friction stir welding residual stresses are sometimes measured at approximately 70% of the yield strength or above. These residual stresses significantly affect fatigue behavior and lifetime. The present study investigates the residual stress distributions for various welding conditions and joint shapes in friction stir welding. In addition, the asymmetric feature is considered in the temperature and residual stress distributions. Heat transfer analysis is conducted using the commercial computational fluid dynamics program Fluent, and the results are used in the finite element structural analysis with the ANSYS Multiphysics software. The calculated residual stresses are compared with experimental values obtained using the X-ray diffraction method.

  6. Bayesian Non-Parametric Mixtures of GARCH(1,1) Models

    Directory of Open Access Journals (Sweden)

    John W. Lau

    2012-01-01

    Full Text Available Traditional GARCH models describe volatility levels that evolve smoothly over time, generated by a single GARCH regime. However, nonstationary time series data may exhibit abrupt changes in volatility, suggesting changes in the underlying GARCH regimes. Further, the number and times of regime changes are not always obvious. This article outlines a nonparametric mixture of GARCH models that is able to estimate the number and time of volatility regime changes by mixing over the Poisson-Kingman process. The process is a generalisation of the Dirichlet process typically used in nonparametric models for time-dependent data; it provides a richer clustering structure, and its application to time series data is novel. Inference is Bayesian, and a Markov chain Monte Carlo algorithm to explore the posterior distribution is described. The methodology is illustrated on the Standard and Poor's 500 financial index.
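
    For orientation, the single-regime GARCH(1,1) volatility recursion that such mixtures switch between is sketched below; the parameter values and simulated returns are illustrative only.

```python
import numpy as np

def garch_volatility(returns, omega, alpha, beta):
    """Conditional variance recursion of a single GARCH(1,1) regime:
    sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = np.var(returns)                  # simple initialisation
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(4)
r = rng.normal(scale=0.01, size=1000)            # stand-in for index returns
sig2 = garch_volatility(r, omega=1e-6, alpha=0.08, beta=0.90)
print("annualised volatility estimate:", np.sqrt(252 * sig2[-1]))
```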

  7. Non-Parametric Kinetic (NPK) Analysis of Thermal Oxidation of Carbon Aerogels

    Directory of Open Access Journals (Sweden)

    Azadeh Seifi

    2017-05-01

    Full Text Available In recent years, much attention has been paid to aerogel materials (especially carbon aerogels) due to their potential uses in energy-related applications, such as thermal energy storage and thermal protection systems. These open cell carbon-based porous materials (carbon aerogels) can strongly react with oxygen at relatively low temperatures (~ 400°C). Therefore, it is necessary to evaluate the thermal performance of carbon aerogels in view of their energy-related applications at high temperatures and under thermal oxidation conditions. The objective of this paper is to study theoretically and experimentally the oxidation reaction kinetics of carbon aerogel using the non-parametric kinetic (NPK) method. For this purpose, a non-isothermal thermogravimetric analysis, at three different heating rates, was performed on three samples, each with its specific pore structure, density and specific surface area. The most significant feature of this method, in comparison with the model-free isoconversional methods, is its ability to separate the functionality of the reaction rate with the degree of conversion and temperature by the direct use of thermogravimetric data. Using this method, it was observed that the Nomen-Sempere model could provide the best fit to the data, while the temperature dependence of the rate constant was best explained by a Vogel-Fulcher relationship, where the reference temperature was the onset temperature of oxidation. Moreover, it was found from the results of this work that the assumption of the Arrhenius relation for the temperature dependence of the rate constant led to over-estimation of the apparent activation energy (up to 160 kJ/mol), which was considerably different from the values (up to 3.5 kJ/mol) predicted by the Vogel-Fulcher relationship in isoconversional methods.
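
    The two rate-constant temperature laws compared above can be written down directly; the sketch below evaluates both, with all parameter values chosen as placeholders for illustration rather than fitted values from the record.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def k_arrhenius(T, A, Ea):
    """Arrhenius rate constant: k = A * exp(-Ea / (R T))."""
    return A * np.exp(-Ea / (R * T))

def k_vogel_fulcher(T, A, B, T0):
    """Vogel-Fulcher rate constant: k = A * exp(-B / (T - T0)),
    with T0 taken here as a reference (onset) temperature."""
    return A * np.exp(-B / (T - T0))

T = np.linspace(700.0, 900.0, 5)                     # K, around the oxidation range
print(k_arrhenius(T, A=1e7, Ea=160e3))               # apparent Ea ~ 160 kJ/mol
print(k_vogel_fulcher(T, A=1e2, B=400.0, T0=673.0))  # onset ~ 400 °C = 673 K
```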

  8. USING A DEA MANAGEMENT TOOL THROUGH A NONPARAMETRIC APPROACH: AN EXAMINATION OF URBAN-RURAL EFFECTS ON THAI SCHOOL EFFICIENCY

    Directory of Open Access Journals (Sweden)

    SANGCHAN KANTABUTRA

    2009-04-01

    Full Text Available This paper examines urban-rural effects on public upper-secondary school efficiency in northern Thailand. In the study, efficiency was measured by a nonparametric technique, data envelopment analysis (DEA). Urban-rural effects were examined through a Mann-Whitney nonparametric statistical test. Results indicate that urban schools appear to have access to and practice different production technologies than rural schools, and rural institutions appear to operate less efficiently than their urban counterparts. In addition, a sensitivity analysis, conducted to ascertain the robustness of the analytical framework, revealed the stability of urban-rural effects on school efficiency. Policy to improve school efficiency should thus take geographical differences into account, viewing rural and urban schools as different from one another. Moreover, policymakers might consider shifting existing resources from urban schools to rural schools, provided that the increase in overall rural efficiency would be greater than the decrease, if any, in the city. Future research directions are discussed.
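
    The second step of the analysis, a Mann-Whitney test on two groups of efficiency scores, can be sketched as below; the scores are synthetic stand-ins generated for illustration, not the Thai school DEA results.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
# synthetic DEA efficiency scores in (0, 1]; urban schools drawn slightly higher
urban = np.clip(rng.normal(0.85, 0.10, 60), 0, 1)
rural = np.clip(rng.normal(0.75, 0.12, 80), 0, 1)

stat, p = mannwhitneyu(urban, rural, alternative="two-sided")
print(f"U = {stat:.1f}, p-value = {p:.4f}")
```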

  9. Modern nonparametric, robust and multivariate methods festschrift in honour of Hannu Oja

    CERN Document Server

    Taskinen, Sara

    2015-01-01

    Written by leading experts in the field, this edited volume brings together the latest findings in the area of nonparametric, robust and multivariate statistical methods. The individual contributions cover a wide variety of topics ranging from univariate nonparametric methods to robust methods for complex data structures. Some examples from statistical signal processing are also given. The volume is dedicated to Hannu Oja on the occasion of his 65th birthday and is intended for researchers as well as PhD students with a good knowledge of statistics.

  10. Residual symptoms and functioning in depression, does the type of residual symptom matter? A post-hoc analysis

    Directory of Open Access Journals (Sweden)

    Romera Irene

    2013-02-01

    Full Text Available Abstract Background The degree to which residual symptoms in major depressive disorder (MDD) adversely affect patient functioning is not known. This post-hoc analysis explored the association between different residual symptoms and patient functioning. Methods Patients with MDD who responded (≥50% on the 17-item Hamilton Rating Scale for Depression; HAMD-17) after 3 months of treatment (624/930) were included. Residual core mood symptoms (HAMD-17 core symptom subscale ≥1), residual insomnia symptoms (HAMD-17 sleep subscale ≥1), residual anxiety symptoms (HAMD-17 anxiety subscale ≥1), residual somatic symptoms (HAMD-17 Item 13 ≥1), pain (Visual Analogue Scale ≥30), and functioning were assessed after 3 months of treatment. A stepwise logistic regression model with normal functioning (Social and Occupational Functioning Assessment Scale ≥80) as the dependent variable was used. Results After 3 months, 59.5% of patients (371/624) achieved normal functioning and 66.0% (412/624) were in remission. Residual symptom prevalence was: core mood symptoms 72%; insomnia 63%; anxiety 78%; and somatic symptoms 41%. Pain was reported in 18%. Factors associated with normal functioning were absence of core mood symptoms (odds ratio [OR] 8.7; 95% confidence interval [CI], 4.6–16.7), absence of insomnia symptoms (OR 1.8; 95% CI, 1.2–2.7), episode length (4–24 weeks vs. ≥24 weeks; OR 2.0; 95% CI, 1.1–3.6) and better baseline functioning (OR 1.0; 95% CI, 1.0–1.1). A significant interaction between residual anxiety symptoms and pain was found (p = 0.0080). Conclusions Different residual symptoms are associated to different degrees with patient functioning. To achieve normal functioning, specific residual symptom domains might be targeted for treatment.
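
    The structure of the reported model, a logistic regression of a binary "normal functioning" outcome on residual-symptom indicators, can be sketched as below. The data are simulated and the coefficients are not the study's estimates; variable names are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 600
core = rng.integers(0, 2, n)          # residual core mood symptoms (0/1)
insomnia = rng.integers(0, 2, n)      # residual insomnia symptoms (0/1)
logit_p = 1.5 - 2.0 * core - 0.6 * insomnia
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # normal functioning (0/1)

X = sm.add_constant(np.column_stack([core, insomnia]))
fit = sm.Logit(y, X).fit(disp=0)
print(np.exp(fit.params[1:]))         # odds ratios for the two symptom domains
```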

  11. Probit vs. semi-nonparametric estimation: examining the role of disability on institutional entry for older adults.

    Science.gov (United States)

    Sharma, Andy

    2017-06-01

    The purpose of this study was to showcase an advanced methodological approach to model disability and institutional entry. Both of these are important areas to investigate given the on-going aging of the United States population. By 2020, approximately 15% of the population will be 65 years and older. Many of these older adults will experience disability and require formal care. A probit analysis was employed to determine which disabilities were associated with admission into an institution (i.e. long-term care). Since this framework imposes strong distributional assumptions, misspecification leads to inconsistent estimators. To overcome such a shortcoming, this analysis extended the probit framework by employing an advanced semi-nonparametric maximum likelihood estimation utilizing Hermite polynomial expansions. Specification tests show semi-nonparametric estimation is preferred over probit. In terms of the estimates, semi-nonparametric ratios equal 42 for cognitive difficulty, 64 for independent living, and 111 for self-care disability, while probit yields much smaller estimates of 19, 30, and 44, respectively. Public health professionals can use these results to better understand why certain interventions have not shown promise. Equally important, healthcare workers can use this research to evaluate which types of treatment plans may delay institutionalization and improve the quality of life for older adults. Implications for rehabilitation With on-going global aging, understanding the association between disability and institutional entry is important in devising successful rehabilitation interventions. Semi-nonparametric estimation is preferred to probit and shows ambulatory and cognitive impairments present high risk for institutional entry (long-term care). Informal caregiving and home-based care require further examination as forms of rehabilitation/therapy for certain types of disabilities.

  12. Residual stress analysis in carbon fiber-reinforced SiC ceramics

    International Nuclear Information System (INIS)

    Broda, M.

    1998-01-01

    Systematic residual stress analyses are reported, carried out in long-fiber reinforced SiC ceramics. The laminated C-fiber/SiC-matrix specimens used were prepared by polymer pyrolysis, and the structural component specimens used are industrial products. Various diffraction methods have been applied for non-destructive evaluation of residual stress fields, so as to completely detect the residual stresses and their distribution in the specimens. The residual stress fields at the surface (μm scale) have been measured using characteristic X-radiation and applying the sin²ψ method as well as the scatter vector method. For residual stress field analysis in the bulk volume (cm scale), neutron diffraction has been applied. The stress fields in the fiber layers (approx. 250 μm) have been measured as a function of their location within the laminated composite by using an energy-dispersive method and synchrotron radiation. By means of the systematic, process-accompanying residual stress and phase analyses, conclusions can be drawn as to possible approaches for optimization of fabrication parameters. (orig./CB)

  13. Finite Element Residual Stress Analysis of Planetary Gear Tooth

    Directory of Open Access Journals (Sweden)

    Jungang Wang

    2013-01-01

    Full Text Available A method to simulate the residual stress field of a planetary gear is proposed. In this method, the finite element model of the planetary gear is established and divided into a tooth zone and a profile zone, which are assigned different temperature fields. The residual stress in the gear is simulated through the thermal compressive stress generated by the temperature difference. Based on this simulation, the finite element model of the planetary gear train is established, the dynamic meshing process is simulated, and the influence of residual stress on the equivalent stress at the addendum, pitch circle, and dedendum of the internal and external meshing gear tooth profiles is analyzed, according to nonlinear contact theory, thermodynamic theory, and finite element theory. The results show that the equivalent stresses of the planetary gear at both meshing and non-meshing surfaces are significantly, and differently, reduced by residual stress. The study benefits fatigue cracking analysis and dynamic optimization design of the planetary gear train.

  14. Adaptive nonparametric Bayesian inference using location-scale mixture priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2010-01-01

    We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if

  15. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    International Nuclear Information System (INIS)

    Barat, Eric; Dautremer, Thomas

    2007-01-01

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution--normalized emission intensity of the spatial Poisson process--is considered as a spatial probability density, and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions) and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals like the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.

  16. International Conference on Robust Rank-Based and Nonparametric Methods

    CERN Document Server

    McKean, Joseph

    2016-01-01

    The contributors to this volume include many of the distinguished researchers in this area. Many of these scholars have collaborated with Joseph McKean to develop underlying theory for these methods, obtain small sample corrections, and develop efficient algorithms for their computation. The papers cover the scope of the area, including robust nonparametric rank-based procedures through Bayesian and big data rank-based analyses. Areas of application include biostatistics and spatial areas. Over the last 30 years, robust rank-based and nonparametric methods have developed considerably. These procedures generalize traditional Wilcoxon-type methods for one- and two-sample location problems. Research into these procedures has culminated in complete analyses for many of the models used in practice including linear, generalized linear, mixed, and nonlinear models. Settings are both multivariate and univariate. With the development of R packages in these areas, computation of these procedures is easily shared with r...

  17. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  18. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  19. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

    We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and the patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  20. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    Science.gov (United States)

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and
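
    The core computation the program documents, a slope taken as the median of all pairwise slopes and an intercept forcing the line through the medians of the data, is sketched below for the single-line model only; the sample data are invented and the function name is illustrative.

```python
import numpy as np
from itertools import combinations

def kendall_theil_line(x, y):
    """Kendall-Theil (Theil-Sen) robust line: median of pairwise slopes,
    intercept chosen so the line passes through (median x, median y)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)
              if x[j] != x[i]]
    slope = np.median(slopes)
    intercept = np.median(y) - slope * np.median(x)
    return slope, intercept

x = [1, 2, 3, 4, 5, 6, 7]
y = [2.1, 4.3, 5.9, 8.2, 9.8, 30.0, 14.1]   # one outlier at x = 6
print(kendall_theil_line(x, y))              # slope stays close to 2 despite the outlier
```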

  1. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular for inferring the clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick-breaking (TSSB) prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…
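
    For context, a plain Chinese restaurant process draw, the construction that the treeCRP extends, is sketched below; the concentration parameter and number of customers are illustrative.

```python
import numpy as np

def sample_crp(n, alpha, rng):
    """Draw table (cluster) assignments for n customers from a CRP(alpha).

    Customer t joins an existing table k with probability n_k / (t + alpha)
    and opens a new table with probability alpha / (t + alpha)."""
    counts = []                                  # customers per table
    labels = np.empty(n, dtype=int)
    for t in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= t + alpha
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)                     # new table (new cluster)
        else:
            counts[k] += 1
        labels[t] = k
    return labels, counts

rng = np.random.default_rng(6)
labels, counts = sample_crp(100, alpha=1.5, rng=rng)
print("number of clusters:", len(counts), "sizes:", counts)
```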

  2. Seismic Signal Compression Using Nonparametric Bayesian Dictionary Learning via Clustering

    Directory of Open Access Journals (Sweden)

    Xin Tian

    2017-06-01

    Full Text Available We introduce a seismic signal compression method based on a nonparametric Bayesian dictionary learning method via clustering. The seismic data is compressed patch by patch, and the dictionary is learned online. Clustering is introduced for dictionary learning: a set of dictionaries is generated, and each dictionary is used for one cluster's sparse coding. In this way, the signals in one cluster can be well represented by their corresponding dictionary. A nonparametric Bayesian dictionary learning method is used to learn the dictionaries, which naturally infers an appropriate dictionary size for each cluster. A uniform quantizer and an adaptive arithmetic coding algorithm are adopted to code the sparse coefficients. With comparisons to other state-of-the-art approaches, the effectiveness of the proposed method is validated in the experiments.

  3. Radionuclides in Bayer process residues: previous analysis for radiological protection

    International Nuclear Information System (INIS)

    Cuccia, Valeria; Rocha, Zildete; Oliveira, Arno H. de

    2011-01-01

    Naturally occurring radionuclides are present in many natural resources. Human activities may enhance concentrations of radionuclides and/or enhance the potential of exposure to naturally occurring radioactive material (NORM). Industrial residues containing radionuclides have been receiving considerable global attention because of the large amounts of NORM-containing wastes and the potential long term risks of long-lived radionuclides. Within this global concern, this work focuses on the characterization of radioactivity in the main residues of the Bayer process for alumina production: red mud and sand samples. Usually, the residues of the Bayer process are collectively named red mud. However, in the industry where the samples were collected, the residues are further separated into sand and red mud. The analytical techniques used were gamma spectrometry (HPGe detector) and neutron activation analysis. The concentrations of radionuclides are higher in the red mud than in the sand. These solid residues present enhanced activity concentrations when compared to bauxite. Further uses of the residues as building material must be evaluated more thoroughly from the radiological point of view, due to their potential for enhanced radiological exposure, especially caused by radon emission. (author)

  4. Analysis of residual swirl in tangentially-fired natural gas-boiler

    International Nuclear Information System (INIS)

    Hasril Hasini; Muhammad Azlan Muad; Mohd Zamri Yusoff; Norshah Hafeez Shuaib

    2010-01-01

    This paper describes the investigation of residual swirl flow in a 120 MW full-scale, tangentially-fired natural gas boiler. Emphasis is given to understanding the behavior of the combustion gas flow pattern and temperature distribution resulting from the tangential firing system of the boiler. The analysis was carried out using three-dimensional computational modeling of the full-scale boiler, with validation against key design parameters as well as practical observations. Actual operating parameters of the boiler are taken as the boundary conditions for the modeling. The predicted total heat flux was found to be in agreement with the key design parameter, while the residual swirl predicted in the upper furnace agrees qualitatively with practical observation. Based on this comparison, a detailed analysis was carried out for a comprehensive understanding of the generation and destruction of residual swirl in boilers, especially those with high capacity. (author)

  5. Nonparametric modeling of dynamic functional connectivity in fmri data

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Madsen, Kristoffer H.; Røge, Rasmus

    2015-01-01

    dynamic changes. The existing approaches modeling dynamic connectivity have primarily been based on time-windowing the data and k-means clustering. We propose a nonparametric generative model for dynamic FC in fMRI that does not rely on specifying window lengths and number of dynamic states. Rooted...

  6. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  7. Analysis of Residual Stress and Deformation of Rolling Strengthen Crankshaft Fillet

    Directory of Open Access Journals (Sweden)

    Han Shaojun

    2016-01-01

    Full Text Available Based on an analysis of the crankshaft fillet rolling process, the ANSYS finite element software was used to conduct an elastic-plastic mechanical simulation of the rolling process, yielding the variation of residual stress and plastic deformation along the radial path of the fillet under different numbers of rolling laps and rolling pressures. The relationship between rolling pressure and the plastic deformation and residual stress of the fillet was established, providing theoretical support for the evaluation and inspection of crankshaft rolling quality.

  8. Mapping residual stresses in PbWO4 crystals using photo-elastic analysis

    International Nuclear Information System (INIS)

    Lebeau, M.; Gobbi, L.; Majni, G.; Paone, N.; Pietroni, P.; Rinaldi, D.

    2005-01-01

    Large scintillating crystals are affected by internal stresses induced by the crystal growth temperature gradient remanence. Cutting boules (ingots) into finished crystal shapes allows for a partial tension relaxation but residual stresses remain the main cause of breaking. Quality control of residual stresses is essential in the application of Scintillating Crystals to high-energy physics calorimeters (e.g. CMS ECAL at CERN LHC). In this context the industrial process optimisation towards stress reduction is mandatory. We propose a fast technique for testing samples during the production process in order to evaluate the residual stress distribution after the first phases of mechanical processing. We mapped the stress distribution in PbWO4 slabs cut from the same production boule. The analysis technique is based on the stress intensity determination using the photo-elastic properties of the samples. The stress distribution is mapped in each sample. The analysis shows that there are regions of high residual tension close to the seed position and at the boule periphery. These results should allow for adapting the industrial process to produce crystals with lower residual stresses.

  9. Joining U.S. NRC international round robin for weld residual stress analysis. Stress analysis and validation in PWSCC mitigation program

    International Nuclear Information System (INIS)

    Maekawa, Akira; Serizawa, Hisashi; Murakawa, Hidekazu

    2012-01-01

    It is necessary to establish properly reliable weld residual stress analysis methods for accurate assessment of crack initiation and growth due to primary water stress corrosion cracking (PWSCC), which may occur in nickel-based dissimilar metal welds in pressurized water reactors. The U.S. Nuclear Regulatory Commission conducted an international round robin for weld residual stress analysis to improve stress analysis methods and to examine the uncertainties involved in the calculated stress values. In this paper, the results from the authors' participation in the round robin are reported. In the round robin, the weld residual stress in a nickel-based dissimilar metal weld of a pressurizer surge nozzle mock-up was computed under various analysis conditions. Based on these residual stress analysis results, a welding simulation code currently being developed that uses the iterative substructure method was validated, and the factors affecting the analysis results were identified. (author)

  10. Uncertainty in decision models analyzing cost-effectiveness : The joint distribution of incremental costs and effectiveness evaluated with a nonparametric bootstrap method

    NARCIS (Netherlands)

    Hunink, Maria; Bult, J.R.; De Vries, J; Weinstein, MC

    1998-01-01

    Purpose. To illustrate the use of a nonparametric bootstrap method in the evaluation of uncertainty in decision models analyzing cost-effectiveness. Methods. The authors reevaluated a previously published cost-effectiveness analysis that used a Markov model comparing initial percutaneous
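
    The record above is truncated, but the resampling idea it describes is straightforward. The sketch below (not the authors' Markov-model analysis) bootstraps patient-level cost and effectiveness pairs to approximate the joint distribution of incremental costs and incremental effectiveness; the function name, arm labels and replicate count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_incremental(cost_new, eff_new, cost_old, eff_old, n_boot=5000):
    """Nonparametric bootstrap of the joint distribution of incremental
    costs and incremental effectiveness (illustrative sketch)."""
    cost_new, eff_new = np.asarray(cost_new), np.asarray(eff_new)
    cost_old, eff_old = np.asarray(cost_old), np.asarray(eff_old)
    pairs = []
    for _ in range(n_boot):
        i = rng.integers(0, len(cost_new), len(cost_new))  # resample treatment arm
        j = rng.integers(0, len(cost_old), len(cost_old))  # resample comparator arm
        d_cost = cost_new[i].mean() - cost_old[j].mean()   # incremental cost
        d_eff = eff_new[i].mean() - eff_old[j].mean()      # incremental effectiveness
        pairs.append((d_cost, d_eff))
    return np.array(pairs)  # one (delta cost, delta effect) point per replicate
```

    Plotting the returned pairs on the cost-effectiveness plane gives the uncertainty cloud around the incremental cost-effectiveness estimate.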

  11. Nonparametric method for failures diagnosis in the actuating subsystem of aircraft control system

    Science.gov (United States)

    Terentev, M. N.; Karpenko, S. S.; Zybin, E. Yu; Kosyanchuk, V. V.

    2018-02-01

    In this paper we design a nonparametric method for failure diagnosis in the aircraft control system that uses measurements of the control signals and the aircraft states only. It does not require a priori information about the aircraft model parameters, training, or statistical calculations, and is based on an analytical nonparametric one-step-ahead state prediction approach. This makes it possible to predict the behavior of unidentified and failed dynamic systems, to relax the requirements on control signals, and to reduce the diagnostic time and problem complexity.

  12. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  13. Calculation method for residual stress analysis of filament-wound spherical pressure vessels

    International Nuclear Information System (INIS)

    Knight, C.E. Jr.

    1976-01-01

    Filament wound spherical pressure vessels may be produced with very high performance factors. These performance factors are a calculation of contained pressure times enclosed volume divided by structure weight. A number of parameters are important in determining the level of performance achieved. One of these is the residual stress state in the fabricated unit. A significant level of an unfavorable residual stress state could seriously impair the performance of the vessel. Residual stresses are of more concern for vessels with relatively thick walls and/or vessels constructed with the highly anisotropic graphite or aramid fibers. A method is established for measuring these stresses. A theoretical model of the composite structure is required. Data collection procedures and techniques are developed. The data are reduced by means of the model and result in the residual stress analysis. The analysis method can be used in process parameter studies to establish the best fabrication procedures

  14. An adjoint method of sensitivity analysis for residual vibrations of structures subject to impacts

    Science.gov (United States)

    Yan, Kun; Cheng, Gengdong

    2018-03-01

    For structures subject to impact loads, reducing residual vibration becomes more and more important as machines become faster and lighter. An efficient sensitivity analysis of the residual vibration with respect to structural or operational parameters is indispensable for using a gradient-based optimization algorithm, which reduces the residual vibration in either an active or a passive way. In this paper, an integrated quadratic performance index is used as the measure of the residual vibration, since it globally measures the residual vibration response and its calculation can be simplified greatly with the Lyapunov equation. Several sensitivity analysis approaches for the performance index were developed based on the assumption that the initial excitations of the residual vibration were given and independent of the structural design. Since the excitations resulting from the impact load often depend on the structural design, this paper proposes a new efficient sensitivity analysis method for the residual vibration of structures subject to impacts that accounts for this dependence. The new method is developed by combining two existing methods and using an adjoint variable approach. Three numerical examples are carried out and demonstrate the accuracy of the proposed method. The numerical results show that the dependence of the initial excitations on the structural design variables may strongly affect the accuracy of the sensitivities.

  15. Finite element analysis of residual stress in plasma-sprayed ceramic

    International Nuclear Information System (INIS)

    Mullen, R.L.; Hendricks, R.C.; McDonald, G.

    1985-01-01

    Residual stress in a ZrO2-Y2O3 ceramic coating resulting from the plasma spraying operation is calculated. The calculations were done using the finite element method. Both thermal and mechanical analyses were performed. The resulting residual stress field was compared to the measurements obtained by Hendricks and McDonald. Reasonable agreement between the predicted and measured moments was found; however, the resulting stress field is not in pure bending.

  16. The application of non-parametric statistical method for an ALARA implementation

    International Nuclear Information System (INIS)

    Cho, Young Ho; Herr, Young Hoi

    2003-01-01

    The cost-effective reduction of Occupational Radiation Dose (ORD) at a nuclear power plant cannot be achieved without an extensive analysis of the accumulated ORD data of existing plants. Through this data analysis, it is necessary to identify which jobs at the nuclear power plant repeatedly incur high ORD. In this study, the Percentile Rank Sum Method (PRSM), based on nonparametric statistical theory, is proposed to identify repetitive high-ORD jobs. As a case study, the method is applied to ORD data of maintenance and repair jobs at Kori units 3 and 4, which are pressurized water reactors with 950 MWe capacity that have been operated since 1986 and 1987, respectively, in Korea. The results were verified and validated, and PRSM has been demonstrated to be an efficient method for analyzing the data.

  17. Nonparametric estimation of benchmark doses in environmental risk assessment

    Science.gov (United States)

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    Summary An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
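
    As a minimal sketch of the isotonic-regression idea the abstract refers to (not the authors' estimator or its bootstrap confidence limits), one can fit a monotone dose-response curve to quantal data and read off the dose reaching a pre-specified benchmark response; the dose levels, response counts and the 10% extra-risk benchmark below are made-up assumptions.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical quantal-response data: dose levels and observed response proportions
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
resp = np.array([2, 3, 6, 10, 18]) / 20.0           # responders out of 20 animals per dose

iso = IsotonicRegression(increasing=True)
risk = iso.fit_transform(dose, resp)                # monotone (isotonic) dose-response fit

p0 = risk[0]                                        # background risk at dose zero
bmr = 0.10                                          # benchmark response (extra risk)
target = p0 + bmr * (1.0 - p0)                      # extra-risk definition of the target risk

# Benchmark dose: smallest dose whose fitted risk reaches the target (linear interpolation)
bmd = np.interp(target, risk, dose)
print(f"BMD estimate: {bmd:.3f}")
```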

  18. Spurious Seasonality Detection: A Non-Parametric Test Proposal

    Directory of Open Access Journals (Sweden)

    Aurelio F. Bariviera

    2018-01-01

    Full Text Available This paper offers a general and comprehensive definition of the day-of-the-week effect. Using symbolic dynamics, we develop a unique test based on ordinal patterns in order to detect it. This test uncovers the fact that the so-called “day-of-the-week” effect is partly an artifact of the hidden correlation structure of the data. We present simulations based on artificial time series as well. While time series generated with long memory are prone to exhibit daily seasonality, pure white noise signals exhibit no pattern preference. Since ours is a non-parametric test, it requires no assumptions about the distribution of returns, so that it could be a practical alternative to conventional econometric tests. We also made an exhaustive application of the here-proposed technique to 83 stock indexes around the world. Finally, the paper highlights the relevance of symbolic analysis in economic time series studies.

  19. Nonparametric NAR-ARCH Modelling of Stock Prices by the Kernel Methodology

    Directory of Open Access Journals (Sweden)

    Mohamed Chikhi

    2018-02-01

    Full Text Available This paper analyses the cyclical behaviour of the Orange stock price listed on the French stock exchange from 01/03/2000 to 02/02/2017 by testing for nonlinearities through a class of conditional heteroscedastic nonparametric models. The linearity and Gaussianity assumptions are rejected for Orange stock returns, and informational shocks have transitory effects on returns and volatility. The forecasting results show that Orange stock prices are predictable in the short term and that the nonparametric NAR-ARCH model performs better than the parametric MA-APARCH model for short horizons. Moreover, the estimates of this model are also better than the predictions of the random walk model. This finding provides evidence for a weak form of inefficiency in the Paris stock market with limited rationality, which gives rise to arbitrage opportunities.

  20. On the robust nonparametric regression estimation for a functional regressor

    OpenAIRE

    Azzedine , Nadjia; Laksaci , Ali; Ould-Saïd , Elias

    2009-01-01

    Corresponding author: Elias Ould-Saïd. Author affiliation (Azzedine, Nadjia): Département de Mathématiques, Univ. Djillali Liabès, BP 89, 22000 Sidi Bel Abbès, Algeria.

  1. Potential ligand-binding residues in rat olfactory receptors identified by correlated mutation analysis

    Science.gov (United States)

    Singer, M. S.; Oliveira, L.; Vriend, G.; Shepherd, G. M.

    1995-01-01

    A family of G-protein-coupled receptors is believed to mediate the recognition of odor molecules. In order to identify potential ligand-binding residues, we have applied correlated mutation analysis to receptor sequences from the rat. This method identifies pairs of sequence positions where residues remain conserved or mutate in tandem, thereby suggesting structural or functional importance. The analysis supported molecular modeling studies in suggesting several residues in positions that were consistent with ligand-binding function. Two of these positions, dominated by histidine residues, may play important roles in ligand binding and could confer broad specificity to mammalian odor receptors. The presence of positive (overdominant) selection at some of the identified positions provides additional evidence for roles in ligand binding. Higher-order groups of correlated residues were also observed. Each group may interact with an individual ligand determinant, and combinations of these groups may provide a multi-dimensional mechanism for receptor diversity.

  2. Comparison of Parametric and Nonparametric Methods for Analyzing the Bias of a Numerical Model

    Directory of Open Access Journals (Sweden)

    Isaac Mugume

    2016-01-01

    Full Text Available Numerical models are presently applied in many fields for simulation and prediction, operation, or research. The output from these models normally has both systematic and random errors. The study compared January 2015 temperature data for Uganda, as simulated using the Weather Research and Forecasting model, with observed station temperature data to analyze the bias using parametric methods (the root mean square error (RMSE), the mean absolute error (MAE), the mean error (ME), skewness, and the bias easy estimate (BES)) and a nonparametric method (the sign test, STM). The RMSE normally overestimates the error compared to the MAE. The RMSE and MAE are not sensitive to the direction of bias. The ME gives both the direction and magnitude of bias but can be distorted by extreme values, while the BES is insensitive to extreme values. The STM is robust for giving the direction of bias; it is not sensitive to extreme values, but it does not give the magnitude of bias. Graphical tools (such as time series and cumulative curves) show the performance of the model over time. It is recommended to integrate parametric and nonparametric methods, along with graphical methods, for a comprehensive analysis of the bias of a numerical model.
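
    For concreteness, the parametric error measures and the nonparametric sign test contrasted above can be computed along the following lines; the temperature arrays are invented placeholders, not the study's WRF or station data.

```python
import numpy as np
from scipy import stats

sim = np.array([22.1, 23.4, 21.0, 24.8, 22.9])    # simulated temperatures (assumed values)
obs = np.array([21.5, 23.9, 20.2, 24.1, 23.5])    # observed station temperatures (assumed)

err = sim - obs
me = err.mean()                                    # mean error: direction and magnitude of bias
mae = np.abs(err).mean()                           # mean absolute error
rmse = np.sqrt((err ** 2).mean())                  # RMSE: never below MAE, penalises large errors

# Nonparametric sign test: does the model over-predict more often than chance would allow?
n_pos = int((err > 0).sum())
n_nonzero = int((err != 0).sum())
p_value = stats.binomtest(n_pos, n_nonzero, 0.5).pvalue  # direction of bias only, ignores magnitude
```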

  3. Multi-Residue Analysis of Pesticide Residues in Crude Pollens by UPLC-MS/MS

    Directory of Open Access Journals (Sweden)

    Zhou Tong

    2016-12-01

    Full Text Available A multi-residue method for the determination of 54 pesticide residues in pollens has been developed and validated. The proposed method was applied to the analysis of 48 crude pollen samples collected from eight provinces of China. The recovery of analytes ranged from 60% to 136% with relative standard deviations (RSDs) below 30%. Of the 54 targeted compounds, 19 pesticides were detected. The detection rates of the most frequently found compounds in the crude pollen samples were 77.1% for carbendazim, 58.3% for fenpropathrin, 56.3% for chlorpyrifos, 50.0% for fluvalinate, 31.3% for chlorbenzuron, and 29.2% for triadimefon. The maximum values measured were 4516 ng/g for carbendazim, 162.8 ng/g for fenpropathrin, 176.6 ng/g for chlorpyrifos, 316.2 ng/g for fluvalinate, 437.2 ng/g for chlorbenzuron, and 79.00 ng/g for triadimefon, among others. This study provides a basis for research on the risks to honeybee health.

  4. Bayesian Nonparametric Clustering for Positive Definite Matrices.

    Science.gov (United States)

    Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2016-05-01

    Symmetric Positive Definite (SPD) matrices emerge as data descriptors in several applications of computer vision such as object tracking, texture recognition, and diffusion tensor imaging. Clustering these data matrices forms an integral part of these applications, for which soft-clustering algorithms (K-Means, expectation maximization, etc.) are generally used. As is well known, these algorithms need the number of clusters to be specified, which is difficult when the dataset scales. To address this issue, we resort to the classical nonparametric Bayesian framework by modeling the data as a mixture model using the Dirichlet process (DP) prior. Since these matrices do not conform to Euclidean geometry but rather belong to a curved Riemannian manifold, existing DP models cannot be directly applied. Thus, in this paper, we propose a novel DP mixture model framework for SPD matrices. Using the log-determinant divergence as the underlying dissimilarity measure to compare these matrices, and further using the connection between this measure and the Wishart distribution, we derive a novel DPM model based on the Wishart-Inverse-Wishart conjugate pair. We apply this model to several applications in computer vision. Our experiments demonstrate that our model is scalable to the dataset size and at the same time achieves superior accuracy compared to several state-of-the-art parametric and nonparametric clustering algorithms.
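
    The log-determinant divergence used as the dissimilarity measure above has a simple closed form for SPD matrices, sketched below; the example matrices are arbitrary and the function is not taken from the authors' DP mixture implementation.

```python
import numpy as np

def logdet_divergence(X, Y):
    """Log-determinant (Burg) divergence between SPD matrices:
    D(X, Y) = tr(X Y^-1) - log det(X Y^-1) - n, which is zero iff X == Y."""
    n = X.shape[0]
    M = X @ np.linalg.inv(Y)
    _, logdet = np.linalg.slogdet(M)     # det(M) > 0 when X and Y are SPD
    return np.trace(M) - logdet - n

A = np.array([[2.0, 0.3], [0.3, 1.0]])   # toy SPD matrices
B = np.eye(2)
print(logdet_divergence(A, B), logdet_divergence(A, A))
```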

  5. Residual stress analysis for engineering applications by means of neutron diffraction

    International Nuclear Information System (INIS)

    Gnaeupel-Herold, T.; Brand, P.C.; Prask, H.J.

    1999-01-01

    Residual stresses originate from spatial differences in plastic deformation, temperature, or phase distribution, introduced by manufacturing processes or during service. Engineering parts and materials experience mechanical, thermal, and chemical loads during their service, and most of these loads introduce stresses that are superimposed on the already existing residual stresses. Residual stresses can therefore limit or improve life and strength of engineering parts; knowledge and understanding of these stresses is therefore critical for optimizing strength and durability. The economic and scientific importance of neutron diffraction residual stress analysis has led to an increasing number of suitable instruments worldwide. All of the major sources due in the next several years will have instruments for the sole purpose of performing residual stress and texture measurements. Recently, a dedicated, state-of-the-art diffractometer has been installed at the National Institute of Standards and Technology reactor. It has been used for a variety of measurements on basic and engineering stress problems. Among the most prominent examples that have been investigated in collaboration with industrial and academic partners are residual stresses in rails, weldments, and plasma-sprayed coatings

  6. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods: A Comparison with Clinical Assessment

    Science.gov (United States)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration

  7. Analysis of fenbendazole residues in bovine milk by ELISA.

    Science.gov (United States)

    Brandon, David L; Bates, Anne H; Binder, Ronald G; Montague, William C; Whitehand, Linda C; Barker, Steven A

    2002-10-09

    Fenbendazole residues in bovine milk were analyzed by ELISAs using two monoclonal antibodies. One monoclonal antibody (MAb 587) bound the major benzimidazole anthelmintic drugs, including fenbendazole, oxfendazole, and fenbendazole sulfone. The other (MAb 591) was more specific for fenbendazole, with 13% cross-reactivity with the sulfone and no significant binding to the sulfoxide metabolite. The limit of detection of the ELISA method in the milk matrix was 7 ppb for MAb 587 and 3 ppb for MAb 591. Fenbendazole was administered in feed, drench, and paste form to three groups of dairy cattle. Milk was collected immediately before dosing and then every 12 h for 5 days. The ELISA indicated that residue levels varied widely among individual cows in each group. Fenbendazole levels peaked at approximately 12-24 h and declined rapidly thereafter. Metabolites were detected at much higher levels than the parent compound, peaked at approximately 24-36 h, and declined gradually. Residue levels were undetectable by 72 h. The ELISA data correlated well with the total residues determined by chromatographic analysis, but the use of the two separate ELISAs did not afford an advantage over ELISA with the single, broadly reactive MAb 587. The ELISA method could be used to flag high-residue samples in on-site monitoring of fenbendazole in milk and is a potential tool for studying drug pharmacokinetics.

  8. Analysis of pesticide residues Or a needle in a barn

    International Nuclear Information System (INIS)

    Heinsen, H.; Cesio, V.

    2012-01-01

    This work concerns the analysis of pesticide residues, including studies of soil, air, water, and organisms. The solvents used depend on the matrix, the types of pesticides, the analysis, and the equipment. Chromatography coupled with mass spectrometry is one of the most widely used techniques.

  9. Nonparametric combinatorial sequence models.

    Science.gov (United States)

    Wauthier, Fabian L; Jordan, Michael I; Jojic, Nebojsa

    2011-11-01

    This work considers biological sequences that exhibit combinatorial structures in their composition: groups of positions of the aligned sequences are "linked" and covary as one unit across sequences. If multiple such groups exist, complex interactions can emerge between them. Sequences of this kind arise frequently in biology but methodologies for analyzing them are still being developed. This article presents a nonparametric prior on sequences which allows combinatorial structures to emerge and which induces a posterior distribution over factorized sequence representations. We carry out experiments on three biological sequence families which indicate that combinatorial structures are indeed present and that combinatorial sequence models can more succinctly describe them than simpler mixture models. We conclude with an application to MHC binding prediction which highlights the utility of the posterior distribution over sequence representations induced by the prior. By integrating out the posterior, our method compares favorably to leading binding predictors.

  10. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    Science.gov (United States)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

    This current study is aimed to investigate the use of parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method. Ledre as Bojonegoro unique local food product was used as point of interest, in which 319 panelists were involved in the study. The result showed that ledre is characterized as easy-crushed texture, sticky in mouth, stingy sensation and easy to swallow. It has also strong banana flavour with brown in colour. Compared to eggroll and semprong, ledre has more variances in terms of taste as well the roll length. As RATA questionnaire is designed to collect categorical data, non-parametric approach is the common statistical procedure. However, similar results were also obtained as parametric approach, regardless the fact of non-normal distributed data. Thus, it suggests that parametric approach can be applicable for consumer study with large number of respondents, even though it may not satisfy the assumption of ANOVA (Analysis of Variances).

  11. Effects of dating errors on nonparametric trend analyses of speleothem time series

    Directory of Open Access Journals (Sweden)

    M. Mudelsee

    2012-10-01

    Full Text Available A fundamental problem in paleoclimatology is to take fully into account the various error sources when examining proxy records with quantitative methods of statistical time series analysis. Records from dated climate archives such as speleothems add extra uncertainty from the age determination to the other sources that consist in measurement and proxy errors. This paper examines three stalagmite time series of oxygen isotopic composition (δ18O from two caves in western Germany, the series AH-1 from the Atta Cave and the series Bu1 and Bu4 from the Bunker Cave. These records carry regional information about past changes in winter precipitation and temperature. U/Th and radiocarbon dating reveals that they cover the later part of the Holocene, the past 8.6 thousand years (ka. We analyse centennial- to millennial-scale climate trends by means of nonparametric Gasser–Müller kernel regression. Error bands around fitted trend curves are determined by combining (1 block bootstrap resampling to preserve noise properties (shape, autocorrelation of the δ18O residuals and (2 timescale simulations (models StalAge and iscam. The timescale error influences on centennial- to millennial-scale trend estimation are not excessively large. We find a "mid-Holocene climate double-swing", from warm to cold to warm winter conditions (6.5 ka to 6.0 ka to 5.1 ka, with warm–cold amplitudes of around 0.5‰ δ18O; this finding is documented by all three records with high confidence. We also quantify the Medieval Warm Period (MWP, the Little Ice Age (LIA and the current warmth. Our analyses cannot unequivocally support the conclusion that current regional winter climate is warmer than that during the MWP.

  12. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the preceding generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. Determining a predictor-specific degree of smoothing increased the accuracy.

  13. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non--parametric estimation of these processes...

  14. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example the Ornstein-Uhlenbeck process (the simplest mean--reverting process) has been used to model non-speculative price processes. We discuss non--parametric estimation of these processes...

  15. Hierarchical Bayesian nonparametric mixture models for clustering with variable relevance determination.

    Science.gov (United States)

    Yau, Christopher; Holmes, Chris

    2011-07-01

    We propose a hierarchical Bayesian nonparametric mixture model for clustering when some of the covariates are assumed to be of varying relevance to the clustering problem. This can be thought of as an issue in variable selection for unsupervised learning. We demonstrate that by defining a hierarchical population based nonparametric prior on the cluster locations scaled by the inverse covariance matrices of the likelihood we arrive at a 'sparsity prior' representation which admits a conditionally conjugate prior. This allows us to perform full Gibbs sampling to obtain posterior distributions over parameters of interest including an explicit measure of each covariate's relevance and a distribution over the number of potential clusters present in the data. This also allows for individual cluster specific variable selection. We demonstrate improved inference on a number of canonical problems.

  16. Effects of weld residual stresses on crack-opening area analysis of pipes for LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Dong, P.; Rahman, S.; Wilkowski, G. [and others]

    1997-04-01

    This paper summarizes four different studies undertaken to evaluate the effects of weld residual stresses on the crack-opening behavior of a circumferential through-wall crack in the center of a girth weld. The effect of weld residual stress on the crack-opening-area and leak-rate analyses of a pipe is not well understood. There are no simple analyses to account for these effects, and, therefore, they are frequently neglected. The four studies involved the following efforts: (1) Full-field thermoplastic finite element residual stress analyses of a crack in the center of a girth weld, (2) A comparison of the crack-opening displacements from a full-field thermoplastic residual stress analysis with a crack-face pressure elastic stress analysis to determine the residual stress effects on the crack-opening displacement, (3) The effects of hydrostatic testing on the residual stresses and the resulting crack-opening displacement, and (4) The effect of residual stresses on crack-opening displacement with different normal operating stresses.

  17. Effects of weld residual stresses on crack-opening area analysis of pipes for LBB applications

    International Nuclear Information System (INIS)

    Dong, P.; Rahman, S.; Wilkowski, G.

    1997-01-01

    This paper summarizes four different studies undertaken to evaluate the effects of weld residual stresses on the crack-opening behavior of a circumferential through-wall crack in the center of a girth weld. The effect of weld residual stress on the crack-opening-area and leak-rate analyses of a pipe is not well understood. There are no simple analyses to account for these effects, and, therefore, they are frequently neglected. The four studies involved the following efforts: (1) Full-field thermoplastic finite element residual stress analyses of a crack in the center of a girth weld, (2) A comparison of the crack-opening displacements from a full-field thermoplastic residual stress analysis with a crack-face pressure elastic stress analysis to determine the residual stress effects on the crack-opening displacement, (3) The effects of hydrostatic testing on the residual stresses and the resulting crack-opening displacement, and (4) The effect of residual stresses on crack-opening displacement with different normal operating stresses

  18. A Nonparametric Test for Seasonal Unit Roots

    OpenAIRE

    Kunst, Robert M.

    2009-01-01

    Abstract: We consider a nonparametric test for the null of seasonal unit roots in quarterly time series that builds on the RUR (records unit root) test by Aparicio, Escribano, and Sipols. We find that the test concept is more promising than a formalization of visual aids such as plots by quarter. In order to cope with the sensitivity of the original RUR test to autocorrelation under its null of a unit root, we suggest an augmentation step by autoregression. We present some evidence on the siz...

  19. Developing an immigration policy for Germany on the basis of a nonparametric labor market classification

    OpenAIRE

    Froelich, Markus; Puhani, Patrick

    2004-01-01

    Based on a nonparametrically estimated model of labor market classifications, this paper makes suggestions for immigration policy using data from western Germany in the 1990s. It is demonstrated that nonparametric regression is feasible in higher dimensions with only a few thousand observations. In sum, labor markets able to absorb immigrants are characterized by above average age and by professional occupations. On the other hand, labor markets for young workers in service occupations are id...

  20. Experimental analysis of residual stresses in pre-straightened SAE 1045 steel

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, Carla Adriana Theis Soares; Rocha, Alexandre da Silva, E-mail: carla.adriana@ufrgs.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Laboratorio de Formacao de Metais; Epp, Jérémy; Zoch, Hans-Werner [Stiftung Institut für Werkstofftechnik IWT, University of Bremen (Germany)

    2017-11-15

    This paper analyzes the effects of roller pre-straightening of wire-rods on residual stress distributions in SAE 1045 steel bars. The combined drawing process is used in the industrial production of bars in order to obtain good surface quality and improved mechanical properties complying with the specifications of the final products. In this process, prior to the drawing step, roller straightening of the steel wire-rod is essential, because it provides the minimum straightness necessary for drawing. Metallographic analysis and hardness testing were done for selected samples after different processing steps. Residual stress analysis of the pre-straightened wire-rods by X-ray diffraction and neutron diffraction was also carried out. The hardness tests show higher values near the surface and lower values in the center of the wire-rod. In addition, the residual stress results show a large inhomogeneity from one peripheral position to another and also across the evaluated cross section. (author)

  1. Experimental analysis of residual stresses in pre-straightened SAE 1045 steel

    International Nuclear Information System (INIS)

    Diehl, Carla Adriana Theis Soares; Rocha, Alexandre da Silva

    2017-01-01

    This paper analyzes the effects of roller pre-straightening of wire-rods on residual stress distributions in SAE 1045 steel bars. The combined drawing process is used in the industrial production of bars in order to obtain good surface quality and improved mechanical properties complying with the specifications of the final products. In this process, prior to the drawing step, roller straightening of the steel wire-rod is essential, because it provides the minimum straightness necessary for drawing. Metallographic analysis and hardness testing were done for selected samples after different processing steps. Residual stress analysis of the pre-straightened wire-rods by X-ray diffraction and neutron diffraction was also carried out. The hardness tests show higher values near the surface and lower values in the center of the wire-rod. In addition, the residual stress results show a large inhomogeneity from one peripheral position to another and also across the evaluated cross section. (author)

  2. A comparative study of non-parametric models for identification of ...

    African Journals Online (AJOL)

    However, the frequency response method using random binary signals was good for unpredicted white noise characteristics and considered the best method for non-parametric system identification. The autoregressive external input (ARX) model was very useful for system identification, but on application, few input ...

  3. A semi-nonparametric mixture model for selecting functionally consistent proteins.

    Science.gov (United States)

    Yu, Lianbo; Doerge, Rw

    2010-09-28

    High-throughput technologies have led to a new era of proteomics. Although protein microarray experiments are becoming more commonplace, there are a variety of experimental and statistical issues that have yet to be addressed, and that will carry over to new high-throughput technologies unless they are investigated. One of the largest of these challenges is the selection of functionally consistent proteins. We present a novel semi-nonparametric mixture model for classifying proteins as consistent or inconsistent while controlling the false discovery rate and the false non-discovery rate. The performance of the proposed approach is compared to current methods via simulation under a variety of experimental conditions. We provide a statistical method for selecting functionally consistent proteins in the context of protein microarray experiments, but the proposed semi-nonparametric mixture model method can certainly be generalized to solve other mixture data problems. The main advantage of this approach is that it provides the posterior probability of consistency for each protein.

  4. 1st Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Lahiri, S; Politis, Dimitris

    2014-01-01

    This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for NonParametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M.G. Akritas, S.N. Lahiri, and D.N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R.Beran, P.Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. SubbaRao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world.   The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the wo...

  5. On Parametric (and Non-Parametric Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles–and–Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  6. Viscoelastic finite element analysis of residual stresses in porcelain-veneered zirconia dental crowns.

    Science.gov (United States)

    Kim, Jeongho; Dhital, Sukirti; Zhivago, Paul; Kaizer, Marina R; Zhang, Yu

    2018-06-01

    The main problem of porcelain-veneered zirconia (PVZ) dental restorations is chipping and delamination of veneering porcelain owing to the development of deleterious residual stresses during the cooling phase of veneer firing. The aim of this study is to elucidate the effects of cooling rate, thermal contraction coefficient and elastic modulus on residual stresses developed in PVZ dental crowns using viscoelastic finite element methods (VFEM). A three-dimensional VFEM model has been developed to predict residual stresses in PVZ structures using ABAQUS finite element software and user subroutines. First, the newly established model was validated with experimentally measured residual stress profiles using Vickers indentation on flat PVZ specimens. An excellent agreement between the model prediction and experimental data was found. Then, the model was used to predict residual stresses in more complex anatomically-correct crown systems. Two PVZ crown systems with different thermal contraction coefficients and porcelain moduli were studied: VM9/Y-TZP and LAVA/Y-TZP. A sequential dual-step finite element analysis was performed: heat transfer analysis and viscoelastic stress analysis. Controlled and bench convection cooling rates were simulated by applying different convective heat transfer coefficients, 1.7E-5 W/mm²·°C (controlled cooling) and 0.6E-4 W/mm²·°C (bench cooling), on the crown surfaces exposed to the air. Rigorous viscoelastic finite element analysis revealed that controlled cooling results in lower maximum stresses in both veneer and core layers for the two PVZ systems relative to bench cooling. Better compatibility of thermal contraction coefficients between porcelain and zirconia and a lower porcelain modulus reduce residual stresses in both layers. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Non-parametric estimation of the individual's utility map

    OpenAIRE

    Noguchi, Takao; Sanborn, Adam N.; Stewart, Neil

    2013-01-01

    Models of risky choice have attracted much attention in behavioural economics. Previous research has repeatedly demonstrated that individuals' choices are not well explained by expected utility theory, and a number of alternative models have been examined using carefully selected sets of choice alternatives. The model performance however, can depend on which choice alternatives are being tested. Here we develop a non-parametric method for estimating the utility map over the wide range of choi...

  8. Nonparametric Bayesian models through probit stick-breaking processes.

    Science.gov (United States)

    Rodríguez, Abel; Dunson, David B

    2011-03-01

    We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology.
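
    A minimal sketch of the probit stick-breaking construction described above: each stick is broken by the probit (normal CDF) transform of a Gaussian variate, and the leftover stick length carries over to the next weight. The mean, variance and truncation level are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def probit_stick_breaking_weights(n_atoms=25, mu=0.0, sigma=1.0):
    """w_k = Phi(alpha_k) * prod_{j<k} (1 - Phi(alpha_j)), with alpha_k ~ N(mu, sigma^2)."""
    alpha = rng.normal(mu, sigma, size=n_atoms)
    v = norm.cdf(alpha)                                      # probit transform: breaks in (0, 1)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return w                                                 # truncated weights; remainder is tail mass

w = probit_stick_breaking_weights()
print(w[:5], w.sum())
```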

  9. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
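
    The record is truncated, but two of the conventional bandwidth estimators such a comparison typically includes, the Silverman rule of thumb and likelihood cross-validation, can be sketched as follows; the sample and the search grid are assumptions, and this is not the study's benchmark code.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(2)
x = rng.normal(size=(200, 1))                     # toy one-dimensional sample

# Normal-reference (Silverman) rule of thumb for a Gaussian kernel in one dimension
n, sigma = len(x), x.std(ddof=1)
h_silverman = 1.06 * sigma * n ** (-1 / 5)

# Cross-validated bandwidth: maximise held-out log-likelihood over a grid
search = GridSearchCV(KernelDensity(kernel="gaussian"),
                      {"bandwidth": np.logspace(-1.5, 0.5, 30)}, cv=5)
search.fit(x)
h_cv = search.best_params_["bandwidth"]
print(h_silverman, h_cv)
```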

  10. A general approach to posterior contraction in nonparametric inverse problems

    NARCIS (Netherlands)

    Knapik, Bartek; Salomond, Jean Bernard

    In this paper, we propose a general method to derive an upper bound for the contraction rate of the posterior distribution for nonparametric inverse problems. We present a general theorem that allows us to derive contraction rates for the parameter of interest from contraction rates of the related

  11. Residual Stress Analysis for Engineering Applications by Means of Neutron Diffraction

    International Nuclear Information System (INIS)

    Gnaeupel-Herold, Thomas; Brand, Paul C.; Prask, Henry J.

    1999-01-01

    The economic and scientific importance of neutron diffraction residual stress analysis has led to an increasing number of suitable instruments worldwide. Recently, a dedicated state-of-the-art diffractometer has been installed at the National Institute of Standards and Technology reactor. It has been used for a variety of measurements on basic and engineering stress problems. Among the most prominent examples that have been investigated are residual stresses in rails, weldments, and plasma-sprayed coatings

  12. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....

  13. A BAYESIAN NONPARAMETRIC MIXTURE MODEL FOR SELECTING GENES AND GENE SUBNETWORKS.

    Science.gov (United States)

    Zhao, Yize; Kang, Jian; Yu, Tianwei

    2014-06-01

    It is very challenging to select informative features from tens of thousands of measured features in high-throughput data analysis. Recently, several parametric/regression models have been developed utilizing the gene network information to select genes or pathways strongly associated with a clinical/biological outcome. Alternatively, in this paper, we propose a nonparametric Bayesian model for gene selection incorporating network information. In addition to identifying genes that have a strong association with a clinical outcome, our model can select genes with particular expressional behavior, in which case the regression models are not directly applicable. We show that our proposed model is equivalent to an infinite mixture model, for which we develop a posterior computation algorithm based on Markov chain Monte Carlo (MCMC) methods. We also propose two fast computing algorithms that approximate the posterior simulation with good accuracy but relatively low computational cost. We illustrate our methods on simulation studies and the analysis of Spellman yeast cell cycle microarray data.

  14. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method.

    Science.gov (United States)

    Zhang, Tingting; Kou, S C

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure.
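
    As a rough illustration of kernel-based inference for event-time (e.g., photon arrival) data, the intensity of the process can be estimated by smoothing the observed event times with a Gaussian kernel. This sketch ignores edge correction and bandwidth selection, both of which the paper treats carefully; the data and bandwidth are assumptions.

```python
import numpy as np

def kernel_intensity(event_times, grid, bandwidth):
    """Kernel estimate of the intensity lambda(t): a Gaussian bump centred at each event."""
    t = np.asarray(event_times)[:, None]             # shape (n_events, 1)
    u = (np.asarray(grid)[None, :] - t) / bandwidth
    k = np.exp(-0.5 * u ** 2) / (np.sqrt(2.0 * np.pi) * bandwidth)
    return k.sum(axis=0)                             # events per unit time along the grid

events = np.sort(np.random.default_rng(3).uniform(0.0, 10.0, size=50))  # toy arrival times
grid = np.linspace(0.0, 10.0, 200)
lam = kernel_intensity(events, grid, bandwidth=0.5)
```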

  15. A Bayesian Beta-Mixture Model for Nonparametric IRT (BBM-IRT)

    Science.gov (United States)

    Arenson, Ethan A.; Karabatsos, George

    2017-01-01

    Item response models typically assume that the item characteristic (step) curves follow a logistic or normal cumulative distribution function, which are strictly monotone functions of person test ability. Such assumptions can be overly-restrictive for real item response data. We propose a simple and more flexible Bayesian nonparametric IRT model…

  16. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt

    2013-01-01

    in the observations are corrected. These are errors such as: tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...

  17. Application of nonparametric statistics to material strength/reliability assessment

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-01-01

    Advanced materials technology requires a database on a wide variety of material behaviors, which needs to be established experimentally. It may often happen that experiments are practically limited in terms of reproducibility or the range of test parameters. Statistical methods can be applied to quantify such uncertainties in the manner required from the reliability point of view. Statistical assessment involves determination of a most probable value and of the maximum and/or minimum value as a one-sided or two-sided confidence limit. A scatter of test data can be approximated by a theoretical distribution only if the goodness of fit satisfies a test criterion. Alternatively, nonparametric statistics (NPS), or distribution-free statistics, can be applied. Mathematical procedures based on NPS are well established for dealing with most reliability problems; they handle only the order statistics of a sample. Mathematical formulas and some applications to engineering assessments are described. They include confidence limits of the median, population coverage of a sample, the required minimum size of a sample, and confidence limits of fracture probability. These applications demonstrate that nonparametric statistical estimation is useful in logical decision making when large uncertainty exists. (author)
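
    One of the order-statistics results mentioned above, a distribution-free confidence interval for the median, can be sketched with the binomial distribution. The record does not give its exact formulas, so the function below is a standard illustrative construction rather than the author's procedure.

```python
import numpy as np
from scipy.stats import binom

def median_ci_indices(n, conf=0.95):
    """Distribution-free CI for the median: the interval between the k-th smallest and the
    k-th largest observations has coverage P(k <= Bin(n, 0.5) <= n - k), whatever the
    underlying distribution. Returns the tightest 1-based order-statistic indices."""
    for k in range(n // 2, 0, -1):
        coverage = binom.cdf(n - k, n, 0.5) - binom.cdf(k - 1, n, 0.5)
        if coverage >= conf:
            return k, n - k + 1, coverage
    return 1, n, 1.0 - 0.5 ** (n - 1)    # even the full range may not reach conf for tiny n

x = np.sort(np.random.default_rng(4).normal(size=30))
k_lo, k_hi, cov = median_ci_indices(len(x))
ci = (x[k_lo - 1], x[k_hi - 1])           # e.g. the 10th and 21st order statistics for n = 30
```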

  18. Exact nonparametric inference for detection of nonlinear determinism

    OpenAIRE

    Luo, Xiaodong; Zhang, Jie; Small, Michael; Moroz, Irene

    2005-01-01

    We propose an exact nonparametric inference scheme for the detection of nonlinear determinism. The essential fact utilized in our scheme is that, for a linear stochastic process with jointly symmetric innovations, its ordinary least square (OLS) linear prediction error is symmetric about zero. Based on this viewpoint, a class of linear signed rank statistics, e.g. the Wilcoxon signed rank statistic, can be derived with the known null distributions from the prediction error. Thus one of the ad...

  19. Texture, residual stress and structural analysis of thin films using a combined X-ray analysis

    International Nuclear Information System (INIS)

    Lutterotti, L.; Chateigner, D.; Ferrari, S.; Ricote, J.

    2004-01-01

    Advanced thin films for today's industrial and research needs require highly specialized methodologies for successful quantitative characterization. In particular, in the case of multilayer and/or unknown phases, a global approach is necessary to obtain some or all of the required information. A full approach has been developed integrating novel texture and residual stress methodologies with the Rietveld method (Acta Cryst. 22 (1967) 151) (for crystal structure analysis), coupled with reflectivity analysis. The complete analysis can be done at once and offers several benefits: the thicknesses obtained from reflectivity can be used to correct the diffraction spectra; the phase analysis helps to identify the layers and to determine the electron density profile for reflectivity; quantitative texture is needed for quantitative phase and residual stress analyses; and crystal structure determination benefits from all of the previous. To achieve this result, it was necessary to develop some new methods, especially for texture and residual stresses, so that they could be integrated into the Rietveld full-profile fitting of the patterns. The measurement of these spectra required a special reflectometer/diffractometer that combines a thin parallel beam (for reflectivity) with a texture/stress goniometer and a curved large position-sensitive detector. This new diffraction/reflectivity X-ray machine has been used to test the combined approach. Several spectra and the reflectivity patterns were collected at different tilting angles and processed at once by special software incorporating the aforementioned methodologies. Some analysis examples are given to show the possibilities offered by the method.

  20. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis

    Directory of Open Access Journals (Sweden)

    Yushen Du

    2016-11-01

    Full Text Available Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available.

  1. Promotion time cure rate model with nonparametric form of covariate effects.

    Science.gov (United States)

    Chen, Tianlei; Du, Pang

    2018-05-10

    Survival data with a cured portion are commonly seen in clinical trials. Motivated by a biological interpretation of cancer metastasis, the promotion time cure model is a popular alternative to the mixture cure rate model for analyzing such data. Existing promotion time cure models all assume a restrictive parametric form of covariate effects, which can be incorrectly specified, especially at the exploratory stage. In this paper, we propose a nonparametric approach to modeling the covariate effects under the framework of the promotion time cure model. The covariate effect function is estimated by smoothing splines via the optimization of a penalized profile likelihood. Point-wise interval estimates are also derived from the Bayesian interpretation of the penalized profile likelihood. Asymptotic convergence rates are established for the proposed estimates. Simulations show excellent performance of the proposed nonparametric method, which is then applied to a melanoma study. Copyright © 2018 John Wiley & Sons, Ltd.
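
    As a numerical illustration of the model structure described above, the population survival function of a promotion time cure model is S_pop(t | x) = exp(-theta(x) F(t)), with cure fraction exp(-theta(x)). The sketch below evaluates this for a made-up smooth covariate effect and an assumed exponential promotion-time distribution; it is not the paper's penalized profile likelihood estimator.

```python
import numpy as np

def g(x):                                   # hypothetical smooth covariate effect
    return 0.5 + np.sin(x)

def pop_survival(t, x, baseline_cdf):
    """Promotion time cure model: S_pop(t | x) = exp(-theta(x) * F(t))."""
    theta = np.exp(g(x))                    # mean number of latent promotion events
    return np.exp(-theta * baseline_cdf(t))

baseline_cdf = lambda t: 1.0 - np.exp(-0.2 * t)   # assumed Exp(0.2) promotion times
t = np.linspace(0.0, 30.0, 4)
for x in (-1.0, 0.0, 1.0):
    cure_fraction = np.exp(-np.exp(g(x)))         # limit of S_pop as t -> infinity
    print(x, np.round(pop_survival(t, x, baseline_cdf), 3), round(cure_fraction, 3))
```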

  2. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as the 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation and interpretation of order statistics in safety analysis are not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of the pros and cons of order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
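
    The run count quoted above can be connected to the classical order-statistics (Wilks-type) bound. The sketch below is a generic illustration of that bound, not the licensed ASTRUM implementation; reading 124 runs as the sample size at which the third-largest output bounds the 95th percentile with 95% confidence is our own interpretation for the example.

```python
from scipy.stats import binom

def min_runs(k=1, quantile=0.95, confidence=0.95):
    """Smallest N with P(k-th largest observation >= quantile) >= confidence."""
    n = k
    while True:
        # The k-th largest exceeds the quantile iff at least k of N samples do,
        # and each sample exceeds it with probability 1 - quantile.
        if 1.0 - binom.cdf(k - 1, n, 1.0 - quantile) >= confidence:
            return n
        n += 1

print(min_runs(k=1))  # 59  -- single output, using the maximum value
print(min_runs(k=3))  # 124 -- consistent with the run count quoted above
```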

  3. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    Science.gov (United States)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
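
    A minimal residual-bootstrap sketch conveys the idea, here with a simple Nadaraya-Watson smoother standing in for the paper's regression models; the kernel, bandwidth, and data are assumptions made for the illustration.

```python
import numpy as np

def nw_fit(x_train, y_train, x_eval, bandwidth=0.3):
    """Nadaraya-Watson kernel regression estimate at x_eval."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 6, 200))
y = np.sin(x) + 0.3 * rng.standard_normal(200)

fit = nw_fit(x, y, x)
residuals = y - fit
boot_preds = []
for _ in range(500):
    # Refit on resampled data and add a redrawn residual to form prediction draws.
    y_star = fit + rng.choice(residuals, size=len(x), replace=True)
    boot_preds.append(nw_fit(x, y_star, x) + rng.choice(residuals, size=len(x), replace=True))
lo, hi = np.percentile(boot_preds, [2.5, 97.5], axis=0)
# An observation falling outside [lo, hi] at its input would be flagged as anomalous.
```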

  4. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2017-06-01

    We propose a Bayesian nonparametric mixture model for the reconstruction and prediction from observed time series data, of discretized stochastic dynamical systems, based on Markov Chain Monte Carlo methods. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of an arbitrary degree and when a Geometric Stick Breaking mixture process prior over the space of densities, is applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, flexible and general. Simulations based on synthetic time series are presented.
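
    The parsimony argument is easy to see in code: Geometric Stick Breaking uses deterministic weights governed by a single parameter, whereas Dirichlet-process stick breaking draws random sticks from a Beta distribution. The toy sketch below only contrasts the two weight constructions; parameter values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def gsb_weights(lam, n_atoms):
    """Geometric stick-breaking weights: w_j = lam * (1 - lam)**(j - 1)."""
    j = np.arange(1, n_atoms + 1)
    return lam * (1.0 - lam) ** (j - 1)

def dp_weights(alpha, n_atoms):
    """Truncated Dirichlet-process (Sethuraman) stick-breaking weights."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    return v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))

print(gsb_weights(0.3, 5))   # deterministic, single parameter lam
print(dp_weights(2.0, 5))    # random sticks governed by concentration alpha
```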

  5. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems.

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J

    2017-06-01

    We propose a Bayesian nonparametric mixture model for the reconstruction and prediction from observed time series data, of discretized stochastic dynamical systems, based on Markov Chain Monte Carlo methods. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of an arbitrary degree and when a Geometric Stick Breaking mixture process prior over the space of densities, is applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, flexible and general. Simulations based on synthetic time series are presented.

  6. Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks

    Science.gov (United States)

    Gray-Davies, Tristan; Holmes, Chris C.; Caron, François

    2018-01-01

    We present a novel Bayesian nonparametric regression model for covariates X and continuous response variable Y ∈ ℝ. The model is parametrized in terms of marginal distributions for Y and X and a regression function which tunes the stochastic ordering of the conditional distributions F (y|x). By adopting an approximate composite likelihood approach, we show that the resulting posterior inference can be decoupled for the separate components of the model. This procedure can scale to very large datasets and allows for the use of standard, existing, software from Bayesian nonparametric density estimation and Plackett-Luce ranking estimation to be applied. As an illustration, we show an application of our approach to a US Census dataset, with over 1,300,000 data points and more than 100 covariates. PMID:29623150

  7. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    Science.gov (United States)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and if rejected suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.

  8. Residual stress distribution analysis of heat treated APS TBC using image based modelling.

    Science.gov (United States)

    Li, Chun; Zhang, Xun; Chen, Ying; Carr, James; Jacques, Simon; Behnsen, Julia; di Michiel, Marco; Xiao, Ping; Cernik, Robert

    2017-08-01

    We carried out a residual stress distribution analysis in an APS TBC throughout the depth of the coatings. The samples were heat treated at 1150 °C for 190 h and the data analysis used image based modelling based on the real 3D images measured by Computed Tomography (CT). The stress distribution in several 2D slices from the 3D model is included in this paper as well as the stress distribution along several paths shown on the slices. Our analysis can explain the occurrence of the "jump" features near the interface between the top coat and the bond coat. These features in the residual stress distribution trend were measured (as a function of depth) by high-energy synchrotron XRD (as shown in our related research article entitled 'Understanding the Residual Stress Distribution through the Thickness of Atmosphere Plasma Sprayed (APS) Thermal Barrier Coatings (TBCs) by high energy Synchrotron XRD; Digital Image Correlation (DIC) and Image Based Modelling') (Li et al., 2017) [1].

  9. Nonparametric statistics a step-by-step approach

    CERN Document Server

    Corder, Gregory W

    2014-01-01

    "…a very useful resource for courses in nonparametric statistics in which the emphasis is on applications rather than on theory.  It also deserves a place in libraries of all institutions where introductory statistics courses are taught."" -CHOICE This Second Edition presents a practical and understandable approach that enhances and expands the statistical toolset for readers. This book includes: New coverage of the sign test and the Kolmogorov-Smirnov two-sample test in an effort to offer a logical and natural progression to statistical powerSPSS® (Version 21) software and updated screen ca

  10. A structural nonparametric reappraisal of the CO2 emissions-income relationship

    NARCIS (Netherlands)

    Azomahou, T.T.; Goedhuys - Degelin, Micheline; Nguyen-Van, P.

    Relying on a structural nonparametric estimation, we show that CO2 emissions clearly increase with income at low income levels. For higher income levels, we observe a decreasing relationship, though not significant. We also find that CO2 emissions monotonically increase with energy use at a

  11. Nonparametric estimation of the stationary M/G/1 workload distribution function

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted

    2005-01-01

    In this paper it is demonstrated how a nonparametric estimator of the stationary workload distribution function of the M/G/1-queue can be obtained by systematically sampling the workload process. Weak convergence results and bootstrap methods for empirical distribution functions for stationary associ...

  12. Transformation-invariant and nonparametric monotone smooth estimation of ROC curves.

    Science.gov (United States)

    Du, Pang; Tang, Liansheng

    2009-01-30

    When a new diagnostic test is developed, it is of interest to evaluate its accuracy in distinguishing diseased subjects from non-diseased subjects. The accuracy of the test is often evaluated by receiver operating characteristic (ROC) curves. Smooth ROC estimates are often preferable for continuous test results when the underlying ROC curves are in fact continuous. Nonparametric and parametric methods have been proposed by various authors to obtain smooth ROC curve estimates. However, there are certain drawbacks with the existing methods. Parametric methods need specific model assumptions. Nonparametric methods do not always satisfy the inherent properties of the ROC curves, such as monotonicity and transformation invariance. In this paper we propose a monotone spline approach to obtain smooth monotone ROC curves. Our method ensures important inherent properties of the underlying ROC curves, which include monotonicity, transformation invariance, and boundary constraints. We compare the finite sample performance of the newly proposed ROC method with other ROC smoothing methods in large-scale simulation studies. We illustrate our method through a real life example. Copyright (c) 2008 John Wiley & Sons, Ltd.
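
    A simplified stand-in for the idea: compute an empirical ROC on a grid, smooth it with a kernel, and then enforce monotonicity and the [0, 1] boundary constraints by isotonic regression. The paper's monotone splines build these properties in directly; the data and bandwidth below are assumptions for illustration.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 1.0, 300)      # non-diseased test results
diseased = rng.normal(1.2, 1.0, 200)     # diseased test results

fpr_grid = np.linspace(0.0, 1.0, 101)
thresholds = np.quantile(healthy, 1.0 - fpr_grid)          # FPR = P(healthy > c)
empirical_tpr = np.array([(diseased > c).mean() for c in thresholds])

# Kernel-smooth the empirical ROC, then project onto monotone curves in [0, 1]
# (the projection is a no-op wherever the smooth is already monotone).
bw = 0.05
weights = np.exp(-0.5 * ((fpr_grid[:, None] - fpr_grid[None, :]) / bw) ** 2)
smooth_tpr = (weights @ empirical_tpr) / weights.sum(axis=1)
iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
monotone_roc = iso.fit_transform(fpr_grid, smooth_tpr)
```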

  13. Impulse response identification with deterministic inputs using non-parametric methods

    International Nuclear Information System (INIS)

    Bhargava, U.K.; Kashyap, R.L.; Goodman, D.M.

    1985-01-01

    This paper addresses the problem of impulse response identification using non-parametric methods. Although the techniques developed herein apply to the truncated, untruncated, and the circulant models, we focus on the truncated model which is useful in certain applications. Two methods of impulse response identification will be presented. The first is based on the minimization of the C_L statistic, which is an estimate of the mean-square prediction error; the second is a Bayesian approach. For both of these methods, we consider the effects of using both the identity matrix and the Laplacian matrix as weights on the energy in the impulse response. In addition, we present a method for estimating the effective length of the impulse response. Estimating the length is particularly important in the truncated case. Finally, we develop a method for estimating the noise variance at the output. Often, prior information on the noise variance is not available, and a good estimate is crucial to the success of estimating the impulse response with a nonparametric technique.
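
    The role of the identity versus Laplacian weighting can be sketched with a penalized least-squares estimate of a truncated impulse response; this illustrates the general idea only, not the paper's C_L minimization or its Bayesian variant, and the system and noise level are assumptions.

```python
import numpy as np

def estimate_impulse_response(u, y, length, lam=1e-1, penalty="laplacian"):
    """Penalized least-squares estimate of a truncated impulse response."""
    n = len(y)
    U = np.zeros((n, length))            # convolution (Toeplitz-like) design matrix
    for k in range(length):
        U[k:, k] = u[: n - k]
    if penalty == "identity":
        R = np.eye(length)               # penalises the energy of h directly
    else:
        R = np.diff(np.eye(length), n=2, axis=0)   # discrete Laplacian: penalises roughness
    return np.linalg.solve(U.T @ U + lam * R.T @ R, U.T @ y)

rng = np.random.default_rng(4)
h_true = np.exp(-0.3 * np.arange(20))
u = rng.standard_normal(400)
y = np.convolve(u, h_true)[:400] + 0.05 * rng.standard_normal(400)
h_hat = estimate_impulse_response(u, y, length=20)
```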

  14. Analysis of lifetime and residual strength of wood

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1997-01-01

    The present paper is thought of as a working paper for the CTBA seminar of the Thematic Network in the field of reliability-based design of timber structures, Topic: Numerical methods for structural analysis. It is preliminary and quite informal in its structure. The intention is to present some wood technological problems which can be solved with respect to lifetime and residual strength by the so-called DVM theory (Damaged Viscoelastic Material). The outline of the paper is straightforward: expressions are presented by which the analysis is made, then some examples are considered with solutions presented...

  15. Analysis of residual stress in subsurface layers after precision hard machining of forging tools

    Directory of Open Access Journals (Sweden)

    Czan Andrej

    2018-01-01

    Full Text Available This paper focuses on the analysis of residual stress in functional surfaces and subsurface layers created by precision hard-machining technologies for progressive constructional materials of forging tools. The experiments are oriented toward monitoring the residual stress in surfaces created by hard turning (roughing and finishing operations). Subsequently, these surfaces were etched in thin layers by electrochemical polishing, and the residual stress was monitored in each etched layer. The measurements were performed with a portable X-ray diffractometer for the detection of residual stress and structural phases. The results clearly show the rise and distribution of residual stress in the surface and subsurface layers and their impact on the functional properties of surface integrity.

  16. Reverse Conservation Analysis Reveals the Specificity Determining Residues of Cytochrome P450 Family 2 (CYP 2)

    Directory of Open Access Journals (Sweden)

    Tai-Sung Lee

    2008-01-01

    Full Text Available The concept of conservation of amino acids is widely used to identify important alignment positions of orthologs. The assumption is that important amino acid residues will be conserved in the protein family during the evolutionary process. For paralog alignment, on the other hand, the opposite concept can be used to identify residues that are responsible for specificity. Assuming that the function-specific or ligand-specific residue positions will have higher diversity since they are under evolutionary pressure to fit the target specificity, these function-specific or ligand-specific residue positions will have a lower degree of conservation than other positions in a highly conserved paralog alignment. This study assessed the ability of reverse conservation analysis to identify function-specific and ligand-specific residue positions in closely related paralogs. Reverse conservation analysis of paralog alignments successfully identified all six previously reported substrate recognition sites (SRSs) in cytochrome P450 family 2 (CYP 2). Further analysis of each subfamily identified the specificity-determining residues (SDRs) that have been experimentally found. New potential SDRs were also predicted and await confirmation by further experiments or modeling calculations. This concept may also be applied to identify SDRs in other protein families.

  17. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of solid residue after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Due to the fuels used, the different composition and the location of creation of solid residue, it was necessary to develop two methods. The first method is used for identifying solid residue composition after fuel oil combustion (Method 1), while the second method is used for identifying solid residue composition after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRM). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of the methods for each analyte separately was assessed. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  18. Nonparametric Estimation of Interval Reliability for Discrete-Time Semi-Markov Systems

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Limnios, Nikolaos

    2016-01-01

    In this article, we consider a repairable discrete-time semi-Markov system with finite state space. The measure of the interval reliability is given as the probability of the system being operational over a given finite-length time interval. A nonparametric estimator is proposed for the interval...

  19. Assessing pupil and school performance by non-parametric and parametric techniques

    NARCIS (Netherlands)

    de Witte, K.; Thanassoulis, E.; Simpson, G.; Battisti, G.; Charlesworth-May, A.

    2010-01-01

    This paper discusses the use of the non-parametric free disposal hull (FDH) and the parametric multi-level model (MLM) as alternative methods for measuring pupil and school attainment where hierarchical structured data are available. Using robust FDH estimates, we show how to decompose the overall

  20. Supremum Norm Posterior Contraction and Credible Sets for Nonparametric Multivariate Regression

    NARCIS (Netherlands)

    Yoo, W.W.; Ghosal, S

    2016-01-01

    In the setting of nonparametric multivariate regression with unknown error variance, we study asymptotic properties of a Bayesian method for estimating a regression function f and its mixed partial derivatives. We use a random series of tensor product of B-splines with normal basis coefficients as a

  1. A non-parametric hierarchical model to discover behavior dynamics from tracks

    NARCIS (Netherlands)

    Kooij, J.F.P.; Englebienne, G.; Gavrila, D.M.

    2012-01-01

    We present a novel non-parametric Bayesian model to jointly discover the dynamics of low-level actions and high-level behaviors of tracked people in open environments. Our model represents behaviors as Markov chains of actions which capture high-level temporal dynamics. Actions may be shared by

  2. On the use of permutation in and the performance of a class of nonparametric methods to detect differential gene expression.

    Science.gov (United States)

    Pan, Wei

    2003-07-22

    Recently a class of nonparametric statistical methods, including the empirical Bayes (EB) method, the significance analysis of microarray (SAM) method and the mixture model method (MMM), have been proposed to detect differential gene expression for replicated microarray experiments conducted under two conditions. All the methods depend on constructing a test statistic Z and a so-called null statistic z. The null statistic z is used to provide some reference distribution for Z such that statistical inference can be accomplished. A common way of constructing z is to apply Z to randomly permuted data. Here we point out that the distribution of z may not approximate the null distribution of Z well, leading to possibly too conservative inference. This observation may apply to other permutation-based nonparametric methods. We propose a new method of constructing a null statistic that aims to estimate the null distribution of a test statistic directly. Using simulated data and real data, we assess and compare the performance of the existing method and our new method when applied in EB, SAM and MMM. Some interesting findings on operating characteristics of EB, SAM and MMM are also reported. Finally, by combining the idea of SAM and MMM, we outline a simple nonparametric method based on the direct use of a test statistic and a null statistic.
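
    The Z versus null-z construction is easy to prototype. The bare-bones sketch below computes a two-sample t-like statistic per gene and a permutation-based null statistic by shuffling condition labels; it illustrates the construction being discussed, not an implementation of EB, SAM, or MMM, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
n_genes, n_per_group = 1000, 4
expr = rng.standard_normal((n_genes, 2 * n_per_group))
expr[:50, n_per_group:] += 1.5                      # 50 truly differential genes
labels = np.array([0] * n_per_group + [1] * n_per_group)

def stat(data, lab):
    """Two-sample t-like statistic per gene."""
    a, b = data[:, lab == 0], data[:, lab == 1]
    return (b.mean(1) - a.mean(1)) / np.sqrt(b.var(1, ddof=1) / b.shape[1]
                                             + a.var(1, ddof=1) / a.shape[1])

Z = stat(expr, labels)
null_z = np.concatenate([stat(expr, rng.permutation(labels)) for _ in range(20)])
# Genes whose |Z| exceeds, say, the 99.9th percentile of |null_z| are flagged;
# the record above warns that this permutation null can be conservative.
```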

  3. On Wasserstein Two-Sample Testing and Related Families of Nonparametric Tests

    Directory of Open Access Journals (Sweden)

    Aaditya Ramdas

    2017-01-01

    Full Text Available Nonparametric two-sample or homogeneity testing is a decision theoretic problem that involves identifying differences between two random variables without making parametric assumptions about their underlying distributions. The literature is old and rich, with a wide variety of statistics having been designed and analyzed, both for the unidimensional and the multivariate setting. In this short survey, we focus on test statistics that involve the Wasserstein distance. Using an entropic smoothing of the Wasserstein distance, we connect these to very different tests including multivariate methods involving energy statistics and kernel based maximum mean discrepancy and univariate methods like the Kolmogorov–Smirnov test, probability or quantile (PP/QQ) plots and receiver operating characteristic or ordinal dominance (ROC/ODC) curves. Some observations are implicit in the literature, while others seem to have not been noticed thus far. Given nonparametric two-sample testing's classical and continued importance, we aim to provide useful connections for theorists and practitioners familiar with one subset of methods but not others.
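
    One member of this family is straightforward to implement in the univariate case: a permutation test built on the one-dimensional Wasserstein distance, sketched below with scipy (the sample sizes and distributions are arbitrary assumptions).

```python
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_permutation_test(x, y, n_perm=2000, seed=0):
    """Permutation p-value for the one-dimensional Wasserstein two-sample statistic."""
    rng = np.random.default_rng(seed)
    observed = wasserstein_distance(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if wasserstein_distance(perm[:len(x)], perm[len(x):]) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)

rng = np.random.default_rng(6)
stat, p = wasserstein_permutation_test(rng.normal(0, 1, 80), rng.normal(0.5, 1, 80))
print(stat, p)
```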

  4. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in the cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries, which have higher health care expenditure per capita, tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Evaluation of residue-residue contact predictions in CASP9

    KAUST Repository

    Monastyrskyy, Bohdan

    2011-01-01

    This work presents the results of the assessment of the intramolecular residue-residue contact predictions submitted to CASP9. The methodology for the assessment does not differ from that used in previous CASPs, with two basic evaluation measures being the precision in recognizing contacts and the difference between the distribution of distances in the subset of predicted contact pairs versus all pairs of residues in the structure. The emphasis is placed on the prediction of long-range contacts (i.e., contacts between residues separated by at least 24 residues along sequence) in target proteins that cannot be easily modeled by homology. Although there is considerable activity in the field, the current analysis reports no discernable progress since CASP8.

  6. Estimation of Stochastic Volatility Models by Nonparametric Filtering

    DEFF Research Database (Denmark)

    Kanaya, Shin; Kristensen, Dennis

    2016-01-01

    ...estimated volatility process replacing the latent process. Our estimation strategy is applicable to both parametric and nonparametric stochastic volatility models, and can handle both jumps and market microstructure noise. The resulting estimators of the stochastic volatility model will carry additional biases and variances due to the first-step estimation, but under regularity conditions we show that these vanish asymptotically and our estimators inherit the asymptotic properties of the infeasible estimators based on observations of the volatility process. A simulation study examines the finite-sample properties...

  7. Chemical Analysis of Organic Residues Found in Hellenistic Time Amphorae from SE Bulgaria

    Science.gov (United States)

    Zlateva, B.; Rangelov, M.

    2015-05-01

    We have used IR spectroscopy, 1H NMR spectroscopy, high-performance liquid chromatography and thin-layer chromatography to study the composition of resin residues found in 22 amphorae from Apollonia Pontika (SE Bulgaria). In particular this analysis of the resin residues was aimed at discovering the content of the amphorae and to verify the hypothesis on the transport of wine, named "Retsina". Additionally this hypothesis has been confirmed by a similar analysis of the modern resin sample from Aleppo pine (Pinus Halepensis) growing in the Attica region (Greece).

  8. Hadron Energy Reconstruction for ATLAS Barrel Combined Calorimeter Using Non-Parametrical Method

    CERN Document Server

    Kulchitskii, Yu A

    2000-01-01

    Hadron energy reconstruction for the ATLAS barrel prototype combined calorimeter in the framework of the non-parametrical method is discussed. The non-parametrical method utilizes only the known e/h ratios and the electron calibration constants and does not require the determination of any parameters by a minimization technique. Thus, this technique lends itself to fast energy reconstruction in a first level trigger. The reconstructed mean values of the hadron energies are within ±1% of the true values and the fractional energy resolution is [(58±3)%·√GeV/√E + (2.5±0.3)%] ⊕ (1.7±0.2) GeV/E. The value of the e/h ratio obtained for the electromagnetic compartment of the combined calorimeter is 1.74±0.04. Results of a study of the longitudinal hadronic shower development are also presented.

  9. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    Full Text Available This paper investigates the performance of two-class classification credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...
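
    A schematic version of the underlying construction, two class-conditional kernel density estimates combined through Bayes' rule with a small prior default probability, is sketched below; the bandwidths, priors, and data are assumptions for illustration, and the paper's actual extensions go further.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(7)
good = rng.normal([0.0, 0.0], 1.0, size=(980, 2))     # non-defaulters
bad = rng.normal([1.5, 1.0], 1.0, size=(20, 2))       # defaulters (low default ratio)

kde_good = KernelDensity(bandwidth=0.5).fit(good)     # Parzen density per class
kde_bad = KernelDensity(bandwidth=0.5).fit(bad)
prior_bad = len(bad) / (len(bad) + len(good))

def posterior_default(x):
    """P(default | x) via Bayes' rule on the two kernel density estimates."""
    log_num = kde_bad.score_samples(x) + np.log(prior_bad)
    log_den = np.logaddexp(log_num, kde_good.score_samples(x) + np.log(1 - prior_bad))
    return np.exp(log_num - log_den)

print(posterior_default(np.array([[1.4, 1.1], [-0.5, 0.0]])))
```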

  10. Neutron activation analysis for noble metals in matte leach residues

    International Nuclear Information System (INIS)

    Hart, R.J.

    1978-01-01

    The development of the neutron activation analysis technique as a method for rapid and precise determinations of platinum group metals in matte leach residues depends on obtaining a method for effecting complete and homogeneous sample dilution. A simple method for solid dilution of metal samples is outlined in this study, which provided a basis for the accurate determination of all the noble metals by the Neutron Activation Analysis technique

  11. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    International Nuclear Information System (INIS)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David; Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Buchhave, Lars A.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
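
    A rough sketch of the nonparametric correlation idea: compute a rank correlation between the two TTV series and calibrate its significance by permutation. The statistic and null construction in the paper are more elaborate, and the series below are synthetic.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
phase = np.linspace(0, 4 * np.pi, 40)
ttv_inner = np.sin(phase) + 0.3 * rng.standard_normal(40)        # TTVs of inner planet (minutes)
ttv_outer = -0.8 * np.sin(phase) + 0.3 * rng.standard_normal(40) # anti-correlated outer planet

rho_obs, _ = spearmanr(ttv_inner, ttv_outer)
null = np.array([spearmanr(ttv_inner, rng.permutation(ttv_outer))[0]
                 for _ in range(5000)])
p_value = np.mean(np.abs(null) >= abs(rho_obs))   # permutation-calibrated significance
print(rho_obs, p_value)
```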

  12. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C. [Astronomy Department, University of Florida, 211 Bryant Space Sciences Center, Gainesville, FL 32611 (United States); Fabrycky, Daniel C. [UCO/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States); Steffen, Jason H. [Fermilab Center for Particle Astrophysics, P.O. Box 500, MS 127, Batavia, IL 60510 (United States); Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Welsh, William F. [Astronomy Department, San Diego State University, San Diego, CA 92182-1221 (United States); Allen, Christopher [Orbital Sciences Corporation/NASA Ames Research Center, Moffett Field, CA 94035 (United States); Batalha, Natalie M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192 (United States); Buchhave, Lars A., E-mail: eford@astro.ufl.edu [Niels Bohr Institute, Copenhagen University, DK-2100 Copenhagen (Denmark); Collaboration: Kepler Science Team; and others

    2012-05-10

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  13. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  14. TOF-SIMS Analysis of Crater Residues from Wild 2 Cometary Particles on Stardust Aluminum Foil

    Science.gov (United States)

    Leutner, Jan; Stephan, Thomas; Kearsley, T.; Horz, Friedrich; Flynn, George J.; Sandford, Scott A.

    2006-01-01

    Impact residues of cometary particles on aluminum foils from the Stardust mission were investigated with TOF-SIMS for their elemental and organic composition. The residual matter from comet 81P/Wild 2 shows a wide compositional range, from nearly monomineralic grains to polymict aggregates. Despite the comparably small analyzed sample volume, the average element composition of the investigated residues is similar to bulk CI chondritic values. Analysis of organic components in impact residues is complicated, due to fragmentation and alteration of the compounds during the impact process and by the presence of contaminants on the aluminum foils. Nevertheless, polycyclic aromatic hydrocarbons (PAHs) that are unambiguously associated with the impact residues were observed, and thus are most likely of cometary origin.

  15. Tank 16 Annulus Cleanout Analysis Doses at Seepline from Transport of Residual Tc-99 Wastes

    International Nuclear Information System (INIS)

    Collard, L.B.

    1999-01-01

    An analysis of residual Tc-99 in the Tank 16 annulus was conducted to assess the potential benefit from cleaning the annulus. One analysis was performed for the as-is case to determine seepline doses if no clean out occurs. Another analysis was performed assuming that ninety percent of existing contaminants are removed. Characterization data for samples retrieved from the annulus were used in the analysis. Only Tc-99 was analyzed because preliminary modeling identified it as the highest dose contributor. The effect of residual waste in piping was not analyzed.

  16. Residual stress analysis in BWR pressure vessel attachments

    International Nuclear Information System (INIS)

    Dexter, R.J.; Leung, C.P.; Pont, D.

    1992-06-01

    Residual stresses from welding processes can be the primary driving force for stress corrosion cracking (SCC) in BWR components. Thus, a better understanding of the causes and nature of these residual stresses can help assess and remedy SCC. Numerical welding simulation software, such as SYSWELD, and material property data have been used to quantify residual stresses for application to SCC assessments in BWR components. Furthermore, parametric studies using SYSWELD have revealed which variables significantly affect predicted residual stress. Overall, numerical modeling techniques can be used to evaluate residual stress for SCC assessments of BWR components and to identify and plan future SCC research

  17. Evaluation of Nonparametric Probabilistic Forecasts of Wind Power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg

    Predictions of wind power production for horizons up to 48-72 hours ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are not only provided with point predictions, which are estimates of the most likely outcome for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set...
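
    Such nonparametric probabilistic forecasts are commonly evaluated with the pinball (quantile) loss averaged over nominal levels; the short sketch below applies it to made-up forecast quantiles and observations. It is a generic evaluation recipe, not the authors' full evaluation framework.

```python
import numpy as np

def pinball_loss(obs, quantile_forecasts, levels):
    """Mean pinball loss; quantile_forecasts has shape (n_times, n_levels)."""
    diff = obs[:, None] - quantile_forecasts
    return np.mean(np.maximum(levels * diff, (levels - 1.0) * diff))

levels = np.arange(0.05, 1.0, 0.05)
rng = np.random.default_rng(9)
obs = rng.uniform(0, 1, 100)                              # normalised wind power
qf = np.clip(obs[:, None] + rng.normal(0, 0.1, (100, len(levels)))
             + (levels - 0.5) * 0.3, 0, 1)                # crude forecast quantiles
qf.sort(axis=1)                                           # enforce non-crossing quantiles
print(pinball_loss(obs, qf, levels))
```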

  18. Estimation from PET data of transient changes in dopamine concentration induced by alcohol: support for a non-parametric signal estimation method

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, C C; Yoder, K K; Normandin, M D; Morris, E D [Department of Radiology, Indiana University School of Medicine, Indianapolis, IN (United States); Kareken, D A [Department of Neurology, Indiana University School of Medicine, Indianapolis, IN (United States); Bouman, C A [Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN (United States); O' Connor, S J [Department of Psychiatry, Indiana University School of Medicine, Indianapolis, IN (United States)], E-mail: emorris@iupui.edu

    2008-03-07

    We previously developed a model-independent technique (non-parametric ntPET) for extracting the transient changes in neurotransmitter concentration from paired (rest and activation) PET studies with a receptor ligand. To provide support for our method, we introduced three hypotheses of validation based on work by Endres and Carson (1998 J. Cereb. Blood Flow Metab. 18 1196-210) and Yoder et al (2004 J. Nucl. Med. 45 903-11), and tested them on experimental data. All three hypotheses describe relationships between the estimated free (synaptic) dopamine curves (F^DA(t)) and the change in binding potential (ΔBP). The veracity of the F^DA(t) curves recovered by nonparametric ntPET is supported when the data adhere to the following hypothesized behaviors: (1) ΔBP should decline with increasing DA peak time, (2) ΔBP should increase as the strength of the temporal correlation between F^DA(t) and the free raclopride (F^RAC(t)) curve increases, (3) ΔBP should decline linearly with the effective weighted availability of the receptor sites. We analyzed regional brain data from 8 healthy subjects who received two [11C]raclopride scans: one at rest, and one during which unanticipated IV alcohol was administered to stimulate dopamine release. For several striatal regions, nonparametric ntPET was applied to recover F^DA(t), and binding potential values were determined. Kendall rank-correlation analysis confirmed that the F^DA(t) data followed the expected trends for all three validation hypotheses. Our findings lend credence to our model-independent estimates of F^DA(t). Application of nonparametric ntPET may yield important insights into how alterations in timing of dopaminergic neurotransmission are involved in the pathologies of addiction and other psychiatric disorders.

  19. Nonparametric identification of nonlinear dynamic systems using a synchronisation-based method

    Science.gov (United States)

    Kenderi, Gábor; Fidlin, Alexander

    2014-12-01

    The present study proposes an identification method for highly nonlinear mechanical systems that does not require a priori knowledge of the underlying nonlinearities to reconstruct arbitrary restoring force surfaces between degrees of freedom. This approach is based on the master-slave synchronisation between a dynamic model of the system as the slave and the real system as the master using measurements of the latter. As the model synchronises to the measurements, it becomes an observer of the real system. The optimal observer algorithm in a least-squares sense is given by the Kalman filter. Using the well-known state augmentation technique, the Kalman filter can be turned into a dual state and parameter estimator to identify parameters of a priori characterised nonlinearities. The paper proposes an extension of this technique towards nonparametric identification. A general system model is introduced by describing the restoring forces as bilateral spring-dampers with time-variant coefficients, which are estimated as augmented states. The estimation procedure is followed by an a posteriori statistical analysis to reconstruct noise-free restoring force characteristics using the estimated states and their estimated variances. Observability is provided using only one measured mechanical quantity per degree of freedom, which makes this approach less demanding in the number of necessary measurement signals compared with truly nonparametric solutions, which typically require displacement, velocity and acceleration signals. Additionally, due to the statistical rigour of the procedure, it successfully addresses signals corrupted by significant measurement noise. In the present paper, the method is described in detail, which is followed by numerical examples of one degree of freedom (1DoF) and 2DoF mechanical systems with strong nonlinearities of vibro-impact type to demonstrate the effectiveness of the proposed technique.

  20. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.

    Science.gov (United States)

    García, Constantino A; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working directly in a function-space view and thus the inference takes place directly in this space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economics and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.
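
    For intuition, the drift and diffusion of a densely observed path can already be recovered with a crude binned (Kramers-Moyal) estimator, which the record's sparse Gaussian-process method refines; the simulated Ornstein-Uhlenbeck path and bin choices below are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(10)
dt, n = 1e-2, 50_000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):                       # Euler-Maruyama: dX = -X dt + 0.5 dW
    x[i + 1] = x[i] - x[i] * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()

dx = np.diff(x)
bins = np.linspace(-0.8, 0.8, 17)
centres = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1

drift = np.full(len(centres), np.nan)
diffusion = np.full(len(centres), np.nan)
for b in range(len(centres)):
    sel = dx[idx == b]
    if sel.size:                             # skip bins the path never visited
        drift[b] = sel.mean() / dt           # should track -x within each bin
        diffusion[b] = (sel ** 2).mean() / dt  # should be roughly 0.25 (= 0.5**2)
```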

  1. [Correlation analysis between residual displacement and hip function after reconstruction of acetabular fractures].

    Science.gov (United States)

    Ma, Kunlong; Fang, Yue; Luan, Fujun; Tu, Chongqi; Yang, Tianfu

    2012-03-01

    To investigate the relationship between residual displacement in weight-bearing and non-weight-bearing zones (gap displacement and step displacement) and hip function by analyzing CT images after reconstruction of acetabular fractures. The CT measurements and clinical outcomes of 48 patients with displaced acetabular fractures treated between June 2004 and June 2009 were retrospectively analyzed. All patients were treated by open reduction and internal fixation and were followed up for 24 to 72 months (mean, 36 months); all fractures healed after operation. The residual displacement involved the weight-bearing zone in 30 cases (weight-bearing group) and the non-weight-bearing zone in 18 cases (non-weight-bearing group). The clinical outcomes were evaluated by the Merle d'Aubigné-Postel criteria, and the reduction of the articular surface was evaluated on CT images using the maximums of two indexes (gap displacement and step displacement). All data were analyzed with the Spearman rank correlation coefficient. There was a strong negative correlation between hip function and the residual displacement values in the weight-bearing group (r(s) = -0.722, P = 0.001), but there was no correlation between hip function and the residual displacement values in the non-weight-bearing group (r(s) = 0.481, P = 0.059). The clinical follow-up results were consistent with the correlation analysis. In the weight-bearing group, hip function had a strong negative correlation with step displacement (r(s) = 0.825, P = 0.002), but no correlation with gap displacement (r(s) = 0.577, P = 0.134). In patients with acetabular fracture, hip function correlates not only with the extent of the residual displacement but also with its location, so residual displacement in the weight-bearing zone is a key factor affecting hip function. In patients with residual displacement in the weight-bearing zone, the bigger the step displacement is, the

  2. Forensic analysis of explosive residues from hand swabs

    International Nuclear Information System (INIS)

    Umi Khairul Ahmad; Sumathy Rajendran; Syahidah Abu Hassan

    2008-01-01

    In the forensic examination of physical evidence for organic explosives, cotton swabs are often used to collect residue from surfaces, such as skin and post-blast debris. A preliminary study has been conducted to develop an extraction method for a common energetic compound, pentaerythritol tetranitrate (PETN), from hand swabs, followed by direct analysis of the resulting extract solution using high-performance liquid chromatography (HPLC) with an ultraviolet (UV) detector. Analysis was performed on an octadecylsilane-based (C18) column using an acetonitrile-water mixture (55:45) as the mobile phase. The mobile phase was pumped at 1.0 mL/min and separation was effected in isocratic mode with a detection wavelength of 230 nm. The explosive residue was extracted from cotton swabs using acetone in an ultrasonic cold bath. The developed method was later applied to real hand swab samples, which were taken from three army personnel who handled PETN during a munition disposal operation at Asahan Camp Military Firing range. The acetone extract obtained using the sonication method was found to be effective in recovering PETN from cotton swabs, with relatively high recovery (89.5 %) and good sensitivity, with a detection limit as low as 2 ng. The content of PETN in the real hand swab samples was found to be in the range of 4.7-130 mg. (author)

  3. Finite element analysis and measurement for residual stress of dissimilar metal weld in pressurizer safety nozzle mockup

    International Nuclear Information System (INIS)

    Lee, Kyoung Soo; Kim, W.; Lee, Jeong Geun; Park, Chi Yong; Yang, Jun Seok; Kim, Tae Ryong; Park, Jai Hak

    2009-01-01

    Finite element (FE) analysis and experiments for weld residual stress (WRS) in the pressurizer safety nozzle mockup are described, covering the various processes and results. Foremost of these is the simulation of dissimilar metal welding (DMW) between carbon steel and austenitic stainless steel. Thermal and structural analyses were performed and compared with actual residual stress measurements, and the magnitude and distribution of WRS in the nozzle mockup were assessed. Two measurement methods were used: the hole-drilling method (HDM) with strain gauges for residual stress on the surface of the mockup, and the block removal and splitting layer (BRSL) method for through-thickness stress. FE analysis and measurement data showed good agreement. In conclusion, the characteristics of the weld residual stress of DMW could be well understood, and the simplified FE analysis was verified as acceptable for estimating WRS.

  4. Finite element analysis and measurement for residual stress of dissimilar metal weld in pressurizer safety nozzle mockup

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyoung Soo; Kim, W.; Lee, Jeong Geun; Park, Chi Yong; Yang, Jun Seok; Kim, Tae Ryong [Korea Electric Power Research Institute, Daejeon (Korea, Republic of); Park, Jai Hak [Chungbuk University, Cheongju (Korea, Republic of)

    2009-11-15

    Finite element (FE) analysis and experiments for weld residual stress (WRS) in the pressurizer safety nozzle mockup are described, covering the various processes and results. Foremost of these is the simulation of dissimilar metal welding (DMW) between carbon steel and austenitic stainless steel. Thermal and structural analyses were performed and compared with actual residual stress measurements, and the magnitude and distribution of WRS in the nozzle mockup were assessed. Two measurement methods were used: the hole-drilling method (HDM) with strain gauges for residual stress on the surface of the mockup, and the block removal and splitting layer (BRSL) method for through-thickness stress. FE analysis and measurement data showed good agreement. In conclusion, the characteristics of the weld residual stress of DMW could be well understood, and the simplified FE analysis was verified as acceptable for estimating WRS.

  5. Scanning electron microscope/energy dispersive x ray analysis of impact residues in LDEF tray clamps

    Science.gov (United States)

    Bernhard, Ronald P.; Durin, Christian; Zolensky, Michael E.

    1993-01-01

    Detailed optical scanning of tray clamps is being conducted in the Facility for the Optical Inspection of Large Surfaces at JSC to locate and document impacts as small as 40 microns in diameter. Residues from selected impacts are then being characterized by Scanning Electron Microscopy/Energy Dispersive X-ray Analysis at CNES. Results from this analysis will be the initial step to classifying projectile residues into specific sources.

  6. Analyzing cost efficient production behavior under economies of scope : A nonparametric methodology

    NARCIS (Netherlands)

    Cherchye, L.J.H.; de Rock, B.; Vermeulen, F.M.P.

    2008-01-01

    In designing a production model for firms that generate multiple outputs, we take as a starting point that such multioutput production refers to economies of scope, which in turn originate from joint input use and input externalities. We provide a nonparametric characterization of cost-efficient

  7. Residual stress analysis in reactor pressure vessel attachments

    International Nuclear Information System (INIS)

    Dexter, R.J.; Pont, D.

    1991-08-01

    Residual stresses in cladding and welded attachments could contribute to the problem of stress-corrosion cracking in boiling-water reactors (BWR). As part of a larger program aimed at quantifying residual stress in BWR components, models that would be applicable for predicting residual stress in BWR components are reviewed and documented. The review includes simple methods of estimating residual stresses as well as advanced finite-element software. In general, simple methods are capable of predicting peak magnitudes of residual stresses but are incapable of adequately characterizing the distribution of residual stresses. Ten groups of researchers using finite-element software are reviewed in detail. For each group, the assumptions of the model, possible simplifications, material property data, and specific applications are discussed. The most accurate results are obtained when a metallurgical simulation is performed, transformation plasticity effects are included, and the heating and cooling parts of the welding thermal cycle are simulated. Two models are identified which can provide these features. The present state of these models and the material property data available in the literature are adequate to quantify residual stress in BWR components

  8. A non-parametric Bayesian approach to decompounding from high frequency data

    NARCIS (Netherlands)

    Gugushvili, Shota; van der Meulen, F.H.; Spreij, Peter

    2016-01-01

    Given a sample from a discretely observed compound Poisson process, we consider non-parametric estimation of the density f0 of its jump sizes, as well as of its intensity λ0. We take a Bayesian approach to the problem and specify the prior on f0 as the Dirichlet location mixture of normal densities.

  9. Practical analysis of specificity-determining residues in protein families.

    Science.gov (United States)

    Chagoyen, Mónica; García-Martín, Juan A; Pazos, Florencio

    2016-03-01

    Determining the residues that are important for the molecular activity of a protein is a topic of broad interest in biomedicine and biotechnology. This knowledge can help in understanding the protein's molecular mechanism, as well as in fine-tuning its natural function, eventually with biotechnological or therapeutic implications. Some of the protein residues are essential for the function common to all members of a family of proteins, while others explain the particular specificities of certain subfamilies (such as binding to different substrates or cofactors, or distinct binding affinities). Owing to the difficulty in experimentally determining them, a number of computational methods were developed to detect these functional residues, generally known as 'specificity-determining positions' (or SDPs), from a collection of homologous protein sequences. These methods are mature enough to be routinely used by molecular biologists in directing experiments aimed at gaining insight into the functional specificity of a family of proteins and eventually modifying it. In this review, we summarize some of the recent discoveries achieved through SDP computational identification in a number of relevant protein families, as well as the main approaches and software tools available to perform this type of analysis. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  10. kruX: matrix-based non-parametric eQTL discovery.

    Science.gov (United States)

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. KruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
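
    A minimal sketch of the matrix idea behind kruX, assuming an expression matrix and the genotype vector of a single marker: each trait is rank-transformed once and the per-genotype rank sums for all traits are obtained with one matrix product. The tie correction and the package's full marker-by-trait batching are omitted; this is an illustration, not kruX's actual implementation.

    ```python
    import numpy as np
    from scipy.stats import rankdata, chi2

    def kruskal_wallis_all_traits(expr, genotype):
        """Kruskal-Wallis H for every expression trait against one marker.

        expr     : (n_traits, n_samples) expression matrix
        genotype : (n_samples,) integer genotype codes (0, 1, 2, ...)
        The per-genotype rank sums for all traits come from a single matrix
        product; the tie correction used by kruX is omitted for brevity.
        """
        n_traits, n = expr.shape
        ranks = np.apply_along_axis(rankdata, 1, expr)            # rank each trait
        groups = np.unique(genotype)
        indicator = (genotype[None, :] == groups[:, None]).T.astype(float)
        group_sizes = indicator.sum(axis=0)                       # n_g per genotype
        rank_sums = ranks @ indicator                             # (n_traits, n_groups)
        h = 12.0 / (n * (n + 1)) * ((rank_sums ** 2) / group_sizes).sum(axis=1) - 3.0 * (n + 1)
        pvals = chi2.sf(h, df=len(groups) - 1)
        return h, pvals
    ```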

  11. A nonparametric mean-variance smoothing method to assess Arabidopsis cold stress transcriptional regulator CBF2 overexpression microarray data.

    Science.gov (United States)

    Hu, Pingsha; Maiti, Tapabrata

    2011-01-01

    Microarray is a powerful tool for genome-wide gene expression analysis. In microarray expression data, often mean and variance have certain relationships. We present a non-parametric mean-variance smoothing method (NPMVS) to analyze differentially expressed genes. In this method, a nonlinear smoothing curve is fitted to estimate the relationship between mean and variance. Inference is then made upon shrinkage estimation of posterior means assuming variances are known. Different methods have been applied to simulated datasets, in which a variety of mean and variance relationships were imposed. The simulation study showed that NPMVS outperformed the other two popular shrinkage estimation methods in some mean-variance relationships; and NPMVS was competitive with the two methods in other relationships. A real biological dataset, in which a cold stress transcription factor gene, CBF2, was overexpressed, has also been analyzed with the three methods. Gene ontology and cis-element analysis showed that NPMVS identified more cold and stress responsive genes than the other two methods did. The good performance of NPMVS is mainly due to its shrinkage estimation for both means and variances. In addition, NPMVS exploits a non-parametric regression between mean and variance, instead of assuming a specific parametric relationship between mean and variance. The source code written in R is available from the authors on request.
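
    A rough sketch of the mean-variance smoothing step, assuming a genes-by-replicates matrix: a LOWESS curve relates log-variance to mean expression and the fitted values serve as smoothed variance estimates. The shrinkage of posterior means described in the abstract is not reproduced here, and the function and argument names are illustrative.

    ```python
    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    def smoothed_variances(expr, frac=0.3):
        """Fit a nonparametric (LOWESS) curve of log-variance against mean
        expression and return the smoothed gene-wise variances.
        expr: (n_genes, n_replicates) expression matrix."""
        means = expr.mean(axis=1)
        variances = expr.var(axis=1, ddof=1)
        fitted = lowess(np.log(variances + 1e-8), means, frac=frac, return_sorted=False)
        return np.exp(fitted)
    ```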

  12. Trends and advances in pesticide residue analysis

    African Journals Online (AJOL)

    The nature, origin and the economic significance of pesticide residues are reviewed to underscore the need for countries to develop the ability and capacity to monitor pesticide residues. An overview of pesticide residues analytical procedures is also presented with emphasis on thin layer chromatography (TLC) as an ...

  13. Exact nonparametric confidence bands for the survivor function.

    Science.gov (United States)

    Matthews, David

    2013-10-12

    A method to produce exact simultaneous confidence bands for the empirical cumulative distribution function that was first described by Owen, and subsequently corrected by Jager and Wellner, is the starting point for deriving exact nonparametric confidence bands for the survivor function of any positive random variable. We invert a nonparametric likelihood test of uniformity, constructed from the Kaplan-Meier estimator of the survivor function, to obtain simultaneous lower and upper bands for the function of interest with specified global confidence level. The method involves calculating a null distribution and associated critical value for each observed sample configuration. However, Noe recursions and the Van Wijngaarden-Decker-Brent root-finding algorithm provide the necessary tools for efficient computation of these exact bounds. Various aspects of the effect of right censoring on these exact bands are investigated, using as illustrations two observational studies of survival experience among non-Hodgkin's lymphoma patients and a much larger group of subjects with advanced lung cancer enrolled in trials within the North Central Cancer Treatment Group. Monte Carlo simulations confirm the merits of the proposed method of deriving simultaneous interval estimates of the survivor function across the entire range of the observed sample. This research was supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada. It was begun while the author was visiting the Department of Statistics, University of Auckland, and completed during a subsequent sojourn at the Medical Research Council Biostatistics Unit in Cambridge. The support of both institutions, in addition to that of NSERC and the University of Waterloo, is greatly appreciated.
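
    For reference, a bare-bones Kaplan-Meier estimator, the quantity the exact bands are built around; the Noe recursions and root-finding that produce the bands themselves are not shown. A sketch assuming right-censored data coded as 0/1 events.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Bare-bones Kaplan-Meier survivor-function estimate.
        times: follow-up times; events: 1 = event observed, 0 = right-censored.
        Returns a list of (time, S_hat(time)) pairs."""
        order = np.argsort(times)
        t = np.asarray(times, dtype=float)[order]
        d = np.asarray(events)[order]
        n_at_risk = len(t)
        s_hat, curve = 1.0, []
        for i in range(len(t)):
            if d[i] == 1:
                s_hat *= 1.0 - 1.0 / n_at_risk
            curve.append((t[i], s_hat))
            n_at_risk -= 1
        return curve
    ```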

  14. Analyzing Cost Efficient Production Behavior Under Economies of Scope : A Nonparametric Methodology

    NARCIS (Netherlands)

    Cherchye, L.J.H.; de Rock, B.; Vermeulen, F.M.P.

    2006-01-01

    In designing a production model for firms that generate multiple outputs, we take as a starting point that such multi-output production refers to economies of scope, which in turn originate from joint input use and input externalities. We provide a nonparametric characterization of cost efficient

  15. On the Choice of Difference Sequence in a Unified Framework for Variance Estimation in Nonparametric Regression

    KAUST Repository

    Dai, Wenlin; Tong, Tiejun; Zhu, Lixing

    2017-01-01

    Difference-based methods do not require estimating the mean function in nonparametric regression and are therefore popular in practice. In this paper, we propose a unified framework for variance estimation that combines the linear regression method with the higher-order difference estimators systematically. The unified framework has greatly enriched the existing literature on variance estimation that includes most existing estimators as special cases. More importantly, the unified framework has also provided a smart way to solve the challenging difference sequence selection problem that remains a long-standing controversial issue in nonparametric regression for several decades. Using both theory and simulations, we recommend to use the ordinary difference sequence in the unified framework, no matter if the sample size is small or if the signal-to-noise ratio is large. Finally, to cater for the demands of the application, we have developed a unified R package, named VarED, that integrates the existing difference-based estimators and the unified estimators in nonparametric regression and have made it freely available in the R statistical program http://cran.r-project.org/web/packages/.
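
    A minimal example of the ordinary (first-order) difference sequence recommended above, applied as the classical Rice-type variance estimator; the unified framework's higher-order and linear-regression extensions live in the VarED package and are not sketched here.

    ```python
    import numpy as np

    def rice_variance(y):
        """Ordinary (first-order) difference-based variance estimator:
        sigma^2_hat = sum_i (y_{i+1} - y_i)^2 / (2 * (n - 1))."""
        d = np.diff(np.asarray(y, dtype=float))
        return (d ** 2).sum() / (2.0 * (len(y) - 1))
    ```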

  16. On the Choice of Difference Sequence in a Unified Framework for Variance Estimation in Nonparametric Regression

    KAUST Repository

    Dai, Wenlin

    2017-09-01

    Difference-based methods do not require estimating the mean function in nonparametric regression and are therefore popular in practice. In this paper, we propose a unified framework for variance estimation that combines the linear regression method with the higher-order difference estimators systematically. The unified framework has greatly enriched the existing literature on variance estimation that includes most existing estimators as special cases. More importantly, the unified framework has also provided a smart way to solve the challenging difference sequence selection problem that remains a long-standing controversial issue in nonparametric regression for several decades. Using both theory and simulations, we recommend to use the ordinary difference sequence in the unified framework, no matter if the sample size is small or if the signal-to-noise ratio is large. Finally, to cater for the demands of the application, we have developed a unified R package, named VarED, that integrates the existing difference-based estimators and the unified estimators in nonparametric regression and have made it freely available in the R statistical program http://cran.r-project.org/web/packages/.

  17. A non-parametric Data Envelopment Analysis approach for improving energy efficiency of grape production

    International Nuclear Information System (INIS)

    Khoshroo, Alireza; Mulwa, Richard; Emrouznejad, Ali; Arabi, Behrouz

    2013-01-01

    Grape is one of the world's largest fruit crops, with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production, as it depends heavily on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive energy use and reducing energy inputs in order to optimize energy consumption in grape production is the main focus of this paper. In this study we use a two-stage methodology to find the association between energy efficiency and performance explained by farmers' specific characteristics. In the first stage a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water for irrigation energies. In the second step, farm-specific variables such as farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence the efficiency of grape farming. The result of the first stage shows substantial inefficiency among the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers was in the use of chemicals, diesel fuel and water for irrigation. The efficient farmers' use of chemicals such as insecticides, herbicides and fungicides was considerably lower than that of the inefficient ones. The results revealed that the more educated farmers are more energy efficient in comparison with their less educated counterparts. - Highlights: • The focus of this paper is to identify excessive use of energy and optimize energy consumption in grape production. • We measure the efficiency as a function of labor/machinery/chemicals/farmyard manure/diesel-fuel/electricity/water. • Data were obtained from 41 grape
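
    A small input-oriented, constant-returns-to-scale DEA sketch formulated as a linear program, assuming inputs and outputs are stored column-wise per farm; it only illustrates the first-stage efficiency idea, not the paper's exact model or the Tobit second stage, and all names are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_input_efficiency(X, Y, j0):
        """Input-oriented, constant-returns-to-scale DEA efficiency of unit j0.

        X : (n_inputs, n_units) input matrix (e.g. labor, diesel fuel, water).
        Y : (n_outputs, n_units) output matrix (e.g. grape yield).
        Returns theta in (0, 1]; theta = 1 means the farm lies on the frontier.
        """
        n_units = X.shape[1]
        c = np.zeros(n_units + 1)
        c[0] = 1.0                                   # minimise theta
        # Input constraints:  X @ lambda - theta * x_j0 <= 0
        A_in = np.hstack([-X[:, [j0]], X])
        b_in = np.zeros(X.shape[0])
        # Output constraints: -Y @ lambda <= -y_j0  (i.e. Y @ lambda >= y_j0)
        A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
        b_out = -Y[:, j0]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.hstack([b_in, b_out]),
                      bounds=[(0, None)] * (n_units + 1), method="highs")
        return res.x[0]
    ```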

  18. On residual gas analysis during high temperature baking of graphite tiles

    International Nuclear Information System (INIS)

    Prakash, A A; Chaudhuri, P; Khirwadkar, S; Reddy, D Chenna; Saxena, Y C; Chauhan, N; Raole, P M

    2008-01-01

    Steady-state Super-conducting Tokamak-1 (SST-1) is a medium size tokamak with major radius of 1.1 m and minor radius of 0.20 m. It is designed for plasma discharge duration of 1000 seconds to obtain fully steady-state plasma operation. Plasma Facing Components (PFC), consisting of divertors, passive stabilizers, baffles and poloidal limiters are also designed to be UHV compatible for steady state operation. All PFC are made up of graphite tiles mechanically attached to the copper alloy substrate. Graphite is one of the preferred first wall armour material in present day tokamaks. High thermal shock resistance and low atomic number of carbon are the most important properties of graphite for this application. High temperature vacuum baking of graphite tiles is the standard process to remove the impurities. Residual Gas Analyzer (RGA) has been used for qualitative and quantitative measurements of released gases from graphite tiles during baking. Surface Analysis of graphite tiles has also been done before and after baking. This paper describes the residual gas analysis during baking and surface analysis of graphite tiles

  19. On residual gas analysis during high temperature baking of graphite tiles

    Energy Technology Data Exchange (ETDEWEB)

    Prakash, A A; Chaudhuri, P; Khirwadkar, S; Reddy, D Chenna; Saxena, Y C [Institute for Plasma Research, Bhat, Gandhinagar - 382 428 (India); Chauhan, N; Raole, P M [Facilitation Center for Industrial Plasma Technologies, IPR, Gandhinagar (India)], E-mail: arun@ipr.res.in

    2008-05-01

    Steady-state Super-conducting Tokamak-1 (SST-1) is a medium size tokamak with major radius of 1.1 m and minor radius of 0.20 m. It is designed for plasma discharge duration of 1000 seconds to obtain fully steady-state plasma operation. Plasma Facing Components (PFC), consisting of divertors, passive stabilizers, baffles and poloidal limiters are also designed to be UHV compatible for steady state operation. All PFC are made up of graphite tiles mechanically attached to the copper alloy substrate. Graphite is one of the preferred first wall armour material in present day tokamaks. High thermal shock resistance and low atomic number of carbon are the most important properties of graphite for this application. High temperature vacuum baking of graphite tiles is the standard process to remove the impurities. Residual Gas Analyzer (RGA) has been used for qualitative and quantitative measurements of released gases from graphite tiles during baking. Surface Analysis of graphite tiles has also been done before and after baking. This paper describes the residual gas analysis during baking and surface analysis of graphite tiles.

  20. Nonparametric Efficiency Testing of Asian Stock Markets Using Weekly Data

    OpenAIRE

    CORNELIS A. LOS

    2004-01-01

    The efficiency of speculative markets, as represented by Fama's 1970 fair game model, is tested on weekly price index data of six Asian stock markets - Hong Kong, Indonesia, Malaysia, Singapore, Taiwan and Thailand - using Sherry's (1992) non-parametric methods. These scientific testing methods were originally developed to analyze the information processing efficiency of nervous systems. In particular, the stationarity and independence of the price innovations are tested over ten years, from ...

  1. Bayesian Bandwidth Selection for a Nonparametric Regression Model with Mixed Types of Regressors

    Directory of Open Access Journals (Sweden)

    Xibin Zhang

    2016-04-01

    Full Text Available This paper develops a sampling algorithm for bandwidth estimation in a nonparametric regression model with continuous and discrete regressors under an unknown error density. The error density is approximated by the kernel density estimator of the unobserved errors, while the regression function is estimated using the Nadaraya-Watson estimator admitting continuous and discrete regressors. We derive an approximate likelihood and posterior for bandwidth parameters, followed by a sampling algorithm. Simulation results show that the proposed approach typically leads to better accuracy of the resulting estimates than cross-validation, particularly for smaller sample sizes. This bandwidth estimation approach is applied to a nonparametric regression model of the Australian All Ordinaries returns and the kernel density estimation of gross domestic product (GDP) growth rates among the Organisation for Economic Co-operation and Development (OECD) and non-OECD countries.
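
    A stripped-down Nadaraya-Watson estimator for a single continuous regressor with a fixed Gaussian-kernel bandwidth; the paper's contribution, sampling bandwidths for mixed continuous and discrete regressors from an approximate posterior, is not reproduced here.

    ```python
    import numpy as np

    def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
        """Nadaraya-Watson estimate with a Gaussian kernel and one continuous
        regressor: m(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h)."""
        u = (np.asarray(x_eval)[:, None] - np.asarray(x_train)[None, :]) / bandwidth
        w = np.exp(-0.5 * u ** 2)
        return (w @ np.asarray(y_train)) / w.sum(axis=1)
    ```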

  2. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

    Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors.

  3. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis.

    Science.gov (United States)

    Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren

    2016-11-01

    Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is

  4. Residual stress by repair welds

    International Nuclear Information System (INIS)

    Mochizuki, Masahito; Toyoda, Masao

    2003-01-01

    Residual stress by repair welds is computed using the thermal elastic-plastic analysis with phase-transformation effect. Coupling phenomena of temperature, microstructure, and stress-strain fields are simulated in the finite-element analysis. Weld bond of a plate butt-welded joint is gouged and then deposited by weld metal in repair process. Heat source is synchronously moved with the deposition of the finite-element as the weld deposition. Microstructure is considered by using CCT diagram and the transformation behavior in the repair weld is also simulated. The effects of initial stress, heat input, and weld length on residual stress distribution are studied from the organic results of numerical analysis. Initial residual stress before repair weld has no influence on the residual stress after repair treatment near weld metal, because the initial stress near weld metal releases due to high temperature of repair weld and then stress by repair weld regenerates. Heat input has an effect for residual stress distribution, for not its magnitude but distribution zone. Weld length should be considered reducing the magnitude of residual stress in the edge of weld bead; short bead induces high tensile residual stress. (author)

  5. Assessing Goodness of Fit in Item Response Theory with Nonparametric Models: A Comparison of Posterior Probabilities and Kernel-Smoothing Approaches

    Science.gov (United States)

    Sueiro, Manuel J.; Abad, Francisco J.

    2011-01-01

    The distance between nonparametric and parametric item characteristic curves has been proposed as an index of goodness of fit in item response theory in the form of a root integrated squared error index. This article proposes to use the posterior distribution of the latent trait as the nonparametric model and compares the performance of an index…

  6. FEM Analysis and Measurement of Residual Stress by Neutron Diffraction on the Dissimilar Overlay Weld Pipe

    International Nuclear Information System (INIS)

    Kim, Kang Soo; Lee, Ho Jin; Woo, Wan Chuck; Seong, Baek Seok; Byeon, Jin Gwi; Park, Kwang Soo; Jung, In Chul

    2010-01-01

    Much research has been done to estimate the residual stress on a dissimilar metal weld. There are many methods to estimate the weld residual stress and FEM (Finite Element Method) is generally used due to the advantage of the parametric study. And the X-ray method and a Hole Drilling technique for an experimental method are also usually used. The aim of this paper is to develop the appropriate FEM model to estimate the residual stresses of the dissimilar overlay weld pipe. For this, firstly, the specimen of the dissimilar overlay weld pipe was manufactured. The SA 508 Gr3 nozzle, the SA 182 safe end and SA376 pipe were welded by the Alloy 182. And the overlay weld by the Alloy 52M was performed. The residual stress of this specimen was measured by using the Neutron Diffraction device in the HANARO (High-flux Advanced Neutron Application ReactOr) research reactor, KAERI (Korea Atomic Energy Research Institute). Secondly, FEM Model on the dissimilar overlay weld pipe was made and analyzed by the ABAQUS Code (ABAQUS, 2004). Thermal analysis and stress analysis were performed, and the residual stress was calculated. Thirdly, the results of the FEM analysis were compared with those of the experimental methods

  7. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    Full Text Available This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area's hydrological information regarding rainfall was scarce and existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, along with the magnitude of uncertainty affecting hydrological parameter estimation. This paper's conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and may be an obligatory procedure in the future. Its potential lies in treating scarce information and represents a robust modelling strategy for non-seasonal stochastic modelling conditions.
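
    A crude stand-in for one ingredient of such an analysis, assuming the only data are counts of days exceeding the 70 mm threshold: a Clopper-Pearson interval bounds the exceedance probability, in the spirit of (though much simpler than) the p-box construction described above. All names and numbers are illustrative.

    ```python
    from scipy.stats import beta

    def exceedance_interval(n_exceed, n_days, conf=0.95):
        """Clopper-Pearson bounds on the daily probability of exceeding a rainfall
        threshold (e.g. 70 mm), given n_exceed exceedances in n_days of record."""
        a = (1.0 - conf) / 2.0
        lower = beta.ppf(a, n_exceed, n_days - n_exceed + 1) if n_exceed > 0 else 0.0
        upper = beta.ppf(1.0 - a, n_exceed + 1, n_days - n_exceed) if n_exceed < n_days else 1.0
        return lower, upper
    ```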

  8. Non-parametric trend analysis of the aridity index for three large arid and semi-arid basins in Iran

    Science.gov (United States)

    Ahani, Hossien; Kherad, Mehrzad; Kousari, Mohammad Reza; van Roosmalen, Lieke; Aryanfar, Ramin; Hosseini, Seyyed Mashaallah

    2013-05-01

    Currently, an important scientific challenge that researchers are facing is to gain a better understanding of climate change at the regional scale, which can be especially challenging in an area with low and highly variable precipitation amounts such as Iran. Trend analysis of the medium-term change using ground station observations of meteorological variables can enhance our knowledge of the dominant processes in an area and contribute to the analysis of future climate projections. Generally, studies focus on the long-term variability of temperature and precipitation and to a lesser extent on other important parameters such as moisture indices. In this study the recent 50-year trends (1955-2005) of precipitation (P), potential evapotranspiration (PET), and aridity index (AI) in monthly time scale were studied over 14 synoptic stations in three large Iran basins using the Mann-Kendall non-parametric test. Additionally, an analysis of the monthly, seasonal and annual trend of each parameter was performed. Results showed no significant trends in the monthly time series. However, PET showed significant, mostly decreasing trends, for the seasonal values, which resulted in a significant negative trend in annual PET at five stations. Significant negative trends in seasonal P values were only found at a number of stations in spring and summer and no station showed significant negative trends in annual P. Due to the varied positive and negative trends in annual P and to a lesser extent PET, almost as many stations with negative as positive trends in annual AI were found, indicating that both drying and wetting trends occurred in Iran. Overall, the northern part of the study area showed an increasing trend in annual AI which meant that the region became wetter, while the south showed decreasing trends in AI.
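
    A compact version of the Mann-Kendall statistic used in such trend analyses, without the tie or serial-correlation corrections that a production analysis of monthly series would normally require.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Mann-Kendall trend test for a single series (no tie or autocorrelation
        correction). Returns the S statistic, the normalised Z and a two-sided p."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        diffs = np.sign(x[None, :] - x[:, None])     # sign(x_j - x_i)
        s = diffs[np.triu_indices(n, k=1)].sum()     # sum over i < j
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return s, z, 2.0 * norm.sf(abs(z))
    ```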

  9. Residue conservation and dimer-interface analysis of olfactory receptor molecular models

    Directory of Open Access Journals (Sweden)

    Ramanathan Sowdhamini

    2012-10-01

    Full Text Available Olfactory Receptors (ORs) are members of the Class A rhodopsin-like G-protein coupled receptors (GPCRs), which are the initial players in the signal transduction cascade, leading to the generation of nerve impulses transmitted to the brain and resulting in the detection of odorant molecules. Despite the accumulation of thousands of olfactory receptor sequences, no crystal structures of ORs are known to date. However, the recent availability of crystallographic models of a few GPCRs allows us to generate homology models of ORs and analyze their amino acid patterns, as there is a huge diversity in OR sequences. In this study, we have generated three-dimensional models of 100 representative ORs from Homo sapiens, Mus musculus, Drosophila melanogaster, Caenorhabditis elegans and Saccharomyces cerevisiae, which were selected on the basis of a composite classification scheme and phylogenetic analysis. The crystal structure of bovine rhodopsin was used as a template and it was found that the full-length models have more than 90% of their residues in allowed regions of the Ramachandran plot. The structures were further used for analysis of conserved residues in the transmembrane and extracellular loop regions in order to identify functionally important residues. Several ORs are known to be functional as dimers and hence dimer interfaces were predicted for OR models to analyse their oligomeric functional state.

  10. Mineral analysis, anthocyanins and phenolic compounds in wine residues flour

    Directory of Open Access Journals (Sweden)

    Bennemann Gabriela Datsch

    2016-01-01

    Full Text Available This study analyzed the mineral content (N, P, K, S, Ca, Fe, Mg, Mn, Fe and Zn), anthocyanins and phenolic compounds in flours produced from residues of different grape cultivars from wineries in the Southern region of Brazil. Mineral analysis showed a significant difference for all grape cultivars, with the exception of phosphorus content. Residues from cv. Seibel showed higher levels of N, Cu and Mg. The cultivars Ancelotta, Tannat and Bordô presented higher contents of K, Zn, Mn, Fe and Ca. For the concentration of anthocyanins, the cultivars Cabernet Sauvignon (114.7 mg/100 g), Tannat (88.5 mg/100 g) and Ancelotta (33.8 mg/100 g) had the highest concentrations. The cultivars Pinot Noir (7.0 g AGE/100 g), Tannat (4.3 g AGE/100 g) and Ancelotta (3.9 g AGE/100 g) had the highest content of phenolic compounds. Considering these results, the potential of using winemaking residue to produce flour for human consumption became evident, highlighting the grapes 'Tannat' and 'Ancellotta'.

  11. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values are calculated of the model residuals and stored in a 'three dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the larger public is not interested in in-depth information on the uncertainty on the predicted water levels, but only is interested in information on the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways in presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.) each with their advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is by probabilistic flood mapping. These maps give a representation of the
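
    A sketch of the 'three-dimensional error matrix' idea, assuming historical forecasts, lead times and residuals are available as flat arrays: residuals are binned by forecast level and lead time and summarized by percentiles. The 3D interpolation step used for new forecasts is left out, and all names are illustrative.

    ```python
    import numpy as np

    def build_error_matrix(levels, lead_times, residuals, level_bins, lead_bins,
                           percentiles=(5, 25, 50, 75, 95)):
        """Percentiles of historical forecast residuals per (level, lead time) class.

        Returns an array of shape (n_level_bins, n_lead_bins, n_percentiles);
        cells with no historical data are left as NaN.
        """
        li = np.digitize(levels, level_bins) - 1
        ti = np.digitize(lead_times, lead_bins) - 1
        out = np.full((len(level_bins) - 1, len(lead_bins) - 1, len(percentiles)), np.nan)
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                sel = np.asarray(residuals)[(li == i) & (ti == j)]
                if sel.size:
                    out[i, j] = np.percentile(sel, percentiles)
        return out
    ```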

  12. Bioethanol production from forestry residues: A comparative techno-economic analysis

    International Nuclear Information System (INIS)

    Frankó, Balázs; Galbe, Mats; Wallberg, Ola

    2016-01-01

    Highlights: • A proposed cellulosic ethanol biorefinery in Sweden was simulated with Aspen Plus. • Forestry residues with different bark contents were evaluated as raw materials. • The bark content negatively influenced the minimum ethanol selling price. • Sensitivity analyses were performed to assess the influence of raw material cost. - Abstract: A techno-economic analysis was conducted to assess the feasibility of using forestry residues with different bark contents for bioethanol production. A proposed cellulosic ethanol biorefinery in Sweden was simulated with Aspen Plus. The plant was assumed to convert different forestry assortments (sawdust and shavings, fuel logs, early thinnings, tops and branches, hog fuel and pulpwood) to ethanol, pellets, biogas and electricity. The intention was not to obtain absolute ethanol production costs for future facilities, but to assess and compare the future potential of utilizing different forestry residues for bioethanol production. The same plant design and operating conditions were assumed in all cases, and the effect of including bark on the whole conversion process, especially how it influenced the ethanol production cost, was studied. While the energy efficiency (not including district heating) obtained for the whole process was between 67 and 69% regardless of the raw material used, the ethanol production cost differed considerably; the minimum ethanol selling price ranging from 0.77 to 1.52 USD/L. Under the basic assumptions, all the forestry residues apart from sawdust and shavings exhibited a negative net present value at current market prices. The profitability decreased with increasing bark content of the raw material. Sensitivity analyses showed that, at current market prices, the utilization of bark-containing forestry residues will not provide significant cost improvement compared with pulpwood unless the conversion of cellulose and hemicellulose to monomeric sugars is improved.

  13. Residual volume in vials of antibiotics used in pediatrics.

    Science.gov (United States)

    Chaves, Caroline Magna Pessoa; Bezerra, Carolina Martins; Lima, Francisca Elisângela Teixeira; Cardoso, Maria Vera Lúcia Moreira Leitão; Fonseca, Said Gonçalves da Cruz; Silva, Viviane Martins da

    2017-06-12

    Quantifying the residual volume contained in vials of antibiotics used in pediatrics. This is an experiment involving samples from vials of antibiotics used in a pediatric hospital. Residual volume was identified by calculating the difference in weight measurement before and after the vials were washed. Evaluation of the residual volume difference in the vials was determined by the one-sample Wilcoxon non-parametric test at a significance level of 5%. 105 samples of antibiotics were selected. The correct use of the antibiotics oxacillin (88.57%) and ceftriaxone (94.28%) predominated, with low residual values. The same did not occur for procaine benzylpenicillin + potassium benzylpenicillin, since a greater residual volume was discarded in 74.28% of the vials. We highlight the need for improvements in managing antibiotics in the institution under study, so that the excess volume of the antibiotics in the vials is used within the acceptable stability time. It is also necessary that the residual volume be disposed of adequately, since it presents a risk to public health and the environment.
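
    A toy illustration of the one-sample Wilcoxon comparison described above, with made-up residual volumes and an assumed reference value; it only shows the mechanics of the test, not the study's data.

    ```python
    from scipy.stats import wilcoxon

    # Hypothetical residual volumes (mL) for one antibiotic, illustration only.
    residual_ml = [0.12, 0.08, 0.15, 0.30, 0.05, 0.22, 0.18, 0.11]
    reference_ml = 0.10                      # assumed acceptable residual volume

    # One-sample Wilcoxon signed-rank test of the median against the reference.
    stat, p = wilcoxon([v - reference_ml for v in residual_ml])
    print(f"W = {stat:.1f}, p = {p:.3f}")    # compare p against alpha = 0.05
    ```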

  14. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.

    2012-12-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal assumptions regarding the form of the distribution functions of X and Y. We discuss an approach to the estimation problem that is based on asymptotic likelihood considerations. Our results enable us to provide a methodology that can be implemented easily and which yields estimators that are often near optimal when compared to fully parametric methods. We evaluate the performance of the estimators in a series of Monte Carlo simulations. © 2012 Elsevier B.V. All rights reserved.
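
    A very simple quantile-matching baseline, not the likelihood-based estimator proposed in the paper: the scale is estimated from the ratio of interquartile ranges and the location from the medians.

    ```python
    import numpy as np

    def location_scale_quantile(x, y):
        """Crude estimates of (mu, sigma) in Y =d mu + sigma * X by matching
        medians and interquartile ranges of the two samples."""
        iqr_x = np.percentile(x, 75) - np.percentile(x, 25)
        iqr_y = np.percentile(y, 75) - np.percentile(y, 25)
        sigma = iqr_y / iqr_x
        mu = np.median(y) - sigma * np.median(x)
        return mu, sigma
    ```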

  15. Hyperspectral image segmentation using a cooperative nonparametric approach

    Science.gov (United States)

    Taher, Akar; Chehdi, Kacem; Cariou, Claude

    2013-10-01

    In this paper a new unsupervised nonparametric cooperative and adaptive hyperspectral image segmentation approach is presented. The hyperspectral images are partitioned band by band in parallel and intermediate classification results are evaluated and fused to obtain the final segmentation result. Two unsupervised nonparametric segmentation methods are used in parallel cooperation, namely the Fuzzy C-means (FCM) method and the Linde-Buzo-Gray (LBG) algorithm, to segment each band of the image. The originality of the approach lies firstly in its local adaptation to the type of regions in an image (textured, non-textured), and secondly in the introduction of several levels of evaluation and validation of intermediate segmentation results before obtaining the final partitioning of the image. For the management of similar or conflicting results arising from the two classification methods, we gradually introduced various assessment steps that exploit the information of each spectral band and its adjacent bands, and finally the information of all the spectral bands. In our approach, the detected textured and non-textured regions are treated separately from the feature extraction step up to the final classification results. This approach was first evaluated on a large number of monocomponent images constructed from the Brodatz album. Then it was evaluated on two real applications using, respectively, a multispectral image for cedar tree detection in the region of Baabdat (Lebanon) and a hyperspectral image for identification of invasive and non-invasive vegetation in the region of Cieza (Spain). The correct classification rate (CCR) for the first application is over 97%, and for the second application the average correct classification rate (ACCR) is over 99%.

  16. Transition redshift: new constraints from parametric and nonparametric methods

    Energy Technology Data Exchange (ETDEWEB)

    Rani, Nisha; Mahajan, Shobhit; Mukherjee, Amitabha [Department of Physics and Astrophysics, University of Delhi, New Delhi 110007 (India); Jain, Deepak [Deen Dayal Upadhyaya College, University of Delhi, New Delhi 110015 (India); Pires, Nilza, E-mail: nrani@physics.du.ac.in, E-mail: djain@ddu.du.ac.in, E-mail: shobhit.mahajan@gmail.com, E-mail: amimukh@gmail.com, E-mail: npires@dfte.ufrn.br [Departamento de Física Teórica e Experimental, UFRN, Campus Universitário, Natal, RN 59072-970 (Brazil)

    2015-12-01

    In this paper, we use the cosmokinematics approach to study the accelerated expansion of the Universe. This is a model-independent approach and depends only on the assumption that the Universe is homogeneous and isotropic and is described by the FRW metric. We parametrize the deceleration parameter, q(z), to constrain the transition redshift (z_t) at which the expansion of the Universe goes from a decelerating to an accelerating phase. We use three different parametrizations of q(z), namely q_I(z) = q_1 + q_2 z, q_II(z) = q_3 + q_4 ln(1 + z) and q_III(z) = 1/2 + q_5/(1 + z)^2. A joint analysis of the age of galaxies, strong lensing and supernovae Ia data indicates that the transition redshift is less than unity, i.e. z_t < 1. We also use a nonparametric approach (LOESS+SIMEX) to constrain z_t. This too gives z_t < 1, which is consistent with the value obtained by the parametric approach.
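
    As a quick illustration of how a transition redshift follows from such a parametrization, the sketch below solves q_II(z) = 0 with Brent's method for made-up coefficients (not the paper's fitted values).

    ```python
    import math
    from scipy.optimize import brentq

    # Illustrative coefficients only -- not the paper's fitted values.
    q3, q4 = -0.6, 1.4

    def q_II(z):
        """Deceleration parameter q_II(z) = q_3 + q_4 * ln(1 + z)."""
        return q3 + q4 * math.log(1.0 + z)

    z_t = brentq(q_II, 0.0, 3.0)   # root of q(z) = 0, the transition redshift
    print(f"z_t = {z_t:.3f}")      # about 0.53 for these coefficients, i.e. z_t < 1
    ```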

  17. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.

  18. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.

  19. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable...... of the energy at mean-level crossings, which yields the damping relative to white noise intensity. Finally, an estimate of the noise intensity is extracted by estimating the absolute damping from the autocovariance functions of a set of modified phase plane variables at different energy levels. The method...

  20. Bayesian nonparametric clustering in phylogenetics: modeling antigenic evolution in influenza.

    Science.gov (United States)

    Cybis, Gabriela B; Sinsheimer, Janet S; Bedford, Trevor; Rambaut, Andrew; Lemey, Philippe; Suchard, Marc A

    2018-01-30

    Influenza is responsible for up to 500,000 deaths every year, and antigenic variability represents much of its epidemiological burden. To visualize antigenic differences across many viral strains, antigenic cartography methods use multidimensional scaling on binding assay data to map influenza antigenicity onto a low-dimensional space. Analysis of such assay data ideally leads to natural clustering of influenza strains of similar antigenicity that correlate with sequence evolution. To understand the dynamics of these antigenic groups, we present a framework that jointly models genetic and antigenic evolution by combining multidimensional scaling of binding assay data, Bayesian phylogenetic machinery and nonparametric clustering methods. We propose a phylogenetic Chinese restaurant process that extends the current process to incorporate the phylogenetic dependency structure between strains in the modeling of antigenic clusters. With this method, we are able to use the genetic information to better understand the evolution of antigenicity throughout epidemics, as shown in applications of this model to H1N1 influenza. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Testing a parametric function against a nonparametric alternative in IV and GMM settings

    DEFF Research Database (Denmark)

    Gørgens, Tue; Wurtz, Allan

    This paper develops a specification test for functional form for models identified by moment restrictions, including IV and GMM settings. The general framework is one where the moment restrictions are specified as functions of data, a finite-dimensional parameter vector, and a nonparametric real ...

  2. Efficient identification of critical residues based only on protein structure by network analysis.

    Directory of Open Access Journals (Sweden)

    Michael P Cusack

    2007-05-01

    Full Text Available Despite the increasing number of published protein structures, and the fact that each protein's function relies on its three-dimensional structure, there is limited access to automatic programs used for the identification of critical residues from the protein structure, compared with those based on protein sequence. Here we present a new algorithm based on network analysis applied exclusively on protein structures to identify critical residues. Our results show that this method identifies critical residues for protein function with high reliability and improves automatic sequence-based approaches and previous network-based approaches. The reliability of the method depends on the conformational diversity screened for the protein of interest. We have designed a web site to give access to this software at http://bis.ifc.unam.mx/jamming/. In summary, a new method is presented that relates critical residues for protein function with the most traversed residues in networks derived from protein structures. A unique feature of the method is the inclusion of the conformational diversity of proteins in the prediction, thus reproducing a basic feature of the structure/function relationship of proteins.
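
    A minimal contact-network sketch in the spirit of the method: residues whose C-alpha atoms lie within a distance cutoff are connected, and betweenness centrality flags the 'most traversed' residues. The 8 Å cutoff and the use of a single conformation, rather than the conformational ensemble the authors emphasize, are simplifying assumptions; this is not the JAMMING implementation itself.

    ```python
    import numpy as np
    import networkx as nx

    def residue_betweenness(ca_coords, cutoff=8.0):
        """Betweenness centrality of residues in a contact network built from
        C-alpha coordinates; highly central ('most traversed') residues are
        candidate critical residues."""
        ca_coords = np.asarray(ca_coords)
        dist = np.linalg.norm(ca_coords[:, None, :] - ca_coords[None, :, :], axis=-1)
        g = nx.Graph()
        g.add_nodes_from(range(len(ca_coords)))
        for i in range(len(ca_coords)):
            for j in range(i + 1, len(ca_coords)):
                if dist[i, j] <= cutoff:
                    g.add_edge(i, j)
        return nx.betweenness_centrality(g)
    ```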

  3. ELASTIC-PLASTIC AND RESIDUAL STRESS ANALYSIS OF AN ALUMINUM DISC UNDER INTERNAL PRESSURES

    Directory of Open Access Journals (Sweden)

    Numan Behlül BEKTAŞ

    2004-02-01

    Full Text Available This paper deals with elastic-plastic stress analysis of a thin aluminum disc under internal pressures. An analytical solution is performed satisfying the elastic-plastic stress-strain relations and boundary conditions for small plastic deformations. The von Mises criterion is used as the yield criterion, and an elastic-perfectly plastic material is assumed. Elastic-plastic and residual stress distributions are obtained from the inner radius to the outer radius, and they are presented in tables and figures. All radial stress components, σr, are compressive, and they are highest at the inner radius. All tangential stress components, σθ, are tensile, and they are highest where the plastic deformation begins. The magnitude of the tangential residual stresses is higher than that of the radial residual stresses.

  4. Proposing a framework for airline service quality evaluation using Type-2 Fuzzy TOPSIS and non-parametric analysis

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper focuses on evaluating airline service quality from the passengers' perspective. A great deal of research on airline service quality evaluation has been performed worldwide, but little has been conducted in Iran so far. In this study, a framework for measuring airline service quality in Iran is proposed. After reviewing airline service quality criteria, the SSQAI model was selected because of its comprehensiveness in covering airline service quality dimensions. SSQAI questionnaire items were redesigned to fit the requirements of Iranian airlines and the environmental circumstances of Iran's economic and cultural context. This study applies fuzzy decision-making theory, considering the possible fuzzy subjective judgment of the evaluators during airline service quality evaluation. Fuzzy TOPSIS has been applied for ranking the airlines' service quality performance. Three major Iranian airlines with the largest passenger transfer volumes on domestic and foreign flights were chosen for evaluation in this research. Results demonstrated that Mahan airline achieved the best service quality rank among the three major Iranian airlines in gaining passengers' satisfaction through the delivery of high-quality services. IranAir and Aseman airlines placed second and third, respectively, according to the passengers' evaluation. Statistical analysis was used in analyzing passenger responses. Due to the non-normality of the data, non-parametric tests were applied. To compare airline ranks in each criterion separately, the Friedman test was performed. Analysis of variance and the Tukey test were applied to study the influence of passengers' age and educational level on their degree of satisfaction with the airlines' service quality. Results showed that age has no significant relation to passenger satisfaction with airlines; however, an increase in educational level demonstrated a negative impact on

  5. Analysis and modeling of simulated residual stress of mold injected plastic parts by using robust correlations

    OpenAIRE

    Vargas, Carlos; Sierra, Juan; Posada, Juan; Botero-Cadavid, Juan F.

    2017-01-01

    ABSTRACT The injection molding process is the most widely used processing technique for polymers. The analysis of residual stresses generated during this process is crucial for the part quality assessment. The present study evaluates the residual stresses in a tensile strength specimen using the simulation software Moldex3D for two polymers, polypropylene and polycarbonate. The residual stresses obtained under a simulated design of experiment were modeled using a robust multivariable regressi...

  6. A simple non-parametric goodness-of-fit test for elliptical copulas

    Directory of Open Access Journals (Sweden)

    Jaser Miriam

    2017-12-01

    Full Text Available In this paper, we propose a simple non-parametric goodness-of-fit test for elliptical copulas of any dimension. It is based on the equality of Kendall’s tau and Blomqvist’s beta for all bivariate margins. Nominal level and power of the proposed test are investigated in a Monte Carlo study. An empirical application illustrates our goodness-of-fit test at work.
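
    The quantity the test is built on can be computed directly, as in the sketch below: for an elliptical copula Kendall's tau and Blomqvist's beta coincide, so a large gap for any bivariate margin speaks against the elliptical assumption. The test's null distribution and the aggregation over all margins are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    def tau_vs_blomqvist(x, y):
        """Kendall's tau and Blomqvist's beta for one bivariate margin; under an
        elliptical copula the two coincide."""
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        tau, _ = kendalltau(x, y)
        sx = np.sign(x - np.median(x))
        sy = np.sign(y - np.median(y))
        beta = np.mean(sx * sy)
        return tau, beta
    ```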

  7. Bayesian Nonparametric Mixture Estimation for Time-Indexed Functional Data in R

    Directory of Open Access Journals (Sweden)

    Terrance D. Savitsky

    2016-08-01

    Full Text Available We present growfunctions for R that offers Bayesian nonparametric estimation models for analysis of dependent, noisy time series data indexed by a collection of domains. This data structure arises from combining periodically published government survey statistics, such as are reported in the Current Population Study (CPS). The CPS publishes monthly, by-state estimates of employment levels, where each state expresses a noisy time series. Published state-level estimates from the CPS are composed from household survey responses in a model-free manner and express high levels of volatility due to insufficient sample sizes. Existing software solutions borrow information over a modeled time-based dependence to extract a de-noised time series for each domain. These solutions, however, ignore the dependence among the domains that may be additionally leveraged to improve estimation efficiency. The growfunctions package offers two fully nonparametric mixture models that simultaneously estimate both a time and domain-indexed dependence structure for a collection of time series: (1) A Gaussian process (GP) construction, which is parameterized through the covariance matrix, estimates a latent function for each domain. The covariance parameters of the latent functions are indexed by domain under a Dirichlet process prior that permits estimation of the dependence among functions across the domains. (2) An intrinsic Gaussian Markov random field prior construction provides an alternative to the GP that expresses different computation and estimation properties. In addition to performing denoised estimation of latent functions from published domain estimates, growfunctions allows estimation of collections of functions for observation units (e.g., households), rather than aggregated domains, by accounting for an informative sampling design under which the probabilities for inclusion of observation units are related to the response variable. growfunctions includes plot

  8. Rigid Residue Scan Simulations Systematically Reveal Residue Entropic Roles in Protein Allostery.

    Directory of Open Access Journals (Sweden)

    Robert Kalescky

    2016-04-01

    Intra-protein information is transmitted over distances via allosteric processes. This ubiquitous protein process allows for protein function changes due to ligand binding events. Understanding protein allostery is essential to understanding protein functions. In this study, allostery in the second PDZ domain (PDZ2) in the human PTP1E protein is examined as a model system to advance a recently developed rigid residue scan method combined with configurational entropy calculation and principal component analysis. The contributions from individual residues to whole-protein dynamics and allostery were systematically assessed via rigid body simulations of both unbound and ligand-bound states of the protein. The entropic contributions of individual residues to whole-protein dynamics were evaluated based on covariance-based correlation analysis of all simulations. The changes in overall protein entropy when individual residues are held rigid support the view that the rigidity/flexibility equilibrium in protein structure is governed by Le Châtelier's principle of chemical equilibrium. Key residues of PDZ2 allostery were identified, in good agreement with NMR studies of the same protein bound to the same peptide. On the other hand, the change of entropic contribution from each residue upon perturbation revealed intrinsic differences among all the residues. The quasi-harmonic and principal component analyses of simulations without rigid residue perturbation showed a coherent allosteric mode for the unbound and bound states, respectively. The projection of simulations with rigid residue perturbation onto the coherent allosteric modes demonstrated intrinsic shifting of the ensemble distributions, supporting the population-shift theory of protein allostery. Overall, the study presented here provides a robust and systematic approach to estimate the contribution of individual residue internal motion to overall protein dynamics and allostery.
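
    A minimal sketch of the principal component step described above, assuming NumPy: principal modes are computed from an unperturbed (toy) trajectory and perturbed frames are projected onto the leading mode to expose a shift of the ensemble distribution. The array shapes and synthetic trajectories are hypothetical stand-ins for aligned MD coordinates.

```python
import numpy as np

def principal_modes(trajectory):
    """PCA of an aligned trajectory with shape (n_frames, 3 * n_atoms)."""
    centered = trajectory - trajectory.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]                 # sort by decreasing variance
    return evals[order], evecs[:, order]

def project(trajectory, mode, reference_mean):
    """Project frames onto one principal mode to compare ensemble distributions."""
    return (trajectory - reference_mean) @ mode

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    ref = rng.normal(size=(500, 30))                            # toy unperturbed run
    perturbed = ref + rng.normal(0.2, 0.05, size=ref.shape)     # toy rigid-residue run
    evals, evecs = principal_modes(ref)
    mean_ref = project(ref, evecs[:, 0], ref.mean(axis=0)).mean()
    mean_pert = project(perturbed, evecs[:, 0], ref.mean(axis=0)).mean()
    print("ensemble shift along mode 1:", round(float(mean_pert - mean_ref), 3))
```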

  9. Nonparametric inference of network structure and dynamics

    Science.gov (United States)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and how to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high-dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion that does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among

  10. gRINN: a tool for calculation of residue interaction energies and protein energy network analysis of molecular dynamics simulations.

    Science.gov (United States)

    Serçinoglu, Onur; Ozbek, Pemra

    2018-05-25

    Atomistic molecular dynamics (MD) simulations generate a wealth of information related to the dynamics of proteins. If properly analyzed, this information can lead to new insights regarding protein function and assist wet-lab experiments. Aiming to identify interactions between individual amino acid residues and the role played by each in the context of MD simulations, we present a stand-alone software called gRINN (get Residue Interaction eNergies and Networks). gRINN features graphical user interfaces (GUIs) and a command-line interface for generating and analyzing pairwise residue interaction energies and energy correlations from protein MD simulation trajectories. gRINN utilizes the features of NAMD or GROMACS MD simulation packages and automatizes the steps necessary to extract residue-residue interaction energies from user-supplied simulation trajectories, greatly simplifying the analysis for the end-user. A GUI, including an embedded molecular viewer, is provided for visualization of interaction energy time-series, distributions, an interaction energy matrix, interaction energy correlations and a residue correlation matrix. gRINN additionally offers construction and analysis of Protein Energy Networks, providing residue-based metrics such as degrees, betweenness-centralities, closeness centralities as well as shortest path analysis. gRINN is free and open to all users without login requirement at http://grinn.readthedocs.io.
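
    The protein energy network construction can be illustrated roughly as below (not gRINN's own code): a symmetric residue interaction energy matrix is thresholded into a graph whose edge lengths shrink with interaction strength, after which degree, betweenness and closeness centralities and shortest paths are read off with networkx. The energy matrix here is synthetic and the cutoff is an arbitrary choice.

```python
import numpy as np
import networkx as nx

def energy_network(interaction_energy, cutoff=-2.0):
    """Build a Protein Energy Network from a symmetric residue-residue
    interaction energy matrix (kcal/mol). Pairs more favourable than `cutoff`
    become edges; edge length is the inverse energy magnitude, so strong
    interactions give short paths."""
    n = interaction_energy.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            e = interaction_energy[i, j]
            if e < cutoff:
                g.add_edge(i, j, weight=1.0 / abs(e))
    return g

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    m = -rng.exponential(1.0, size=(50, 50))        # toy favourable pair energies
    m = (m + m.T) / 2.0
    np.fill_diagonal(m, 0.0)
    g = energy_network(m)
    betweenness = nx.betweenness_centrality(g, weight="weight")
    closeness = nx.closeness_centrality(g, distance="weight")
    hub = max(betweenness, key=betweenness.get)
    print("highest-betweenness residue:", hub,
          "| degree:", g.degree[hub], "| closeness:", round(closeness[hub], 3))
    if nx.has_path(g, 0, 25):
        print("shortest path 0 -> 25:", nx.shortest_path(g, 0, 25, weight="weight"))
```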

  11. Mutational analysis to identify the residues essential for the inhibition of N-acetyl glutamate kinase of Corynebacterium glutamicum.

    Science.gov (United States)

    Huang, Yuanyuan; Zhang, Hao; Tian, Hongming; Li, Cheng; Han, Shuangyan; Lin, Ying; Zheng, Suiping

    2015-09-01

    N-acetyl glutamate kinase (NAGK) is a key enzyme in the synthesis of L-arginine that is inhibited by its end product L-arginine in Corynebacterium glutamicum (C. glutamicum). In this study, the potential binding sites of arginine and the residues essential for its inhibition were identified by homology modeling, inhibitor docking, and site-directed mutagenesis. The allosteric inhibition of NAGK was successfully alleviated by a mutation, as determined through analysis of mutant enzymes, which were overexpressed in vivo in C. glutamicum ATCC14067. Analysis of the mutant enzymes and docking analysis demonstrated that residue W23 positions an arginine molecule, and the interaction between arginine and residues L282, L283, and T284 may play an important role in the remote inhibitory process. Based on the results of the docking analysis of the effective mutants, we propose a linkage mechanism for the remote allosteric regulation of NAGK activity, in which residue R209 may play an essential role. In this study, the structure of the arginine-binding site of C. glutamicum NAGK (CgNAGK) was successfully predicted and the roles of the relevant residues were identified, providing new insight into the allosteric regulation of CgNAGK activity and a solid platform for the future construction of an optimized L-arginine producing strain.

  12. Development of a micrometre-scale radiographic measuring method for residual stress analysis

    International Nuclear Information System (INIS)

    Moeller, D.

    1999-01-01

    The radiographic method described uses micrometre-scale X-ray diffraction for high-resolution residual stress analysis in single crystals. The focus is on the application of two X-ray optics (glass capillaries) for shaping a sufficiently fine and intense primary beam. Owing to the application of a suitable one-grain measuring and analysis method, the achievable resolution is applicable to the characteristic grain sizes of many materials. (orig.) [de]

  13. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  14. Nonparametric Change Point Diagnosis Method of Concrete Dam Crack Behavior Abnormality

    OpenAIRE

    Li, Zhanchao; Gu, Chongshi; Wu, Zhongru

    2013-01-01

    The study of diagnosis methods for concrete crack behavior abnormality has long been a key and difficult topic in the safety monitoring of hydraulic structures. Based on the performance of concrete dam crack behavior abnormality in parametric and nonparametric statistical models, the internal relation between concrete dam crack behavior abnormality and statistical change point theory is analyzed in depth, starting from the model structure instability of the parametric statistical model ...
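
    Since the abstract is truncated, the sketch below shows one standard rank-based change-point statistic (a Pettitt-type test) applied to a monitored crack-opening series, as a generic illustration of nonparametric change-point diagnosis rather than the authors' specific method; NumPy is assumed.

```python
import numpy as np

def pettitt_test(x):
    """Rank-based (Pettitt) change-point test: returns the most probable
    change-point index, the statistic K and an approximate two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    signs = np.sign(x[None, :] - x[:, None])          # signs[i, j] = sign(x_j - x_i)
    u = np.array([signs[: t + 1, t + 1:].sum() for t in range(n - 1)])
    t_hat = int(np.argmax(np.abs(u)))
    k = float(np.abs(u[t_hat]))
    p_value = 2.0 * np.exp(-6.0 * k ** 2 / (n ** 3 + n ** 2))
    return t_hat, k, min(p_value, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    series = np.concatenate([rng.normal(0.20, 0.02, 120),   # stable crack opening (mm)
                             rng.normal(0.28, 0.02, 60)])   # abnormal behaviour
    print(pettitt_test(series))
```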

  15. Large-scale evaluation of dynamically important residues in proteins predicted by the perturbation analysis of a coarse-grained elastic model

    Directory of Open Access Journals (Sweden)

    Tekpinar Mustafa

    2009-07-01

    Background: It is increasingly recognized that protein functions often require intricate conformational dynamics, which involves a network of key amino acid residues that couple spatially separated functional sites. Tremendous efforts have been made to identify these key residues by experimental and computational means. Results: We have performed a large-scale evaluation of the predictions of dynamically important residues by a variety of computational protocols including three based on the perturbation and correlation analysis of a coarse-grained elastic model. This study is performed for two lists of test cases with >500 pairs of protein structures. The dynamically important residues predicted by the perturbation and correlation analysis are found to be strongly or moderately conserved in >67% of test cases. They form a sparse network of residues which are clustered both in 3D space and along protein sequence. Their overall conservation is attributed to their dynamic role rather than ligand binding or high network connectivity. Conclusion: By modeling how the protein structural fluctuations respond to residue-position-specific perturbations, our highly efficient perturbation and correlation analysis can be used to dissect the functional conformational changes in various proteins with a residue level of detail. The predictions of dynamically important residues serve as promising targets for mutational and functional studies.
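
    A rough sketch of what a perturbation analysis of a coarse-grained elastic model can look like is given below, assuming NumPy: a Gaussian-network-model connectivity matrix is built from C-alpha coordinates, and the global fluctuation profile is recomputed after stiffening the springs of each residue in turn; residues whose perturbation changes the profile most are candidate dynamically important residues. The cutoff, stiffening factor and toy coordinates are arbitrary, and this is not the exact protocol evaluated in the paper.

```python
import numpy as np

def kirchhoff(coords, cutoff=7.5):
    """Gaussian-network-model connectivity matrix from C-alpha coordinates (angstrom)."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    contact = (dist < cutoff).astype(float)
    np.fill_diagonal(contact, 0.0)
    gamma = -contact
    np.fill_diagonal(gamma, contact.sum(axis=1))
    return gamma

def fluctuations(gamma):
    """Residue mean-square fluctuations are proportional to the pseudo-inverse diagonal."""
    return np.diag(np.linalg.pinv(gamma))

def perturbation_scores(coords, stiffen=10.0):
    """Change in the fluctuation profile when each residue's springs are stiffened."""
    gamma = kirchhoff(coords)
    ref = fluctuations(gamma)
    scores = np.zeros(len(coords))
    for i in range(len(coords)):
        g = gamma.copy()
        g[i, :] *= stiffen
        g[:, i] *= stiffen
        np.fill_diagonal(g, 0.0)
        np.fill_diagonal(g, -g.sum(axis=1))     # restore zero row sums
        scores[i] = np.linalg.norm(fluctuations(g) - ref)
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(13)
    ca = np.cumsum(rng.normal(0.0, 2.5, size=(60, 3)), axis=0)   # toy C-alpha trace
    scores = perturbation_scores(ca)
    print("candidate key residues:", np.argsort(scores)[-5:])
```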

  16. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

    Background: To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods: We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing the data from a study conducted by the German Research Foundation aiming to set a threshold limit value for exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results: In the paper we demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant risk assumption when the true dose-response relationship is weak. Conclusion: The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that there is no gold-standard method in the threshold value estimation context, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
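
    The isotonic step-function idea can be sketched as follows with scikit-learn (a toy version, not the reduced isotonic regression or closed-testing algorithms discussed in the paper): fit a monotone step function of risk against exposure and take the jump locations as candidate thresholds; the exposure-response data are simulated.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(5)
n = 400
dose = rng.uniform(0.0, 10.0, n)                        # e.g. total dust exposure
true_threshold = 4.0
risk = np.where(dose > true_threshold, 0.35, 0.10)      # elbow-shaped true risk
y = rng.binomial(1, risk)                                # chronic bronchitis indicator

iso = IsotonicRegression(increasing=True, out_of_bounds="clip").fit(dose, y)
order = np.argsort(dose)
fitted = iso.predict(dose[order])

# Candidate threshold locations are the doses at which the fitted step function jumps.
jumps = np.where(np.diff(fitted) > 0)[0]
print("candidate thresholds:", np.round(dose[order][jumps + 1], 2))
```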

  17. A Multi-Factor Analysis of Sustainable Agricultural Residue Removal Potential

    Energy Technology Data Exchange (ETDEWEB)

    Jared Abodeely; David Muth; Paul Adler; Eleanor Campbell; Kenneth Mark Bryden

    2012-10-01

    Agricultural residues have significant potential as a near term source of cellulosic biomass for bioenergy production, but sustainable removal of agricultural residues requires consideration of the critical roles that residues play in the agronomic system. Previous work has developed an integrated model to evaluate sustainable agricultural residue removal potential considering soil erosion, soil organic carbon, greenhouse gas emission, and long-term yield impacts of residue removal practices. The integrated model couples the environmental process models WEPS, RUSLE2, SCI, and DAYCENT. This study uses the integrated model to investigate the impact of interval removal practices in Boone County, Iowa, US. Residue removal of 4.5 Mg/ha was performed annually, every two years, and every three years, and was compared to no residue removal. The study is performed at the soil type scale using a national soil survey database assuming a continuous corn rotation with reduced tillage. Results are aggregated across soil types to provide county level estimates of soil organic carbon changes and individual soil type soil organic matter content if interval residue removal were implemented. Results show interval residue removal is possible while improving soil organic matter. Implementation of interval removal practices provides greater increases in soil organic matter while still providing substantial residue for bioenergy production.

  18. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
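
    As a toy illustration of the bootstrap idea (not the npROCRegression implementation), the sketch below compares the empirical AUC between two covariate strata and builds a percentile bootstrap interval for the difference; an interval excluding zero suggests a covariate effect on accuracy. The data and strata are simulated, and NumPy is assumed.

```python
import numpy as np

def auc(healthy, diseased):
    """Empirical AUC: probability that a diseased marker exceeds a healthy one."""
    h = np.asarray(healthy)[:, None]
    d = np.asarray(diseased)[None, :]
    return (d > h).mean() + 0.5 * (d == h).mean()

def bootstrap_auc_difference(h1, d1, h2, d2, n_boot=2000, seed=0):
    """Percentile bootstrap for the AUC difference between two covariate strata."""
    rng = np.random.default_rng(seed)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        diffs[b] = (auc(rng.choice(h1, len(h1)), rng.choice(d1, len(d1)))
                    - auc(rng.choice(h2, len(h2)), rng.choice(d2, len(d2))))
    return auc(h1, d1) - auc(h2, d2), np.percentile(diffs, [2.5, 97.5])

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    h1, d1 = rng.normal(0, 1, 150), rng.normal(1.5, 1, 120)   # stratum 1: good accuracy
    h2, d2 = rng.normal(0, 1, 150), rng.normal(0.7, 1, 120)   # stratum 2: weaker accuracy
    obs, ci = bootstrap_auc_difference(h1, d1, h2, d2)
    print(f"AUC difference = {obs:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```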

  19. Pesticide residue quantification analysis by hyperspectral imaging sensors

    Science.gov (United States)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues present in agricultural produce and fruits. This paper conducts a series of baseline experiments particularly designed for three specific pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated with various concentrations of pesticides. Two sensors are used to collect data. One is Fourier Transform Infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, the Geophysical and Environmental Research (GER) 2600 spectroradiometer, a battery-operated, field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures for spectral discrimination are developed. More specifically, new measures for calculating relative power between the two sensors are designed to evaluate the effectiveness of each sensor in quantifying the pesticide residues used. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.

  20. Microstructure and temperature dependence of intergranular strains on diffractometric macroscopic residual stress analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.N., E-mail: Julia.Wagner@kit.edu [KNMF, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Hofmann, M. [Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II), TU München, Lichtenbergstr. 1, 85747 Garching (Germany); Wimpory, R. [Helmholtz-Zentrum Berlin für Materialien und Energie, D-14109 Berlin Wannsee (Germany); Krempaszky, C. [Christian-Doppler-Labor für Werkstoffmechanik von Hochleistungslegierungen, TU München, Boltzmannstr. 15, 85747 Garching (Germany); Lehrstuhl für Werkstoffkunde und Werkstoffmechanik, TU München, Boltzmannstr. 15, 85747 Garching (Germany); Stockinger, M. [Böhler Schmiedetechnik GmbH and Co KG, Mariazeller Straße 25, 8605 Kapfenberg (Austria)

    2014-11-17

    Knowledge of the macroscopic residual stresses in components of complex high performance alloys is crucial when it comes to considering the safety and manufacturing aspects of components. Diffraction experiments are one of the key methods for studying residual stresses. However a component of the residual strain determined by diffraction experiments, known as microstrain or intergranular residual strain, occurs over the length scale of the grains and thus plays only a minor role for the life time of such components. For the reliable determination of macroscopic strains (with the minimum influence of these intergranular residual strains), the ISO standard recommends the use of particular Bragg reflections. Here we compare the build-up of intergranular strain of two different precipitation hardened IN 718 (INCONEL 718) samples, with identical chemical composition. Since intergranular strains are also affected by temperature, results from room temperature measurement are compared to results at T=550 °C. It turned out that microstructural parameters, such as grain size or type of precipitates, have a larger effect on the intergranular strain evolution than the influence of temperature at the measurement temperature of T=550 °C. The results also show that the choice of Bragg reflections for the diffractometric residual stress analysis is dependent not only on its chemical composition, but also on the microstructure of the sample. In addition diffraction elastic constants (DECs) for all measured Bragg reflections are given.

  1. Measuring energy performance with sectoral heterogeneity: A non-parametric frontier approach

    International Nuclear Information System (INIS)

    Wang, H.; Ang, B.W.; Wang, Q.W.; Zhou, P.

    2017-01-01

    Evaluating economy-wide energy performance is an integral part of assessing the effectiveness of a country's energy efficiency policy. Non-parametric frontier approach has been widely used by researchers for such a purpose. This paper proposes an extended non-parametric frontier approach to studying economy-wide energy efficiency and productivity performances by accounting for sectoral heterogeneity. Relevant techniques in index number theory are incorporated to quantify the driving forces behind changes in the economy-wide energy productivity index. The proposed approach facilitates flexible modelling of different sectors' production processes, and helps to examine sectors' impact on the aggregate energy performance. A case study of China's economy-wide energy efficiency and productivity performances in its 11th five-year plan period (2006–2010) is presented. It is found that sectoral heterogeneities in terms of energy performance are significant in China. Meanwhile, China's economy-wide energy productivity increased slightly during the study period, mainly driven by the technical efficiency improvement. A number of other findings have also been reported. - Highlights: • We model economy-wide energy performance by considering sectoral heterogeneity. • The proposed approach can identify sectors' impact on the aggregate energy performance. • Obvious sectoral heterogeneities are identified in evaluating China's energy performance.
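
    The basic non-parametric frontier building block behind such studies is a data envelopment analysis (DEA) programme; a minimal input-oriented, constant-returns sketch with scipy.optimize.linprog is shown below. It is not the authors' extended sectoral model, and the units, inputs and outputs are toy values.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, unit):
    """Input-oriented, constant-returns (CCR) DEA efficiency of one unit.

    Decision variables are [theta, lambda_1, ..., lambda_n]; theta is minimised
    subject to a peer combination using no more input and no less output."""
    n, m = inputs.shape
    s = outputs.shape[1]
    c = np.concatenate([[1.0], np.zeros(n)])
    # Inputs:  sum_j lambda_j x_ij - theta * x_i,unit <= 0
    a_in = np.hstack([-inputs[unit].reshape(-1, 1), inputs.T])
    # Outputs: -sum_j lambda_j y_rj <= -y_r,unit
    a_out = np.hstack([np.zeros((s, 1)), -outputs.T])
    res = linprog(c,
                  A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.concatenate([np.zeros(m), -outputs[unit]]),
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

if __name__ == "__main__":
    energy = np.array([[10.0], [8.0], [12.0], [6.0]])       # toy sector energy input
    value_added = np.array([[5.0], [6.0], [5.5], [4.5]])     # toy sector output
    for j in range(len(energy)):
        print(f"unit {j}: efficiency = {dea_efficiency(energy, value_added, j):.3f}")
```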

  2. Image Analysis to Estimate Mulch Residue in Soil

    Directory of Open Access Journals (Sweden)

    Carmen Moreno

    2014-01-01

    Mulching is used to improve the condition of agricultural soils by covering the soil with different materials, mainly black polyethylene (PE). However, problems arising from its use include how to remove it from the field and, if it remains in the soil, its possible effects on the soil. One possible solution is to use biodegradable plastic (BD) or paper (PP) as mulch, which could present an alternative, reducing nonrecyclable waste and decreasing the environmental pollution associated with it. Determination of mulch residues in the ground is one of the basic requirements to estimate the potential of each material to degrade. This study has the goal of evaluating the residue of several mulch materials over a crop campaign in Central Spain through image analysis. Color images were acquired under similar lighting conditions at the experimental field. Different thresholding methods were applied to binarize the histogram values of the image saturation plane in order to show the best contrast between soil and mulch. Then the percentage of white pixels (i.e., soil area) was used to calculate the mulch deterioration. A comparison of thresholding methods and the different mulch materials based on percentage of bare soil area obtained is shown.
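
    A minimal sketch of the image-analysis step, assuming scikit-image: convert the plot photograph to HSV, binarise the saturation plane with Otsu's threshold (one of several thresholding methods the study compares) and report the fraction of bare-soil pixels. The file name and the assumption about which class is more saturated are hypothetical.

```python
import numpy as np
from skimage import color, io
from skimage.filters import threshold_otsu

def bare_soil_fraction(image_path):
    """Fraction of pixels classified as bare soil in a field photograph.

    Assumes soil and mulch separate reasonably well in the saturation plane;
    depending on the mulch colour the binarisation may need to be inverted."""
    rgb = io.imread(image_path)
    saturation = color.rgb2hsv(rgb[..., :3])[..., 1]
    t = threshold_otsu(saturation)
    soil_mask = saturation > t            # assumption: soil is the more saturated class
    return soil_mask.mean()

if __name__ == "__main__":
    # Hypothetical file name; replace with an actual plot photograph.
    print(f"bare soil: {100 * bare_soil_fraction('plot_bd1_week8.jpg'):.1f} %")
```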

  3. Analysis of shot-peening and residual stress relaxation in the nickel-based superalloy RR1000

    International Nuclear Information System (INIS)

    Foss, B.J.; Gray, S.; Hardy, M.C.; Stekovic, S.; McPhail, D.S.; Shollock, B.A.

    2013-01-01

    This work assesses the residual stress relaxation of the nickel-based alloy RR1000 due to thermal exposure and dwell-fatigue loading. A number of different characterization methods, including X-ray residual stress analysis, electron back-scattered diffraction, microhardness testing and focused ion beam secondary electron imaging, contributed to a detailed study of the shot-peened region. Thermal exposure at 700 °C resulted in a large reduction in the residual stresses and work-hardening effects in the alloy, but the subsurface remained in a beneficial compressive state. Oxidizing environments caused recrystallization in the near surface, but did not affect the residual stress-relaxation behaviour. Dwell-fatigue loading caused the residual stresses to return to approximately zero at nearly all depths. This work forms part of an ongoing investigation to determine the effects of shot-peening in this alloy with the motivation to improve the fatigue and oxidation resistance at 700 °C

  4. Ensemble Kalman filtering with residual nudging

    KAUST Repository

    Luo, X.

    2012-10-03

    Covariance inflation and localisation are two important techniques that are used to improve the performance of the ensemble Kalman filter (EnKF) by (in effect) adjusting the sample covariances of the estimates in the state space. In this work, an additional auxiliary technique, called residual nudging, is proposed to monitor and, if necessary, adjust the residual norms of state estimates in the observation space. In an EnKF with residual nudging, if the residual norm of an analysis is larger than a pre-specified value, then the analysis is replaced by a new one whose residual norm is no larger than a pre-specified value. Otherwise, the analysis is considered as a reasonable estimate and no change is made. A rule for choosing the pre-specified value is suggested. Based on this rule, the corresponding new state estimates are explicitly derived in case of linear observations. Numerical experiments in the 40-dimensional Lorenz 96 model show that introducing residual nudging to an EnKF may improve its accuracy and/or enhance its stability against filter divergence, especially in the small ensemble scenario.
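
    A minimal sketch of the residual-nudging check for a linear observation operator is given below, assuming NumPy; the acceptance threshold is treated as a user-supplied number, whereas the paper derives a specific rule for choosing it.

```python
import numpy as np

def nudge_analysis(x_analysis, y_obs, h, threshold):
    """Replace an EnKF analysis if its observation-space residual norm is too large.

    x_analysis : analysis state estimate (n,)
    y_obs      : observations (p,)
    h          : linear observation operator (p, n)
    threshold  : pre-specified residual-norm bound (the paper gives a rule for it)"""
    residual = y_obs - h @ x_analysis
    norm = np.linalg.norm(residual)
    if norm <= threshold:
        return x_analysis                  # residual acceptable, keep the analysis
    # Move along the pseudo-inverse direction just far enough to hit the threshold.
    fraction = 1.0 - threshold / norm
    return x_analysis + fraction * (np.linalg.pinv(h) @ residual)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n, p = 40, 20
    h = np.eye(p, n)                                   # observe the first 20 variables
    x_true = rng.normal(size=n)
    y = h @ x_true + rng.normal(scale=0.1, size=p)
    x_a = x_true + rng.normal(scale=1.0, size=n)       # a poor analysis
    x_new = nudge_analysis(x_a, y, h, threshold=2.0 * np.sqrt(p) * 0.1)
    print(np.linalg.norm(y - h @ x_a), "->", np.linalg.norm(y - h @ x_new))
```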

  5. Prediction of residual stress distributions due to surface machining and welding and crack growth simulation under residual stress distribution

    International Nuclear Information System (INIS)

    Ihara, Ryohei; Katsuyama, Jinya; Onizawa, Kunio; Hashimoto, Tadafumi; Mikami, Yoshiki; Mochizuki, Masahito

    2011-01-01

    Research highlights: → Residual stress distributions due to welding and machining are evaluated by XRD and FEM. → Residual stress due to machining shows higher tensile stress than welding near the surface. → Crack growth analysis is performed using calculated residual stress. → Crack growth results are affected by machining rather than welding. → Machining is an important factor for crack growth. - Abstract: In nuclear power plants, stress corrosion cracking (SCC) has been observed near the weld zone of the core shroud and primary loop recirculation (PLR) pipes made of low-carbon austenitic stainless steel Type 316L. The joining process of pipes usually includes surface machining and welding. Both processes induce residual stresses, and residual stresses are thus important factors in the occurrence and propagation of SCC. In this study, the finite element method (FEM) was used to estimate residual stress distributions generated by butt welding and surface machining. The thermoelastic-plastic analysis was performed for the welding simulation, and the thermo-mechanical coupled analysis based on the Johnson-Cook material model was performed for the surface machining simulation. In addition, a crack growth analysis based on the stress intensity factor (SIF) calculation was performed using the calculated residual stress distributions that are generated by welding and surface machining. The surface machining analysis showed that tensile residual stress due to surface machining only exists approximately 0.2 mm from the machined surface, and the surface residual stress increases with cutting speed. The crack growth analysis showed that the crack depth is affected by both surface machining and welding, and the crack length is more affected by surface machining than by welding.

  6. Residual subsidence analysis after the end of coal mine work. Example from Lorraine Colliery, France

    International Nuclear Information System (INIS)

    Al Heib, M.; Nicolas, M.; Noirel, J.F.; Wojtkowiak, F.

    2005-01-01

    This paper describes the residual movements associated with deep coal mines. The studied case relates to workings located in the Lorraine coal basin. The paper is divided into two sections. The first describes the subsidence phenomena, especially the residual phase, in terms of amplitude, duration and localization. The second focuses on the Morsbach case: the total and residual subsidence measurements are analyzed and compared to the state of the art as well as current knowledge. The results of the analysis show that the duration of residual movements does not exceed 24 months and that their amplitude is about 5% of the total subsidence. We also analyze the declarations of mining damage during and after the mining period. Damage occurring after this period is probably due to late observations. (authors)

  7. Nonparametric trend estimation in the presence of fractal noise: application to fMRI time-series analysis.

    Science.gov (United States)

    Afshinpour, Babak; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2008-06-30

    Unknown low frequency fluctuations called "trend" are observed in noisy time-series measured for different applications. In some disciplines, they carry primary information while in other fields such as functional magnetic resonance imaging (fMRI) they carry nuisance effects. In all cases, however, it is necessary to estimate them accurately. In this paper, a method for estimating trend in the presence of fractal noise is proposed and applied to fMRI time-series. To this end, a partly linear model (PLM) is fitted to each time-series. The parametric and nonparametric parts of PLM are considered as contributions of hemodynamic response and trend, respectively. Using the whitening property of wavelet transform, the unknown components of the model are estimated in the wavelet domain. The results of the proposed method are compared to those of other parametric trend-removal approaches such as spline and polynomial models. It is shown that the proposed method improves activation detection and decreases variance of the estimated parameters relative to the other methods.
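
    Not the partly linear model estimator itself, but the following sketch (assuming PyWavelets) illustrates the wavelet-domain view of trend extraction: detail coefficients are zeroed and only coarse approximation coefficients are kept, yielding a slow trend estimate that can be subtracted from the voxel time series; the signal is simulated.

```python
import numpy as np
import pywt

def wavelet_trend(signal, wavelet="db4", level=5):
    """Estimate a slow trend by keeping only the coarse approximation coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, mode="periodization", level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet, mode="periodization")[: len(signal)]

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    n = 256                                                     # fMRI time points
    t = np.arange(n)
    drift = 0.02 * t + 2.0 * np.sin(2 * np.pi * t / 180.0)      # scanner drift ("trend")
    response = np.where((t // 20) % 2 == 0, 1.0, 0.0)           # crude block-design signal
    y = drift + response + rng.normal(scale=0.5, size=n)        # noisy voxel time series
    trend = wavelet_trend(y)
    print("trend/drift correlation:", round(float(np.corrcoef(trend, drift)[0, 1]), 3))
```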

  8. Adaptive nonparametric estimation for Lévy processes observed at low frequency

    OpenAIRE

    Kappus, Johanna

    2013-01-01

    This article deals with adaptive nonparametric estimation for Lévy processes observed at low frequency. For general linear functionals of the Lévy measure, we construct kernel estimators, provide upper risk bounds and derive rates of convergence under regularity assumptions. Our focus lies on the adaptive choice of the bandwidth, using model selection techniques. We face here a non-standard problem of model selection with unknown variance. A new approach towards this problem is proposed, ...

  9. Three-dimensional finite element analysis of residual magnetic field for ferromagnets under early damage

    International Nuclear Information System (INIS)

    Yao, Kai; Shen, Kai; Wang, Zheng-Dao; Wang, Yue-Sheng

    2014-01-01

    In this study, 3D finite element analysis is presented by calculating the residual magnetic field signals of ferromagnets under plastic deformation. The contour maps of tangential and normal RMF gradients are given, and the 3D effect is discussed. The results show that the tangential peak–peak amplitude and normal peak–valley amplitude are remarkably different in 2D and 3D simulations, but the tangential peak–peak width and normal peak–valley width are similar. Moreover, some key points are capable of capturing the plastic-zone shape, especially when the lift-off is small enough. The present study suggests an effective defect identification method with the metal magnetic memory (MMM) technique. - Highlights: • Three-dimensional (3D) finite element analysis is presented by calculating the residual magnetic field signals of ferromagnets under plastic deformation. • The contour maps of gradients of the tangential and normal residual magnetic fields are given, and the 3D effect is discussed. • The present study suggests an effective defect identification method with the metal magnetic memory technique

  10. A nonparametric statistical method for determination of a confidence interval for the mean of a set of results obtained in a laboratory intercomparison

    International Nuclear Information System (INIS)

    Veglia, A.

    1981-08-01

    In cases where sets of data are obviously not normally distributed, the application of a nonparametric method for the estimation of a confidence interval for the mean seems to be more suitable than some other methods because such a method requires few assumptions about the population of data. A two-step statistical method is proposed which can be applied to any set of analytical results: elimination of outliers by a nonparametric method based on Tchebycheff's inequality, and determination of a confidence interval for the mean by a non-parametric method based on the binomial distribution. The method is appropriate only for samples of size n ≥ 10.
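
    The two-step flavour of the procedure can be sketched as follows, assuming NumPy/SciPy: a Tchebycheff-style screen discards results far from the mean (the inequality bounds how much of any distribution can lie that far out), and a distribution-free confidence interval for the central value is then read off from order statistics via the binomial distribution. The constants and the use of a median-type interval are illustrative choices, not necessarily those of the report.

```python
import numpy as np
from scipy import stats

def chebyshev_screen(x, k=3.0):
    """Keep results within k standard deviations of the mean; by Tchebycheff's
    inequality at most 1/k**2 (about 11% for k = 3) of any distribution lies further out."""
    x = np.asarray(x, dtype=float)
    return x[np.abs(x - x.mean()) <= k * x.std(ddof=1)]

def binomial_order_statistic_ci(x, confidence=0.95):
    """Distribution-free confidence interval for the central value from order statistics."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    lo = int(max(stats.binom.ppf((1 - confidence) / 2, n, 0.5) - 1, 0))
    return x[lo], x[n - 1 - lo]

if __name__ == "__main__":
    results = np.array([10.2, 10.5, 9.8, 10.1, 10.4, 10.0,
                        9.9, 10.3, 10.6, 14.9, 10.2, 9.7])     # one suspicious result
    cleaned = chebyshev_screen(results)
    print(len(results) - len(cleaned), "outlier(s) removed")
    print("95% interval for the central value:", binomial_order_statistic_ci(cleaned))
```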

  11. Nonparametric Integrated Agrometeorological Drought Monitoring: Model Development and Application

    Science.gov (United States)

    Zhang, Qiang; Li, Qin; Singh, Vijay P.; Shi, Peijun; Huang, Qingzhong; Sun, Peng

    2018-01-01

    Drought is a major natural hazard that has massive impacts on society. How to monitor drought is critical for its mitigation and early warning. This study proposed a modified version of the multivariate standardized drought index (MSDI) based on precipitation, evapotranspiration, and soil moisture, i.e., the modified multivariate standardized drought index (MMSDI). This study also used nonparametric joint probability distribution analysis. Comparisons were done between the standardized precipitation evapotranspiration index (SPEI), standardized soil moisture index (SSMI), MSDI, and MMSDI, and real-world observed drought regimes. Results indicated that MMSDI detected droughts that SPEI and/or SSMI failed to detect. Also, MMSDI detected almost all droughts that were identified by SPEI and SSMI. Further, droughts detected by MMSDI were similar to real-world observed droughts in terms of drought intensity and drought-affected area. When compared to MMSDI, MSDI has the potential to overestimate drought intensity and drought-affected area across China, which should be attributed to the exclusion of the evapotranspiration component from the estimation of drought intensity. Therefore, MMSDI is proposed for drought monitoring, as it can detect agrometeorological droughts. Results of this study provide a framework for integrated drought monitoring in other regions of the world and can help to develop drought mitigation.
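
    One hedged reading of the nonparametric joint-probability step is sketched below: the empirical (Gringorten) joint cumulative probability of precipitation and soil moisture is mapped through the standard normal quantile function to give a standardized multivariate index. The actual MMSDI additionally involves evapotranspiration and a modified aggregation, so this is only a toy analogue; the data are simulated.

```python
import numpy as np
from scipy.stats import norm

def joint_drought_index(precip, soil_moisture):
    """Standardized index from the empirical joint distribution (Gringorten positions);
    more negative values indicate jointly drier conditions."""
    precip = np.asarray(precip, dtype=float)
    soil_moisture = np.asarray(soil_moisture, dtype=float)
    n = len(precip)
    joint_rank = np.array([np.sum((precip <= precip[k]) & (soil_moisture <= soil_moisture[k]))
                           for k in range(n)])
    p_joint = (joint_rank - 0.44) / (n + 0.12)
    return norm.ppf(p_joint)

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    monthly_precip = rng.gamma(2.0, 30.0, size=240)              # 20 years of toy data
    monthly_sm = 0.5 * monthly_precip + rng.normal(0, 10, 240)   # correlated soil moisture
    index = joint_drought_index(monthly_precip, monthly_sm)
    print("five driest months:", np.argsort(index)[:5])
```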

  12. Analysis of residual solvents in PET radiopharmaceuticals by GC

    International Nuclear Information System (INIS)

    Li Yungang; Zhang Xiaojun; Liu Jian; Tian Jiahe; Zhang Jinming

    2013-01-01

    The residual solvents in PET radiopharmaceuticals, namely acetonitrile, ethanol, N,N-dimethylethanolamine (DMEA) and dimethylsulfoxide (DMSO), were analyzed by GC. Standard curves were established with an AT-624 capillary column, and the calibration ranges for acetonitrile and ethanol were 0.004-0.320 g/L and 0.010-0.120 g/L, respectively. The residual solvents acetonitrile, ethanol, DMEA and DMSO in PET radiopharmaceuticals were analyzed by GC, with linearities of 0.9994, 0.9999, 0.9997 and 0.9996, respectively. The residual acetonitrile was (0.0313±0.0433), (0.0829±0.0668), (0.0156±0.0059) and (0.0254±0.0266) g/L in ¹⁸F-FDG, ¹⁸F-FLT, ¹¹C-CFT and ¹¹C-PIB, respectively. The residual ethanol was (0.0505±0.00528) g/L in ¹⁸F-FDG. The residual DMSO was (0.0331±0.0180) g/L and (0.0238±0.0100) g/L in ¹⁸F-W372 and ¹¹C-DTBZ, respectively. The residual DMEA was (0.0348±0.0022) g/L in ¹¹C-Choline. Residual organic solvents in PET radiopharmaceuticals can be analyzed directly by GC. The results show that quality control of PET radiopharmaceutical purity should be performed with semi-HPLC to avoid high residual levels. (authors)

  13. A Non-Parametric Delphi Approach to Foster Innovation Policy Debate in Spain

    Directory of Open Access Journals (Sweden)

    Juan Carlos Salazar-Elena

    2016-05-01

    The aim of this paper is to identify some changes needed in Spain’s innovation policy to fill the gap between its innovation results and those of other European countries in lieu of sustainable leadership. To do this we apply the Delphi methodology to experts from academia, business, and government. To overcome the shortcomings of traditional descriptive methods, we develop an inferential analysis by following a non-parametric bootstrap method which enables us to identify important changes that should be implemented. Particularly interesting is the support found for improving the interconnections among the relevant agents of the innovation system (instead of focusing exclusively on the provision of knowledge and technological inputs through R and D activities), or the support found for “soft” policy instruments aimed at providing a homogeneous framework to assess the innovation capabilities of firms (e.g., for funding purposes). Attention to potential innovators among small and medium enterprises (SMEs) and traditional industries is particularly encouraged by experts.

  14. Nonparametric method for genomics-based prediction of performance of quantitative traits involving epistasis in plant breeding.

    Directory of Open Access Journals (Sweden)

    Xiaochun Sun

    Genomic selection (GS) procedures have proven useful in estimating breeding value and predicting phenotype with genome-wide molecular marker information. However, issues of high dimensionality, multicollinearity, and the inability to deal effectively with epistasis can jeopardize accuracy and predictive ability. We, therefore, propose a new nonparametric method, pRKHS, which combines the features of supervised principal component analysis (SPCA) and reproducing kernel Hilbert spaces (RKHS) regression, with versions for traits with no/low epistasis, pRKHS-NE, to high epistasis, pRKHS-E. Instead of assigning a specific relationship to represent the underlying epistasis, the method maps genotype to phenotype in a nonparametric way, thus requiring fewer genetic assumptions. SPCA decreases the number of markers needed for prediction by filtering out low-signal markers with the optimal marker set determined by cross-validation. Principal components are computed from reduced marker matrix (called supervised principal components, SPC) and included in the smoothing spline ANOVA model as independent variables to fit the data. The new method was evaluated in comparison with current popular methods for practicing GS, specifically RR-BLUP, BayesA, BayesB, as well as a newer method by Crossa et al., RKHS-M, using both simulated and real data. Results demonstrate that pRKHS generally delivers greater predictive ability, particularly when epistasis impacts trait expression. Beyond prediction, the new method also facilitates inferences about the extent to which epistasis influences trait expression.

  15. Nonparametric method for genomics-based prediction of performance of quantitative traits involving epistasis in plant breeding.

    Science.gov (United States)

    Sun, Xiaochun; Ma, Ping; Mumm, Rita H

    2012-01-01

    Genomic selection (GS) procedures have proven useful in estimating breeding value and predicting phenotype with genome-wide molecular marker information. However, issues of high dimensionality, multicollinearity, and the inability to deal effectively with epistasis can jeopardize accuracy and predictive ability. We, therefore, propose a new nonparametric method, pRKHS, which combines the features of supervised principal component analysis (SPCA) and reproducing kernel Hilbert spaces (RKHS) regression, with versions for traits with no/low epistasis, pRKHS-NE, to high epistasis, pRKHS-E. Instead of assigning a specific relationship to represent the underlying epistasis, the method maps genotype to phenotype in a nonparametric way, thus requiring fewer genetic assumptions. SPCA decreases the number of markers needed for prediction by filtering out low-signal markers with the optimal marker set determined by cross-validation. Principal components are computed from reduced marker matrix (called supervised principal components, SPC) and included in the smoothing spline ANOVA model as independent variables to fit the data. The new method was evaluated in comparison with current popular methods for practicing GS, specifically RR-BLUP, BayesA, BayesB, as well as a newer method by Crossa et al., RKHS-M, using both simulated and real data. Results demonstrate that pRKHS generally delivers greater predictive ability, particularly when epistasis impacts trait expression. Beyond prediction, the new method also facilitates inferences about the extent to which epistasis influences trait expression.
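
    A rough analogue of the pRKHS pipeline (not the authors' implementation) can be put together with scikit-learn as below: markers are filtered by marginal association with the trait, supervised principal components are extracted from the reduced marker matrix, and an RBF kernel regression maps the components to phenotype; the genotypes, trait model and tuning constants are all hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

def supervised_kernel_prediction(markers, phenotype, n_keep=200, n_components=5):
    """Marker filtering + supervised principal components + RBF kernel regression."""
    # 1. Filter low-signal markers by absolute marginal correlation with the trait.
    corr = np.abs([np.corrcoef(markers[:, j], phenotype)[0, 1]
                   for j in range(markers.shape[1])])
    keep = np.argsort(corr)[-n_keep:]
    # 2. Supervised principal components from the reduced marker matrix.
    spc = PCA(n_components=n_components).fit_transform(markers[:, keep])
    # 3. Nonparametric genotype-to-phenotype map via an RBF kernel.
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
    score = cross_val_score(model, spc, phenotype, cv=5, scoring="r2").mean()
    return model.fit(spc, phenotype), score

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    geno = rng.integers(0, 3, size=(300, 2000)).astype(float)    # toy SNP matrix
    # Toy trait with an epistatic (product) term between two markers.
    pheno = (geno[:, 10] - geno[:, 50] + 0.8 * geno[:, 10] * geno[:, 50]
             + rng.normal(0, 1.0, 300))
    _, r2 = supervised_kernel_prediction(geno, pheno)
    print(f"cross-validated R^2: {r2:.2f}")
```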

  16. Separating environmental efficiency into production and abatement efficiency. A nonparametric model with application to U.S. power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hampf, Benjamin

    2011-08-15

    In this paper we present a new approach to evaluate the environmental efficiency of decision making units. We propose a model that describes a two-stage process consisting of a production and an end-of-pipe abatement stage with the environmental efficiency being determined by the efficiency of both stages. Taking the dependencies between the two stages into account, we show how nonparametric methods can be used to measure environmental efficiency and to decompose it into production and abatement efficiency. For an empirical illustration we apply our model to an analysis of U.S. power plants.

  17. Distribution patterns of firearm discharge residues as revealed by neutron activation analysis

    International Nuclear Information System (INIS)

    Pillay, K.K.S.; Driscoll, D.C.; Jester, W.A.

    1975-01-01

    A systematic investigation using a variety of handguns has revealed the existence of distinguishable distribution patterns of firearm discharge residues on surfaces below the flight path of a bullet. The residues are identifiable even at distances of 12 meters from the gun using nondestructive neutron activation analysis. The results of these investigations show that the distribution pattern for a gun is reproducible using similar ammunition and that there exist two distinct regions in the patterns developed between the firearm and the target: one with respect to the position of the gun and the other in the vicinity of the target. The judicious application of these findings could be of significant value in criminal investigations. (T.G.)

  18. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  19. MEASURING DARK MATTER PROFILES NON-PARAMETRICALLY IN DWARF SPHEROIDALS: AN APPLICATION TO DRACO

    International Nuclear Information System (INIS)

    Jardel, John R.; Gebhardt, Karl; Fabricius, Maximilian H.; Williams, Michael J.; Drory, Niv

    2013-01-01

    We introduce a novel implementation of orbit-based (or Schwarzschild) modeling that allows dark matter density profiles to be calculated non-parametrically in nearby galaxies. Our models require no assumptions to be made about velocity anisotropy or the dark matter profile. The technique can be applied to any dispersion-supported stellar system, and we demonstrate its use by studying the Local Group dwarf spheroidal galaxy (dSph) Draco. We use existing kinematic data at larger radii and also present 12 new radial velocities within the central 13 pc obtained with the VIRUS-W integral field spectrograph on the 2.7 m telescope at McDonald Observatory. Our non-parametric Schwarzschild models find strong evidence that the dark matter profile in Draco is cuspy for 20 ≤ r ≤ 700 pc. The profile for r ≥ 20 pc is well fit by a power law with slope α = –1.0 ± 0.2, consistent with predictions from cold dark matter simulations. Our models confirm that, despite its low baryon content relative to other dSphs, Draco lives in a massive halo.

  20. Panel data nonparametric estimation of production risk and risk preferences

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We apply nonparametric panel data kernel regression to investigate production risk, output price uncertainty, and risk attitudes of Polish dairy farms based on a firm-level unbalanced panel data set that covers the period 2004–2010. We compare different model specifications and different approaches for obtaining firm-specific measures of risk attitudes. We found that Polish dairy farmers are risk averse regarding production risk and price uncertainty. According to our results, Polish dairy farmers perceive the production risk as being more significant than the risk related to output price.
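
    The core estimator in such analyses is a kernel regression of output on inputs; a minimal Nadaraya-Watson sketch is given below, with a second kernel regression of squared residuals serving as a crude production-risk (conditional variance) profile. It ignores the panel structure and firm effects of the actual study, and the data are simulated.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel (1-D predictor)."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u ** 2)
    return (weights * y_train[None, :]).sum(axis=1) / weights.sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    feed = rng.uniform(1.0, 10.0, 400)                                  # toy input use
    milk = 5.0 * np.log(feed) + rng.normal(0.0, 0.5 * np.sqrt(feed))    # heteroscedastic output
    grid = np.linspace(1.5, 9.5, 9)
    mean_output = nadaraya_watson(feed, milk, grid, bandwidth=0.8)
    # Kernel regression of squared residuals: a crude production-risk profile.
    resid_sq = (milk - nadaraya_watson(feed, milk, feed, 0.8)) ** 2
    risk = nadaraya_watson(feed, resid_sq, grid, 0.8)
    print(np.round(mean_output, 2), np.round(risk, 2))
```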

  1. Thermal decomposition characteristics of microwave liquefied rape straw residues using thermogravimetric analysis

    Science.gov (United States)

    Xingyan Huang; Cornelis F. De Hoop; Jiulong Xie; Chung-Yun Hse; Jinqiu Qi; Yuzhu Chen; Feng Li

    2017-01-01

    The thermal decomposition characteristics of microwave liquefied rape straw residues with respect to liquefaction condition and pyrolysis conversion were investigated using a thermogravimetric (TG) analyzer at heating rates of 5, 20 and 50 °C min⁻¹. The hemicellulose decomposition peak was absent at the derivative thermogravimetric analysis (DTG...

  2. Pyrolysis kinetics and thermodynamic parameters of castor (Ricinus communis) residue using thermogravimetric analysis.

    Science.gov (United States)

    Kaur, Ravneet; Gera, Poonam; Jha, Mithilesh Kumar; Bhaskar, Thallada

    2018-02-01

    Castor plant is a fast-growing, perennial shrub from the Euphorbiaceae family. More than 50% of the residue is generated from its stems and leaves. The main aim of this work is to study the pyrolytic characteristics, kinetics and thermodynamic properties of castor residue. The TGA experiments were carried out from room temperature to 900 °C under an inert atmosphere at different heating rates of 5, 10, 15, 20, 30 and 40 °C/min. The kinetic analysis was carried out using different models, namely Kissinger, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS). The average Eα calculated by the FWO and KAS methods was 167.10 and 165.86 kJ/mol, respectively. Gibbs free energy ranged from 150.62 to 154.33 kJ/mol (FWO) and from 150.59 to 154.65 kJ/mol (KAS). The HHV of castor residue was 14.43 MJ/kg, so it can be considered a potential feedstock for bio-energy production. Kinetic and thermodynamic results will be useful input for the design of pyrolysis processes using castor residue as feedstock. Copyright © 2017 Elsevier Ltd. All rights reserved.
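
    The Flynn-Wall-Ozawa step can be sketched as follows, assuming NumPy: at a fixed conversion level, ln(beta) is regressed on 1/T across heating rates and the slope gives the activation energy via ln(beta) = const - 1.052·Eα/(R·T). The temperatures used below are hypothetical read-offs from TG curves, not the paper's data.

```python
import numpy as np

R = 8.314  # J/(mol K)

def fwo_activation_energy(heating_rates, temps_at_conversion):
    """Flynn-Wall-Ozawa estimate of E_a at one fixed conversion level.

    ln(beta) = const - 1.052 * E_a / (R * T), so the slope of ln(beta)
    against 1/T across heating rates gives E_a (in J/mol)."""
    slope, _ = np.polyfit(1.0 / np.asarray(temps_at_conversion),
                          np.log(np.asarray(heating_rates)), 1)
    return -slope * R / 1.052

if __name__ == "__main__":
    betas = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 40.0])        # heating rates (K/min)
    # Hypothetical temperatures (K) at which each run reaches 50% conversion.
    t_50 = np.array([608.0, 620.0, 628.0, 633.0, 641.0, 647.0])
    print(f"E_a at alpha = 0.5: {fwo_activation_energy(betas, t_50) / 1000:.0f} kJ/mol")
```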

  3. A Bayesian nonparametric approach to causal inference on quantiles.

    Science.gov (United States)

    Xu, Dandan; Daniels, Michael J; Winterstein, Almut G

    2018-02-25

    We propose a Bayesian nonparametric approach (BNP) for causal inference on quantiles in the presence of many confounders. In particular, we define relevant causal quantities and specify BNP models to avoid bias from restrictive parametric assumptions. We first use Bayesian additive regression trees (BART) to model the propensity score and then construct the distribution of potential outcomes given the propensity score using a Dirichlet process mixture (DPM) of normals model. We thoroughly evaluate the operating characteristics of our approach and compare it to Bayesian and frequentist competitors. We use our approach to answer an important clinical question involving acute kidney injury using electronic health records. © 2018, The International Biometric Society.

  4. Effects of food processing on pesticide residues in fruits and vegetables: a meta-analysis approach.

    Science.gov (United States)

    Keikotlhaile, B M; Spanoghe, P; Steurbaut, W

    2010-01-01

    Pesticides are widely used in food production to increase food security despite the fact that they can have negative health effects on consumers. Pesticide residues have been found in various fruits and vegetables; both raw and processed. One of the most common routes of pesticide exposure in consumers is via food consumption. Most foods are consumed after passing through various culinary and processing treatments. A few literature reviews have indicated the general trend of reduction or concentration of pesticide residues by certain methods of food processing for a particular active ingredient. However, no review has focused on combining the obtained results from different studies on different active ingredients with differences in experimental designs, analysts and analysis equipment. In this paper, we present a meta-analysis of response ratios as a possible method of combining and quantifying effects of food processing on pesticide residue levels. Reduction of residue levels was indicated by blanching, boiling, canning, frying, juicing, peeling and washing of fruits and vegetables with an average response ratio ranging from 0.10 to 0.82. Baking, boiling, canning and juicing indicated both reduction and increases for the 95% and 99.5% confidence intervals. Copyright 2009 Elsevier Ltd. All rights reserved.

  5. Ensemble Kalman filtering with residual nudging

    Directory of Open Access Journals (Sweden)

    Xiaodong Luo

    2012-10-01

    Covariance inflation and localisation are two important techniques that are used to improve the performance of the ensemble Kalman filter (EnKF) by (in effect) adjusting the sample covariances of the estimates in the state space. In this work, an additional auxiliary technique, called residual nudging, is proposed to monitor and, if necessary, adjust the residual norms of state estimates in the observation space. In an EnKF with residual nudging, if the residual norm of an analysis is larger than a pre-specified value, then the analysis is replaced by a new one whose residual norm is no larger than a pre-specified value. Otherwise, the analysis is considered as a reasonable estimate and no change is made. A rule for choosing the pre-specified value is suggested. Based on this rule, the corresponding new state estimates are explicitly derived in case of linear observations. Numerical experiments in the 40-dimensional Lorenz 96 model show that introducing residual nudging to an EnKF may improve its accuracy and/or enhance its stability against filter divergence, especially in the small ensemble scenario.

  6. Residual Structures in Latent Growth Curve Modeling

    Science.gov (United States)

    Grimm, Kevin J.; Widaman, Keith F.

    2010-01-01

    Several alternatives are available for specifying the residual structure in latent growth curve modeling. Two specifications involve uncorrelated residuals and represent the most commonly used residual structures. The first, building on repeated measures analysis of variance and common specifications in multilevel models, forces residual variances…

  7. Using nonparametrics to specify a model to measure the value of travel time

    DEFF Research Database (Denmark)

    Fosgerau, Mogens

    2007-01-01

    Using a range of nonparametric methods, the paper examines the specification of a model to evaluate the willingness-to-pay (WTP) for travel time changes from binomial choice data from a simple time-cost trading experiment. The analysis favours a model with random WTP as the only source of randomness over a model with fixed WTP which is linear in time and cost and has an additive random error term. Results further indicate that the distribution of log WTP can be described as a sum of a linear index fixing the location of the log WTP distribution and an independent random variable representing unobserved heterogeneity. This formulation is useful for parametric modelling. The index indicates that the WTP varies systematically with income and other individual characteristics. The WTP varies also with the time difference presented in the experiment, which is in contradiction with standard utility theory.

  8. Debt and growth: A non-parametric approach

    Science.gov (United States)

    Brida, Juan Gabriel; Gómez, David Matesanz; Seijas, Maria Nela

    2017-11-01

    In this study, we explore the dynamic relationship between public debt and economic growth by using a non-parametric approach based on data symbolization and clustering methods. The study uses annual data of general government consolidated gross debt-to-GDP ratio and gross domestic product for sixteen countries between 1977 and 2015. Using symbolic sequences, we introduce a notion of distance between the dynamical paths of different countries. Then, a Minimal Spanning Tree and a Hierarchical Tree are constructed from time series to help detecting the existence of groups of countries sharing similar economic performance. The main finding of the study appears for the period 2008-2016 when several countries surpassed the 90% debt-to-GDP threshold. During this period, three groups (clubs) of countries are obtained: high, mid and low indebted countries, suggesting that the employed debt-to-GDP threshold drives economic dynamics for the selected countries.
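
    A minimal sketch of the tree constructions, assuming SciPy: starting from a (here hypothetical) distance matrix between countries' symbolized debt/growth trajectories, a minimal spanning tree gives the backbone of similarities and a hierarchical clustering cut yields the groups (clubs) of countries.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

countries = ["A", "B", "C", "D", "E"]
# Hypothetical symmetric distances between symbolized debt/growth trajectories.
dist = np.array([[0.0, 0.2, 0.9, 0.8, 0.7],
                 [0.2, 0.0, 0.8, 0.9, 0.6],
                 [0.9, 0.8, 0.0, 0.3, 0.4],
                 [0.8, 0.9, 0.3, 0.0, 0.2],
                 [0.7, 0.6, 0.4, 0.2, 0.0]])

# Minimal spanning tree: the backbone of similarity relations between countries.
mst = minimum_spanning_tree(dist).toarray()
edges = [(countries[i], countries[j], mst[i, j]) for i, j in zip(*np.nonzero(mst))]
print("MST edges:", edges)

# Hierarchical tree on the same distances, cut into two groups (clubs).
clubs = fcluster(linkage(squareform(dist), method="average"), t=2, criterion="maxclust")
print("clubs:", dict(zip(countries, clubs)))
```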

  9. Comparison of two atmospheric sampling methodologies with non-parametric statistical tools

    Directory of Open Access Journals (Sweden)

    Maria João Nunes

    2005-03-01

    In atmospheric aerosol sampling, it is inevitable that the air that carries particles is in motion, as a result of both externally driven wind and the sucking action of the sampler itself. High or low air flow sampling speeds may lead to significant particle size bias. The objective of this work is the validation of measurements enabling the comparison of species concentrations from both air flow sampling techniques. The presence of several outliers and the increase of residuals with concentration become obvious, requiring non-parametric methods, which are recommended for handling data that may not be normally distributed. In this way, conversion factors are obtained for each of the various species under study using Kendall regression.
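
    A Kendall-type nonparametric fit such as the Theil-Sen estimator (available in SciPy) is one way to derive a robust conversion factor between the two samplers in the presence of outliers; the sketch below applies it to hypothetical paired concentrations of a single species.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
# Hypothetical paired concentrations of one species from the two samplers (ug/m3).
high_flow = rng.lognormal(mean=1.0, sigma=0.6, size=40)
low_flow = 0.85 * high_flow * rng.normal(1.0, 0.08, size=40)
low_flow[3] *= 3.0                                   # one gross outlier

slope, intercept, lo, hi = stats.theilslopes(low_flow, high_flow, alpha=0.95)
tau, p_value = stats.kendalltau(high_flow, low_flow)
print(f"conversion factor: {slope:.2f} (95% CI {lo:.2f} to {hi:.2f}), "
      f"Kendall tau = {tau:.2f}, p = {p_value:.3g}")
```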

  10. The application of white radiation to residual stress analysis in the intermediate zone between surface and volume

    CERN Document Server

    Genzel, C; Wallis, B; Reimers, W

    2001-01-01

    Mechanical surface processing is known to give rise to complex residual stress fields in the near surface region of polycrystalline materials. Consequently, their analysis by means of non-destructive X-ray and neutron diffraction methods has become an important topic in materials science. However, there remains a gap with respect to the accessible near surface zone, which concerns a range between about 10 μm and 1 mm, where the conventional X-ray methods are no longer and the neutron methods are not yet sensitive. In order to achieve the necessary penetration depth τ to perform residual stress analysis (RSA) in this region, advantageous use can be made of energy dispersive X-ray diffraction of synchrotron radiation (15-60 keV) in the reflection mode. Besides an example concerning the adaptation of methods applied so far in the angle dispersive RSA to the energy dispersive case, the concept of a new materials science beamline at BESSY II for residual stress and texture analysis is presented.

  11. The application of white radiation to residual stress analysis in the intermediate zone between surface and volume

    International Nuclear Information System (INIS)

    Genzel, Ch.; Stock, C.; Wallis, B.; Reimers, W.

    2001-01-01

    Mechanical surface processing is known to give rise to complex residual stress fields in the near surface region of polycrystalline materials. Consequently, their analysis by means of non-destructive X-ray and neutron diffraction methods has become an important topic in materials science. However, there remains a gap with respect to the accessible near surface zone, which concerns a range between about 10 μm and 1 mm, where the conventional X-ray methods are no longer and the neutron methods are not yet sensitive. In order to achieve the necessary penetration depth τ to perform residual stress analysis (RSA) in this region, advantageous use can be made of energy dispersive X-ray diffraction of synchrotron radiation (15-60 keV) in the reflection mode. Besides an example concerning the adaptation of methods applied so far in the angle dispersive RSA to the energy dispersive case, the concept of a new materials science beamline at BESSY II for residual stress and texture analysis is presented

  12. Image Analysis to Estimate Mulch Residual on Soil

    Science.gov (United States)

    Moreno Valencia, Carmen; Moreno Valencia, Marta; Tarquis, Ana M.

    2014-05-01

    Organic farmers are currently allowed to use conventional polyethylene mulch, provided it is removed from the field at the end of the growing or harvest season. To some, such use represents a contradiction between the resource conservation goals of sustainable, organic agriculture and the waste generated from the use of polyethylene mulch. One possible solution is to use biodegradable plastic or paper as mulch, which could present an alternative to polyethylene in reducing non-recyclable waste and decreasing the environmental pollution associated with it. Determination of mulch residues on the ground is one of the basic requisites to estimate the potential of each material to degrade. Determining the extent of mulch residue in the field is an exhausting job, and there is no distinct and accurate criterion for its measurement. Several indices exist for estimating residue cover, but most of them are not only laborious and time consuming but also affected by human error. The human visual system is fast and accurate enough for this task, but the extent perceived must be expressed numerically so that it can be reported and used for comparison between several mulches or the same mulches at different times. Translating the extent perceived by the visual system into numbers is possible by simulating the human visual system, and machine vision comprising an image processing system can perform these tasks. This study aimed to evaluate the residue of mulch materials over a crop campaign in a processing tomato (Solanum lycopersicon L.) crop in Central Spain through image analysis. The mulch materials compared were standard black polyethylene (PE), two biodegradable plastic mulches (BD1 and BD2), and one paper (PP1). Although the initial appearance of most of the mulches was similar to black PE, at the end of the experiment the materials appeared somewhat discoloured and were impregnated with soil and/or crop residue, making them very difficult to remove completely. A digital camera

  13. Analysis of residual toluene in food packaging via headspace extraction method using gas chromatography

    International Nuclear Information System (INIS)

    Lim, Ying Chin; Mohd Marsin Sanagi

    2008-01-01

    Polymeric materials are used in many food contact applications as packaging material. The presence of residual toluene in this food packaging material can migrate into food and thus affect the quality of food. In this study, a manual headspace analysis was successfully designed and developed. The determination of residual toluene was carried out with the standard addition method and the multiple headspace extraction (MHE) method using gas chromatography with flame ionization detection (GC-FID). Identification of toluene was performed by comparison of its retention time with standard toluene and by GC-MS. It was found that the suitable heating temperature was 180 degrees Celsius with an optimum heating time of 10 minutes. The study also found that the concentration of residual toluene in multicolored samples was higher than in monocolored samples, whereas the residual toluene found with the standard addition method was higher than that found with the MHE method. However, comparison with the results obtained from the De Paris laboratory, France, showed that the MHE method gave higher accuracy for samples with low analyte concentrations. On the other hand, lower accuracy was obtained for samples with high concentrations of residual toluene due to systematic errors. Comparison between the determination methods showed that the MHE method is more precise than the standard addition method. (author)

  14. The finite element analysis for prediction of residual stresses induced by shot peening

    International Nuclear Information System (INIS)

    Kim, Cheol; Yang, Won Ho; Sung, Ki Deug; Cho, Myoung Rae; Ko, Myung Hoon

    2000-01-01

    Shot peening is a widely used surface treatment in which small spherical parts called shots are blasted onto the surface of a metallic component with velocities up to 100 m/s. This treatment improves fatigue behavior owing to the compressive residual stresses it develops, and so it has gained widespread acceptance in the automobile and aerospace industries. The residual stress profile in the surface layer depends on the shot peening parameters, namely shot velocity, shot diameter, coverage, impact angle, material properties, etc., and the only method to confirm this profile is measurement by X-ray diffractometer. Despite its importance to the automobile and aerospace industries, little attention has been devoted to accurate modeling of the process. In this paper, a simulation technique is applied to predict the magnitude and distribution of the residual stress and plastic deformation caused by shot peening with the help of finite element analysis

  15. Residual strain sensor using Al-packaged optical fiber and Brillouin optical correlation domain analysis.

    Science.gov (United States)

    Choi, Bo-Hun; Kwon, Il-Bum

    2015-03-09

    We propose a distributed residual strain sensor that uses an Al-packaged optical fiber for the first time. The residual strain which causes Brillouin frequency shifts in the optical fiber was measured using Brillouin optical correlation domain analysis with 2 cm spatial resolution. We quantified the Brillouin frequency shifts in the Al-packaged optical fiber by the tensile stress and compared them for a varying number of Al layers in the optical fiber. The Brillouin frequency shift of an optical fiber with one Al layer had a slope of 0.038 MHz/με with respect to tensile stress, which corresponds to 78% of that for an optical fiber without Al layers. After removal of the stress, 87% of the strain remained as residual strain. When different tensile stresses were randomly applied, the strain caused by the highest stress was the only one detected as residual strain. The residual strain was repeatedly measured for a time span of nine months for the purpose of reliability testing, and there was no change in the strain except for a 4% reduction, which is within the error tolerance of the experiment. A composite material plate equipped with our proposed Al-packaged optical fiber sensor was hammered for impact experiment and the residual strain in the plate was successfully detected. We suggest that the Al-packaged optical fiber can be adapted as a distributed strain sensor for smart structures, including aerospace structures.

  16. Effectiveness of the compound chlorpyrifos+ cypermethrin+citronellal against Alphitobius diaperinus: laboratory analysis and residue determination in carcasses

    Directory of Open Access Journals (Sweden)

    GS Silva

    2007-09-01

    Full Text Available Effectiveness, biological security and the absence of residues in meat and/or eggs must be considered when recommending options for the control of Alphitobius diaperinus in poultry production environments. This research study evaluated the effectiveness of cypermethrin+chlorpyrifos+citronellal in the control of A. diaperinus, including analysis for the presence of residues of this compound in poultry carcasses (experimental farm). Two studies were carried out under laboratory conditions. One used paper filters at four dilutions of the compound, and the other used a container with pulverized broiler litter and the compound. The analysis of carcasses for residues was conducted in broilers that were raised in a broiler house treated (floor and/or litter) with the compound at a dilution of 1:800. Birds were regularly sacrificed and submitted to necropsy, and liver, muscle and fat fragments were collected. Gas chromatography was used to identify the possible presence of any chemical residue in these samples. High effectiveness rates against A. diaperinus were observed in the two laboratory studies, as well as the absence of residues in the carcasses. This compound, used in the studied concentrations, can be recommended as a valuable alternative for the control and treatment of A. diaperinus.

  17. Residue Geometry Networks: A Rigidity-Based Approach to the Amino Acid Network and Evolutionary Rate Analysis

    Science.gov (United States)

    Fokas, Alexander S.; Cole, Daniel J.; Ahnert, Sebastian E.; Chin, Alex W.

    2016-01-01

    Amino acid networks (AANs) abstract the protein structure by recording the amino acid contacts and can provide insight into protein function. Herein, we describe a novel AAN construction technique that employs the rigidity analysis tool, FIRST, to build the AAN, which we refer to as the residue geometry network (RGN). We show that this new construction can be combined with network theory methods to include the effects of allowed conformal motions and local chemical environments. Importantly, this is done without costly molecular dynamics simulations required by other AAN-related methods, which allows us to analyse large proteins and/or data sets. We have calculated the centrality of the residues belonging to 795 proteins. The results display a strong, negative correlation between residue centrality and the evolutionary rate. Furthermore, among residues with high closeness, those with low degree were particularly strongly conserved. Random walk simulations using the RGN were also successful in identifying allosteric residues in proteins involved in GPCR signalling. The dynamic function of these residues remains largely hidden in the traditional distance-cutoff construction technique. Despite being constructed from only the crystal structure, the results in this paper suggest that the RGN can identify residues that fulfil a dynamical function. PMID:27623708
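
    Reproducing the residue geometry network itself requires the FIRST rigidity software, which is beyond a short example. As a rough illustration of the centrality-versus-conservation comparison, using the simpler distance-cutoff construction that the paper contrasts against (coordinates, cutoff and rates below are placeholders):

        import numpy as np
        import networkx as nx
        from scipy.stats import spearmanr

        def contact_network(ca_coords, cutoff=8.0):
            # distance-cutoff amino acid network from C-alpha coordinates (in Angstroms)
            g = nx.Graph()
            g.add_nodes_from(range(len(ca_coords)))
            for i in range(len(ca_coords)):
                for j in range(i + 1, len(ca_coords)):
                    if np.linalg.norm(ca_coords[i] - ca_coords[j]) < cutoff:
                        g.add_edge(i, j)
            return g

        rng = np.random.default_rng(0)
        coords = rng.random((60, 3)) * 30.0     # placeholder C-alpha coordinates
        rates = rng.random(60)                  # placeholder per-residue evolutionary rates
        g = contact_network(coords)
        closeness = np.array([nx.closeness_centrality(g, u) for u in g.nodes])
        print(spearmanr(closeness, rates))      # the paper reports a negative correlation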

  18. Ethanol production process from banana fruit and its lignocellulosic residues: Energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Velasquez-Arredondo, H.I. [Grupo de Investigacion Bioprocesos y Flujos Reactivos, Universidad Nacional de Colombia, Sede Medellin, Calle 59 A N 63-20 (Colombia); Departamento de Engenharia Mecanica, Escola Politecnica, Universidade de Sao Paulo, Avenida Professor Mello Moraes 2231 (Brazil); Ruiz-Colorado, A.A. [Grupo de Investigacion Bioprocesos y Flujos Reactivos, Universidad Nacional de Colombia, Sede Medellin, Calle 59 A N 63-20 (Colombia); De Oliveira, S. Jr. [Departamento de Engenharia Mecanica, Escola Politecnica, Universidade de Sao Paulo, Avenida Professor Mello Moraes 2231 (Brazil)

    2010-07-15

    Tropical countries, such as Brazil and Colombia, have the possibility of using agricultural lands for growing biomass to produce bio-fuels such as biodiesel and ethanol. This study applies an energy analysis to the production process of anhydrous ethanol obtained from the hydrolysis of starch and cellulosic and hemicellulosic material present in the banana fruit and its residual biomass. Four different production routes were analyzed: acid hydrolysis of amylaceous material (banana pulp and banana fruit) and enzymatic hydrolysis of lignocellulosic material (flower stalk and banana skin). The analysis considered banana plant cultivation, feedstock transport, hydrolysis, fermentation, distillation, dehydration, residue treatment and utility plant. The best indexes were obtained for amylaceous material for which mass performance varied from 346.5 L/t to 388.7 L/t, Net Energy Value (NEV) ranged from 9.86 MJ/L to 9.94 MJ/L and the energy ratio was 1.9 MJ/MJ. For lignocellulosic materials, the figures were less favorable; mass performance varied from 86.1 to 123.5 L/t, NEV from 5.24 to 8.79 MJ/L and energy ratio from 1.3 to 1.6 MJ/MJ. The analysis showed, however, that both processes can be considered energetically feasible. (author)

  19. Nonparametric estimation in an "illness-death" model when all transition times are interval censored

    DEFF Research Database (Denmark)

    Frydman, Halina; Gerds, Thomas; Grøn, Randi

    2013-01-01

    We develop nonparametric maximum likelihood estimation for the parameters of an irreversible Markov chain on states {0,1,2} from the observations with interval censored times of 0 → 1, 0 → 2 and 1 → 2 transitions. The distinguishing aspect of the data is that, in addition to all transition times ...

  20. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbine. The new information, coming from external and condition monitoring can be used to direct updating of the stochastic variables through a non-parametric Bayesian u...

  1. Experimental Sentinel-2 LAI estimation using parametric, non-parametric and physical retrieval methods - A comparison

    NARCIS (Netherlands)

    Verrelst, Jochem; Rivera, Juan Pablo; Veroustraete, Frank; Muñoz-Marí, Jordi; Clevers, J.G.P.W.; Camps-Valls, Gustau; Moreno, José

    2015-01-01

    Given the forthcoming availability of Sentinel-2 (S2) images, this paper provides a systematic comparison of retrieval accuracy and processing speed of a multitude of parametric, non-parametric and physically-based retrieval methods using simulated S2 data. An experimental field dataset (SPARC),

  2. Improving salt marsh digital elevation model accuracy with full-waveform lidar and nonparametric predictive modeling

    Science.gov (United States)

    Rogers, Jeffrey N.; Parrish, Christopher E.; Ward, Larry G.; Burdick, David M.

    2018-03-01

    Salt marsh vegetation tends to increase vertical uncertainty in light detection and ranging (lidar) derived elevation data, often causing the data to become ineffective for analysis of topographic features governing tidal inundation or vegetation zonation. Previous attempts at improving lidar data collected in salt marsh environments range from simply computing and subtracting the global elevation bias to more complex methods such as computing vegetation-specific, constant correction factors. The vegetation specific corrections can be used along with an existing habitat map to apply separate corrections to different areas within a study site. It is hypothesized here that correcting salt marsh lidar data by applying location-specific, point-by-point corrections, which are computed from lidar waveform-derived features, tidal-datum based elevation, distance from shoreline and other lidar digital elevation model based variables, using nonparametric regression will produce better results. The methods were developed and tested using full-waveform lidar and ground truth for three marshes in Cape Cod, Massachusetts, U.S.A. Five different model algorithms for nonparametric regression were evaluated, with TreeNet's stochastic gradient boosting algorithm consistently producing better regression and classification results. Additionally, models were constructed to predict the vegetative zone (high marsh and low marsh). The predictive modeling methods used in this study estimated ground elevation with a mean bias of 0.00 m and a standard deviation of 0.07 m (0.07 m root mean square error). These methods appear very promising for correction of salt marsh lidar data and, importantly, do not require an existing habitat map, biomass measurements, or image based remote sensing data such as multi/hyperspectral imagery.
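
    TreeNet is a proprietary implementation of stochastic gradient boosting; an open analogue is scikit-learn's GradientBoostingRegressor. The sketch below stands in random placeholder arrays for the waveform-derived features, tidal-datum based elevation, distance from shoreline and ground-truth elevation errors described above:

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.random((500, 5))                              # placeholder predictor columns
        y = 0.2 * X[:, 0] + 0.05 * rng.standard_normal(500)   # placeholder lidar elevation error (m)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, max_depth=3)
        model.fit(X_tr, y_tr)             # stochastic gradient boosting regression
        bias_hat = model.predict(X_te)    # point-by-point correction to subtract from the raw DEM
        print("RMSE:", np.sqrt(np.mean((bias_hat - y_te) ** 2)))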

  3. 77 FR 24671 - Compliance Guide for Residue Prevention and Agency Testing Policy for Residues

    Science.gov (United States)

    2012-04-25

    ... Hazard Analysis and Critical Control Points (HACCP) inspection system, another important component of the NRP is to provide verification of residue control in HACCP systems. As part of the HACCP regulation... guide, and FSIS finds violative residues, the establishment's HACCP system may be inadequate under 9 CFR...

  4. Development of an enzyme-linked immunosorbent assay for residue analysis of the insecticide emamectin benzoate in agricultural products.

    Science.gov (United States)

    Kondo, Mika; Yamashita, Hiroshi; Uchigashima, Mikiko; Kono, Takeshi; Takemoto, Toshihide; Fujita, Masahiro; Saka, Machiko; Iwasa, Seiji; Ito, Shigekazu; Miyake, Shiro

    2009-01-28

    A direct competitive enzyme-linked immunosorbent assay (dc-ELISA) for the analysis of emamectin residues in agricultural products was developed using a prepared mouse monoclonal antibody. The working range was 0.3-3.0 ng/mL, and the 50% inhibition concentration (IC(50)) was 1.0 ng/mL. The assay was sufficiently sensitive for analysis of the maximum residue limits in agricultural products in Japan (>0.1 microg/g). Emamectin residues contain the following metabolites: the 4''-epi-amino analogue, the 4''-epi-(N-formyl)amino analogue, the 4''-epi-(N-formyl-N-methyl)amino analogue, and the 8,9-Z isomer. The dc-ELISA reacted with these compounds at ratios of 113, 55, 38, and 9.1% of the IC(50) value of emamectin benzoate. Seven kinds of vegetables were spiked with emamectin benzoate at concentrations of 15-300 ng/g, and the recoveries were 91-117% in the dc-ELISA. The dc-ELISA results agreed reasonably well with results obtained by liquid chromatography-tandem mass spectrometry (LC-MS/MS) using spiked samples and actual (incurred) samples. The results indicate that the dc-ELISA was useful for the analysis of emamectin benzoate residues in agricultural products.
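
    The abstract reports a working range and an IC(50) but not the curve-fitting step; a standard choice for a competitive ELISA calibration curve is a four-parameter logistic fit, sketched here with invented standards:

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, top, bottom, ic50, slope):
            # four-parameter logistic commonly used for competitive ELISA standard curves
            return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])        # ng/mL standards (hypothetical)
        absorb = np.array([1.45, 1.20, 0.80, 0.42, 0.20])  # measured absorbance (hypothetical)

        params, _ = curve_fit(four_pl, conc, absorb, p0=[1.5, 0.1, 1.0, 1.0])
        print(f"estimated IC50 ~ {params[2]:.2f} ng/mL")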

  5. Nonparametric predictive pairwise comparison with competing risks

    International Nuclear Information System (INIS)

    Coolen-Maturi, Tahani

    2014-01-01

    In reliability, failure data often correspond to competing risks, where several failure modes can cause a unit to fail. This paper presents nonparametric predictive inference (NPI) for pairwise comparison with competing risks data, assuming that the failure modes are independent. These failure modes could be the same or different among the two groups, and these can be both observed and unobserved failure modes. NPI is a statistical approach based on few assumptions, with inferences strongly based on data and with uncertainty quantified via lower and upper probabilities. The focus is on the lower and upper probabilities for the event that the lifetime of a future unit from one group, say Y, is greater than the lifetime of a future unit from the second group, say X. The paper also shows how the two groups can be compared based on particular failure mode(s), and the comparison of the two groups when some of the competing risks are combined is discussed

  6. Residual analysis applied to S-N data of a surface rolled cast iron

    Directory of Open Access Journals (Sweden)

    Omar Maluf

    2005-09-01

    Full Text Available Surface rolling is a process extensively employed in the manufacture of ductile cast iron crankshafts, specifically in regions containing stress concentrators with the main aim to enhance fatigue strength. Such process hardens and introduces compressive residual stresses to the surface as a result of controlled strains, reducing cyclic tensile stresses near the surface of the part. The main purpose of this work was to apply the residual analysis to check the suitability of the S-N approach to describe the fatigue properties of a surface rolled cast iron. The analysis procedure proved to be very efficient and easy to implement and it can be applied in the verification of any other statistical model used to describe fatigue behavior. Results show that the conventional S-N methodology is able to model the high cycle fatigue behavior of surface rolled notch testpieces of a pearlitic ductile cast iron submitted to rotating bending fatigue tests.

  7. Fracture mechanics and residual fatigue life analysis for complex stress fields. Technical report

    International Nuclear Information System (INIS)

    Besuner, P.M.

    1975-07-01

    This report reviews the development and application of an influence function method for calculating stress intensity factors and residual fatigue life for two- and three-dimensional structures with complex stress fields and geometries. Through elastic superposition, the method properly accounts for redistribution of stress as the crack grows through the structure. The analytical methods used and the computer programs necessary for computation and application of load independent influence functions are presented. A new exact solution is obtained for the buried elliptical crack, under an arbitrary Mode I stress field, for stress intensity factors at four positions around the crack front. The IF method is then applied to two fracture mechanics problems with complex stress fields and geometries. These problems are of current interest to the electric power generating industry and include (1) the fatigue analysis of a crack in a pipe weld under nominal and residual stresses and (2) fatigue analysis of a reactor pressure vessel nozzle corner crack under a complex bivariate stress field

  8. Residual Stress Analysis of Aircraft Part using Neutron Beam

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Eun Joo; Seong, Baek Seok; Sim, Cheul Muu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    A precise measurement of the residual stress magnitude and distribution is an important factor in evaluating the lifetime or safety of materials, because residual stress affects material properties such as strength, fatigue, etc. In the case of a fighter jet, the lifetime and safety of the landing gear parts are more important than those of a passenger airplane because of its frequent take-offs and landings. In particular, in the case of a training fighter jet, a precise evaluation of the lifetime of the landing gear parts is strongly required for economic reasons. In this study, the residual stress of a landing gear part of a training fighter jet, which is used to fix the landing gear to the aircraft body, was investigated. The part was used for 2000 hours of flight, which corresponds to 10 years. During this period, the fighter jet normally takes off and lands more than 2000 times. These frequent take-offs and landings can generate residual stress and cause a crack in the part. By measuring neutron diffraction peaks, we evaluated the residual stress of the landing gear part

  9. Simplified RP-HPLC method for multi-residue analysis of abamectin, emamectin benzoate and ivermectin in rice.

    Science.gov (United States)

    Xie, Xianchuan; Gong, Shu; Wang, Xiaorong; Wu, Yinxing; Zhao, Li

    2011-01-01

    A rapid, reliable and sensitive reverse-phase high-performance liquid chromatography method with fluorescence detection (RP-FLD-HPLC) was developed and validated for simultaneous analysis of the abamectin (ABA), emamectin (EMA) benzoate and ivermectin (IVM) residues in rice. After extraction with acetonitrile/water (2 : 1) with sonication, the avermectin (AVMs) residues were directly derivatised by N-methylimidazole (N-NMIM) and trifluoroacetic anhydride (TFAA) and then analysed on RP-FLD-HPLC. A good linear relationship (r(2) > 0.99) was obtained for the three AVMs ranging from 0.01 to 5 microg ml(-1), i.e. 0.01-5.0 microg g(-1) in rice matrix. The limit of detection (LOD) and the limit of quantification (LOQ) were between 0.001 and 0.002 microg g(-1) and between 0.004 and 0.006 microg g(-1), respectively. Recoveries were from 81.9% to 105.4% and precision was less than 12.4%. The proposed method was successfully applied to routine analysis of the AVMs residues in rice.
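
    The record does not say how the LOD and LOQ were derived; one common ICH-style estimate uses the residual standard deviation of the calibration line, as sketched below with invented calibration data:

        import numpy as np

        conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0])    # microgram/mL standards (hypothetical)
        area = np.array([0.8, 4.1, 8.3, 41.0, 82.5, 410.0])  # peak areas (hypothetical)

        slope, intercept = np.polyfit(conc, area, 1)
        resid = area - (slope * conc + intercept)
        sigma = resid.std(ddof=2)          # residual standard deviation of the calibration fit

        lod = 3.3 * sigma / slope          # limit of detection (ICH Q2-style estimate)
        loq = 10.0 * sigma / slope         # limit of quantification
        r2 = np.corrcoef(conc, area)[0, 1] ** 2
        print(f"r2 = {r2:.4f}, LOD = {lod:.4f}, LOQ = {loq:.4f} microgram/mL")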

  10. An Instructional Module on Mokken Scale Analysis

    Science.gov (United States)

    Wind, Stefanie A.

    2017-01-01

    Mokken scale analysis (MSA) is a probabilistic-nonparametric approach to item response theory (IRT) that can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. This instructional module provides an introduction to MSA as a probabilistic-nonparametric framework in which to explore…

  11. Analysis of explosive and other organic residues by laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Lazic, V., E-mail: lazic@frascati.enea.i [ENEA, FIS-LAS, Via. E. Fermi 45, 00044 Frascati (RM) (Italy); Palucci, A. [ENEA, FIS-LAS, Via. E. Fermi 45, 00044 Frascati (RM) (Italy); Jovicevic, S. [Institute of Physics, 11080 Belgrade, Pregrevica 118 (Serbia); Poggi, C.; Buono, E. [ENEA, FIS-LAS, Via. E. Fermi 45, 00044 Frascati (RM) (Italy)

    2009-10-15

    With the aim of realizing a compact instrument for detection of energetic materials at trace levels, laser induced breakdown spectroscopy was applied on residues from nine explosives in air surroundings. Different potentially interfering organic materials were also analyzed. The residues were not uniformly distributed on an aluminum support and single-shot discrimination was attempted. For a single residue type, large shot-to-shot fluctuations of the line intensity ratios characteristic for organic samples were observed, which made material classification difficult. It was found that both atomic and molecular emission intensities, as well as their ratios, are strongly affected by an amount of the ablated support material, which mainly determines the plasma temperature. With respect to the spectra from the clean support, emission intensities of atomic oxygen and nitrogen are always reduced in the presence of an organic material, even if its molecules contain these elements. This was attributed to chemical reactions in a plasma containing carbon or its fragments. Hydrogen atomic emission depends strongly on the local humidity above the sampled point and its line intensity shows shot to shot variations up to 50%, also on a homogeneous sample. It is argued that shock waves generated by previous spatially and/or temporally close laser pulses blow away a relatively heavy water aerosol, which later diffuses slowly back towards the sampled point. C{sub 2} and CN exhibit a peak emission behavior with atomic Al emission, and their variable ratio indicates an existence of different formation or removal mechanisms from the plasma, depending on the plasma parameters and on the composition of the organic residue. On the basis of these observations, an attempt is made to establish a suitable procedure for data analysis and to determine the optimal experimental conditions, which would allow for discrimination of explosives from other, potentially interfering, residues.

  12. Nonparametric decision tree: The impact of ISO 9000 on certified and non certified companies

    Directory of Open Access Journals (Sweden)

    Joaquín Texeira Quirós

    2013-09-01

    Full Text Available Purpose: This empirical study analyzes a questionnaire answered by a sample of ISO 9000 certified companies and a control sample of companies which have not been certified, using a multivariate predictive model. With this approach, we assess which quality practices are associated with the likelihood of the firm being certified. Design/methodology/approach: We implemented nonparametric decision trees in order to see which variables most influence whether a company is certified or not, i.e., the motivations that lead companies to become certified. Findings: The results show that only four questionnaire items are sufficient to predict whether a firm is certified or not. Companies in which the respondent manifests greater concern with respect to customer relations, employee motivation and strategic planning have a higher likelihood of being certified. Research implications: the reader should note that this study is based on data from a single country and, of course, these results capture many idiosyncrasies of its economic and corporate environment. It would be of interest to understand if this type of analysis reveals some regularities across different countries. Practical implications: companies should look for a set of practices congruent with total quality management and ISO 9000 certification. Originality/value: This study contributes to the literature on the internal motivation of companies to achieve certification under the ISO 9000 standard, by performing a comparative analysis of questionnaires answered by a sample of certified companies and a control sample of companies which have not been certified. In particular, we assess how the manager's perception of the intensity with which quality practices are deployed in their firms is associated with the likelihood of the firm being certified.

  13. Hadron energy reconstruction for the ATLAS calorimetry in the framework of the nonparametrical method

    CERN Document Server

    Akhmadaliev, S Z; Ambrosini, G; Amorim, A; Anderson, K; Andrieux, M L; Aubert, Bernard; Augé, E; Badaud, F; Baisin, L; Barreiro, F; Battistoni, G; Bazan, A; Bazizi, K; Belymam, A; Benchekroun, D; Berglund, S R; Berset, J C; Blanchot, G; Bogush, A A; Bohm, C; Boldea, V; Bonivento, W; Bosman, M; Bouhemaid, N; Breton, D; Brette, P; Bromberg, C; Budagov, Yu A; Burdin, S V; Calôba, L P; Camarena, F; Camin, D V; Canton, B; Caprini, M; Carvalho, J; Casado, M P; Castillo, M V; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Chadelas, R; Chalifour, M; Chekhtman, A; Chevalley, J L; Chirikov-Zorin, I E; Chlachidze, G; Citterio, M; Cleland, W E; Clément, C; Cobal, M; Cogswell, F; Colas, Jacques; Collot, J; Cologna, S; Constantinescu, S; Costa, G; Costanzo, D; Crouau, M; Daudon, F; David, J; David, M; Davidek, T; Dawson, J; De, K; de La Taille, C; Del Peso, J; Del Prete, T; de Saintignon, P; Di Girolamo, B; Dinkespiler, B; Dita, S; Dodd, J; Dolejsi, J; Dolezal, Z; Downing, R; Dugne, J J; Dzahini, D; Efthymiopoulos, I; Errede, D; Errede, S; Evans, H; Eynard, G; Fassi, F; Fassnacht, P; Ferrari, A; Ferrer, A; Flaminio, Vincenzo; Fournier, D; Fumagalli, G; Gallas, E; Gaspar, M; Giakoumopoulou, V; Gianotti, F; Gildemeister, O; Giokaris, N; Glagolev, V; Glebov, V Yu; Gomes, A; González, V; González de la Hoz, S; Grabskii, V; Graugès-Pous, E; Grenier, P; Hakopian, H H; Haney, M; Hébrard, C; Henriques, A; Hervás, L; Higón, E; Holmgren, Sven Olof; Hostachy, J Y; Hoummada, A; Huston, J; Imbault, D; Ivanyushenkov, Yu M; Jézéquel, S; Johansson, E K; Jon-And, K; Jones, R; Juste, A; Kakurin, S; Karyukhin, A N; Khokhlov, Yu A; Khubua, J I; Klioukhine, V I; Kolachev, G M; Kopikov, S V; Kostrikov, M E; Kozlov, V; Krivkova, P; Kukhtin, V V; Kulagin, M; Kulchitskii, Yu A; Kuzmin, M V; Labarga, L; Laborie, G; Lacour, D; Laforge, B; Lami, S; Lapin, V; Le Dortz, O; Lefebvre, M; Le Flour, T; Leitner, R; Leltchouk, M; Li, J; Liablin, M V; Linossier, O; Lissauer, D; Lobkowicz, F; Lokajícek, M; Lomakin, Yu F; López-Amengual, J M; Lund-Jensen, B; Maio, A; Makowiecki, D S; Malyukov, S N; Mandelli, L; Mansoulié, B; Mapelli, Livio P; Marin, C P; Marrocchesi, P S; Marroquim, F; Martin, P; Maslennikov, A L; Massol, N; Mataix, L; Mazzanti, M; Mazzoni, E; Merritt, F S; Michel, B; Miller, R; Minashvili, I A; Miralles, L; Mnatzakanian, E A; Monnier, E; Montarou, G; Mornacchi, Giuseppe; Moynot, M; Muanza, G S; Nayman, P; Némécek, S; Nessi, Marzio; Nicoleau, S; Niculescu, M; Noppe, J M; Onofre, A; Pallin, D; Pantea, D; Paoletti, R; Park, I C; Parrour, G; Parsons, J; Pereira, A; Perini, L; Perlas, J A; Perrodo, P; Pilcher, J E; Pinhão, J; Plothow-Besch, Hartmute; Poggioli, Luc; Poirot, S; Price, L; Protopopov, Yu; Proudfoot, J; Puzo, P; Radeka, V; Rahm, David Charles; Reinmuth, G; Renzoni, G; Rescia, S; Resconi, S; Richards, R; Richer, J P; Roda, C; Rodier, S; Roldán, J; Romance, J B; Romanov, V; Romero, P; Rossel, F; Rusakovitch, N A; Sala, P; Sanchis, E; Sanders, H; Santoni, C; Santos, J; Sauvage, D; Sauvage, G; Sawyer, L; Says, L P; Schaffer, A C; Schwemling, P; Schwindling, J; Seguin-Moreau, N; Seidl, W; Seixas, J M; Selldén, B; Seman, M; Semenov, A; Serin, L; Shaldaev, E; Shochet, M J; Sidorov, V; Silva, J; Simaitis, V J; Simion, S; Sissakian, A N; Snopkov, R; Söderqvist, J; Solodkov, A A; Soloviev, A; Soloviev, I V; Sonderegger, P; Soustruznik, K; Spanó, F; Spiwoks, R; Stanek, R; Starchenko, E A; Stavina, P; Stephens, R; Suk, M; Surkov, A; Sykora, I; Takai, H; Tang, F; Tardell, S; Tartarelli, F; Tas, P; Teiger, J; Thaler, J; Thion, 
J; Tikhonov, Yu A; Tisserant, S; Tokar, S; Topilin, N D; Trka, Z; Turcotte, M; Valkár, S; Varanda, M J; Vartapetian, A H; Vazeille, F; Vichou, I; Vinogradov, V; Vorozhtsov, S B; Vuillemin, V; White, A; Wielers, M; Wingerter-Seez, I; Wolters, H; Yamdagni, N; Yosef, C; Zaitsev, A; Zitoun, R; Zolnierowski, Y

    2002-01-01

    This paper discusses hadron energy reconstruction for the ATLAS barrel prototype combined calorimeter (consisting of a lead-liquid argon electromagnetic part and an iron-scintillator hadronic part) in the framework of the nonparametrical method. The nonparametrical method utilizes only the known e/h ratios and the electron calibration constants and does not require the determination of any parameters by a minimization technique. Thus, this technique lends itself to an easy use in a first level trigger. The reconstructed mean values of the hadron energies are within ±1% of the true values and the fractional energy resolution is [(58±3)%/√E + (2.5±0.3)%] ⊕ (1.7±0.2)/E. The value of the e/h ratio obtained for the electromagnetic compartment of the combined calorimeter is 1.74±0.04 and agrees with the prediction that e/h > 1.66 for this electromagnetic calorimeter. Results of a study of the longitudinal hadronic shower development are also presented. The data have been taken in the H8 beam...

  14. Finite element analysis for prediction of the residual stresses induced by shot peening II

    International Nuclear Information System (INIS)

    Kim, Cheol; Seok, Chang Sung; Yang, Won Ho; Ryu, Myung Hai

    2002-01-01

    Shot peening is a surface impact treatment widely used to improve the performance of metal parts and welded details subjected to fatigue loading, contact fatigue, stress corrosion and other damage mechanisms. The better performance of the peened parts is mainly due to the residual stresses resulting from the plastic deformation of the surface layers of the material caused by the impact of the shot. In this paper the simulation technique is applied to predict the magnitude and distribution of the residual stress and plastic deformation caused by shot peening with the help of finite element analysis

  15. Multiple co-clustering based on nonparametric mixture models with heterogeneous marginal distributions.

    Science.gov (United States)

    Tokuda, Tomoki; Yoshimoto, Junichiro; Shimizu, Yu; Okada, Go; Takamura, Masahiro; Okamoto, Yasumasa; Yamawaki, Shigeto; Doya, Kenji

    2017-01-01

    We propose a novel method for multiple clustering, which is useful for analysis of high-dimensional data containing heterogeneous types of features. Our method is based on nonparametric Bayesian mixture models in which features are automatically partitioned (into views) for each clustering solution. This feature partition works as feature selection for a particular clustering solution, which screens out irrelevant features. To make our method applicable to high-dimensional data, a co-clustering structure is newly introduced for each view. Further, the outstanding novelty of our method is that we simultaneously model different distribution families, such as Gaussian, Poisson, and multinomial distributions in each cluster block, which widens areas of application to real data. We apply the proposed method to synthetic and real data, and show that our method outperforms other multiple clustering methods both in recovering true cluster structures and in computation time. Finally, we apply our method to a depression dataset with no true cluster structure available, from which useful inferences are drawn about possible clustering structures of the data.

  16. Multiple co-clustering based on nonparametric mixture models with heterogeneous marginal distributions.

    Directory of Open Access Journals (Sweden)

    Tomoki Tokuda

    Full Text Available We propose a novel method for multiple clustering, which is useful for analysis of high-dimensional data containing heterogeneous types of features. Our method is based on nonparametric Bayesian mixture models in which features are automatically partitioned (into views) for each clustering solution. This feature partition works as feature selection for a particular clustering solution, which screens out irrelevant features. To make our method applicable to high-dimensional data, a co-clustering structure is newly introduced for each view. Further, the outstanding novelty of our method is that we simultaneously model different distribution families, such as Gaussian, Poisson, and multinomial distributions in each cluster block, which widens areas of application to real data. We apply the proposed method to synthetic and real data, and show that our method outperforms other multiple clustering methods both in recovering true cluster structures and in computation time. Finally, we apply our method to a depression dataset with no true cluster structure available, from which useful inferences are drawn about possible clustering structures of the data.

  17. Multiple co-clustering based on nonparametric mixture models with heterogeneous marginal distributions

    Science.gov (United States)

    Yoshimoto, Junichiro; Shimizu, Yu; Okada, Go; Takamura, Masahiro; Okamoto, Yasumasa; Yamawaki, Shigeto; Doya, Kenji

    2017-01-01

    We propose a novel method for multiple clustering, which is useful for analysis of high-dimensional data containing heterogeneous types of features. Our method is based on nonparametric Bayesian mixture models in which features are automatically partitioned (into views) for each clustering solution. This feature partition works as feature selection for a particular clustering solution, which screens out irrelevant features. To make our method applicable to high-dimensional data, a co-clustering structure is newly introduced for each view. Further, the outstanding novelty of our method is that we simultaneously model different distribution families, such as Gaussian, Poisson, and multinomial distributions in each cluster block, which widens areas of application to real data. We apply the proposed method to synthetic and real data, and show that our method outperforms other multiple clustering methods both in recovering true cluster structures and in computation time. Finally, we apply our method to a depression dataset with no true cluster structure available, from which useful inferences are drawn about possible clustering structures of the data. PMID:29049392

  18. Cliff´s Delta Calculator: A non-parametric effect size program for two groups of observations

    Directory of Open Access Journals (Sweden)

    Guillermo Macbeth

    2011-05-01

    Full Text Available The Cliff´s Delta statistic is an effect size measure that quantifies the amount of difference between two non-parametric variables beyond p-values interpretation. This measure can be understood as a useful complementary analysis for the corresponding hypothesis testing. During the last two decades the use of effect size measures has been strongly encouraged by methodologists and leading institutions of behavioral sciences. The aim of this contribution is to introduce the Cliff´s Delta Calculator software that performs such analysis and offers some interpretation tips. Differences and similarities with the parametric case are analysed and illustrated. The implementation of this free program is fully described and compared with other calculators. Alternative algorithmic approaches are mathematically analysed and a basic linear algebra proof of its equivalence is formally presented. Two worked examples in cognitive psychology are commented. A visual interpretation of Cliff´s Delta is suggested. Availability, installation and applications of the program are presented and discussed.
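
    The delta statistic itself is simple to compute directly; a minimal sketch (not the calculator program described above, whose interface is its own), with invented observations:

        import numpy as np

        def cliffs_delta(x, y):
            # Cliff's delta: P(X > Y) - P(X < Y) over all pairs, ranges from -1 to 1
            x, y = np.asarray(x, float), np.asarray(y, float)
            diffs = x[:, None] - y[None, :]
            return (np.sum(diffs > 0) - np.sum(diffs < 0)) / (x.size * y.size)

        group_a = [12, 15, 9, 18, 20, 14]      # hypothetical observations
        group_b = [8, 11, 10, 9, 13, 7]
        print(cliffs_delta(group_a, group_b))  # positive: group_a tends to exceed group_b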

  19. Financial cost-benefit analysis of investment possibilities in district heating system on wood residues

    Directory of Open Access Journals (Sweden)

    Stošić Ivan

    2017-01-01

    Full Text Available The purpose of this research is to provide a feasibility analysis of a long-term sustainable development concept for district heating based on wood residues. In this paper, the experimental study has been conducted starting from data collected by field research in the municipality of Trstenik (a town in Serbia with a district heating system currently based on heavy fuel oil and lignite). Using the method of Financial Cost-Benefit Analysis, this study evaluates the financial efficiency of investment in a district heating plant based on wood residues and of energy savings in the district heating system. Findings show that such an investment could be profitable from the financial point of view: the Net Present Value of the investment is positive, the Financial Rate of Return is high (30.69%), and the pay-back period is relatively favourable (7 years). Moreover, the presented SWOT analysis indicates that there are realistic prospects for implementation of district heating based on wood residues. However, this does not mean everything will go smoothly and easily, given the number of challenges inherent in any new district heating concept. Nevertheless, the results of this research could provide useful inputs for decision makers when selecting appropriate models for improving the performance of municipal district heating systems.
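
    The cash flows behind the reported indicators are not given in the record; the sketch below only shows how a Net Present Value, an internal (financial) rate of return and a simple pay-back period relate, using invented figures:

        import numpy as np
        from scipy.optimize import brentq

        cash_flows = np.array([-1_000_000] + [220_000] * 15)  # hypothetical EUR flows, years 0..15
        years = np.arange(len(cash_flows))

        def npv(rate):
            return np.sum(cash_flows / (1.0 + rate) ** years)

        irr = brentq(npv, 1e-6, 1.0)                          # rate at which NPV crosses zero
        payback = int(np.argmax(np.cumsum(cash_flows) >= 0))  # first year cumulative flow >= 0
        print(f"NPV@8% = {npv(0.08):,.0f}, IRR = {irr:.1%}, payback = {payback} years")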

  20. Two-component mixture cure rate model with spline estimated nonparametric components.

    Science.gov (United States)

    Wang, Lu; Du, Pang; Liang, Hua

    2012-09-01

    In some survival analysis of medical studies, there are often long-term survivors who can be considered as permanently cured. The goals in these studies are to estimate the noncured probability of the whole population and the hazard rate of the susceptible subpopulation. When covariates are present as often happens in practice, to understand covariate effects on the noncured probability and hazard rate is of equal importance. The existing methods are limited to parametric and semiparametric models. We propose a two-component mixture cure rate model with nonparametric forms for both the cure probability and the hazard rate function. Identifiability of the model is guaranteed by an additive assumption that allows no time-covariate interactions in the logarithm of hazard rate. Estimation is carried out by an expectation-maximization algorithm on maximizing a penalized likelihood. For inferential purpose, we apply the Louis formula to obtain point-wise confidence intervals for noncured probability and hazard rate. Asymptotic convergence rates of our function estimates are established. We then evaluate the proposed method by extensive simulations. We analyze the survival data from a melanoma study and find interesting patterns for this study. © 2011, The International Biometric Society.

  1. Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios

    OpenAIRE

    Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M

    2004-01-01

    The fortune and the risk of a business venture depends on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behaviors rather than debatable assumptions on models and parameters. Simultaneous dep...
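
    A minimal sketch of the resampling idea, assuming (for illustration only) a moving-block bootstrap over a short history of quarterly log-returns; the cited work's own scenario generator is considerably richer:

        import numpy as np

        rng = np.random.default_rng(42)
        history = np.array([0.012, -0.004, 0.021, 0.008, -0.015,
                            0.017, 0.005, -0.009, 0.011, 0.003])  # hypothetical past returns

        def bootstrap_scenarios(returns, horizon=8, n_scenarios=1000, block=4):
            # resample past returns in blocks to keep some short-range dependence
            starts = rng.integers(0, len(returns) - block + 1, size=(n_scenarios, horizon // block))
            paths = np.stack([np.concatenate([returns[s:s + block] for s in row]) for row in starts])
            return np.cumsum(paths, axis=1)      # cumulative log-return scenario paths

        scenarios = bootstrap_scenarios(history)
        print(scenarios.shape, scenarios[:, -1].mean())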

  2. Liquid paraffin as new dilution medium for the analysis of high boiling point residual solvents with static headspace-gas chromatography.

    Science.gov (United States)

    D'Autry, Ward; Zheng, Chao; Bugalama, John; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Wang, Bochu; Van Schepdael, Ann

    2011-07-15

    Residual solvents are volatile organic compounds which can be present in pharmaceutical substances. A generic static headspace-gas chromatography analysis method for the identification and control of residual solvents is described in the European Pharmacopoeia. Although this method is proved to be suitable for the majority of samples and residual solvents, the method may lack sensitivity for high boiling point residual solvents such as N,N-dimethylformamide, N,N-dimethylacetamide, dimethyl sulfoxide and benzyl alcohol. In this study, liquid paraffin was investigated as new dilution medium for the analysis of these residual solvents. The headspace-gas chromatography method was developed and optimized taking the official Pharmacopoeia method as a starting point. The optimized method was validated according to ICH criteria. It was found that the detection limits were below 1μg/vial for each compound, indicating a drastically increased sensitivity compared to the Pharmacopoeia method, which failed to detect the compounds at their respective limit concentrations. Linearity was evaluated based on the R(2) values, which were above 0.997 for all compounds, and inspection of residual plots. Instrument and method precision were examined by calculating the relative standard deviations (RSD) of repeated analyses within the linearity and accuracy experiments, respectively. It was found that all RSD values were below 10%. Accuracy was checked by a recovery experiment at three different levels. Mean recovery values were all in the range 95-105%. Finally, the optimized method was applied to residual DMSO analysis in four different Kollicoat(®) sample batches. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Residual stress analysis in linear friction welded in-service Inconel 718 superalloy via neutron diffraction and contour method approaches

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M. [University of British Columbia – Okanagan, School of Engineering, 3333 University Way, Kelowna, Canada V1V 1V7 (Canada); Levesque, J.-B. [Institut de recherche d' Hydro-Québec (IREQ), 1800 Lionel-Boulet Blvd., Varennes, Canada J3X 1S1 (Canada); Bichler, L., E-mail: lukas.bichler@ubc.ca [University of British Columbia – Okanagan, School of Engineering, 3333 University Way, Kelowna, Canada V1V 1V7 (Canada); Sediako, D. [Canadian Nuclear Laboratories, Building 459, Station 18, Chalk River, Canada K0J 1J0 (Canada); Gholipour, J.; Wanjara, P. [National Research Council of Canada, Aerospace 5145 Decelles Ave., Montreal, Canada H3T 2B2 (Canada)

    2017-04-13

    In this study, an analysis of elastic residual stress in Inconel{sup ®} 718 (IN 718) linear friction welds (LFWs) was carried out. In particular, the suitability of LFW for manufacturing and repair of aero engine components was emulated by joining virgin and in-service (extracted from a turbine disk) materials. The evolution in the residual strains and stresses in the heat-affected zone (HAZ), thermomechanically affected zone (TMAZ) and dynamically recrystallized zone (DRX) of the weld was characterized using the neutron diffraction and contour methods. The results provided insight into diverse challenges in quantitative analysis of residual stresses in welded IN 718 using diffraction techniques. Specifically, judicious selection of the beam width, height and stress-free lattice spacing were seen to be crucial to minimize measurement error and increase accuracy. Further, the contour method – a destructive technique relying on capturing the stress relaxation after electrical discharge machining – was used to characterize the residual stress distribution on two-dimensional plane sections of the welds. Both techniques suggested an increasing magnitude of residual stress originating from the base metal that reached a peak at the weld interface. Both methods indicated that the peak magnitude of residual stresses were below the yield stress of IN 718.

  4. Bias due to two-stage residual-outcome regression analysis in genetic association studies.

    Science.gov (United States)

    Demissie, Serkalem; Cupples, L Adrienne

    2011-11-01

    Association studies of risk factors and complex diseases require careful assessment of potential confounding factors. Two-stage regression analysis, sometimes referred to as residual- or adjusted-outcome analysis, has been increasingly used in association studies of single nucleotide polymorphisms (SNPs) and quantitative traits. In this analysis, first, a residual-outcome is calculated from a regression of the outcome variable on covariates and then the relationship between the adjusted-outcome and the SNP is evaluated by a simple linear regression of the adjusted-outcome on the SNP. In this article, we examine the performance of this two-stage analysis as compared with multiple linear regression (MLR) analysis. Our findings show that when a SNP and a covariate are correlated, the two-stage approach results in a biased genotypic effect and loss of power. Bias is always toward the null and increases with the squared correlation between the SNP and the covariate. For example, for squared correlations of 0, 0.1, and 0.5, two-stage analysis results in, respectively, 0, 10, and 50% attenuation in the SNP effect. As expected, MLR was always unbiased. Since individual SNPs often show little or no correlation with covariates, a two-stage analysis is expected to perform as well as MLR in many genetic studies; however, it produces considerably different results from MLR and may lead to incorrect conclusions when independent variables are highly correlated. While a useful alternative to MLR when the SNP and covariates are uncorrelated, the two-stage approach has serious limitations. Its use as a simple substitute for MLR should be avoided. © 2011 Wiley Periodicals, Inc.
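
    The attenuation described above is easy to reproduce by simulation; a sketch with a made-up SNP-covariate correlation of 0.7 (squared correlation about 0.5), so the two-stage estimate should be roughly halved while MLR recovers the true effect:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 20_000
        covariate = rng.standard_normal(n)
        snp = 0.7 * covariate + np.sqrt(1 - 0.7 ** 2) * rng.standard_normal(n)
        y = 1.0 * snp + 2.0 * covariate + rng.standard_normal(n)   # true SNP effect = 1.0

        # two-stage: residual of y on the covariate, then regress that residual on the SNP
        resid = sm.OLS(y, sm.add_constant(covariate)).fit().resid
        beta_two_stage = sm.OLS(resid, sm.add_constant(snp)).fit().params[1]

        # multiple linear regression: SNP and covariate fitted jointly
        beta_mlr = sm.OLS(y, sm.add_constant(np.column_stack([snp, covariate]))).fit().params[1]
        print(beta_two_stage, beta_mlr)   # ~0.5 (attenuated) versus ~1.0 (unbiased)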

  5. Analysis of martensitic transformation and residual tension in an 304L stainless steel

    International Nuclear Information System (INIS)

    Alves, Juciane Maria

    2014-01-01

    The relationship between plastic deformation and strain-induced phase transformation, which provides a practical route to the development of new engineering materials with excellent mechanical properties, characterizes the TRIP ('Transformation Induced Plasticity') effect. Among the stainless steels, the metastable 304 L austenitic steel is susceptible to austenite-to-martensite transformation during tensile tests at room temperature under increments of plastic deformation. Knowledge of the evolution of the phase transformation and of the residual stress arising from the different levels and rates of plastic deformation imposed on the material is of great technological and scientific interest. It is also important to evaluate the interference of metallographic preparation in quantitative analyses of this steel. The main techniques used in this study were X-ray diffraction and ferritoscopy for phase quantification, with XRD also used for residual stress analysis. As observed, the phase transformation quantification was not significantly influenced by the metallographic preparation and evolved with the increments of plastic deformation imposed at different stopping loads and strain rates, leading to further strengthening of the austenitic matrix. The evaluation of the residual stress resulting from the martensitic transformation was sensitive to the metallographic preparation, and its value increased in comparison to the sample without metallographic preparation. It was also observed that the residual stress decreased with the increase of the fraction of transformed martensite. (author)

  6. Non-parametric transformation for data correlation and integration: From theory to practice

    Energy Technology Data Exchange (ETDEWEB)

    Datta-Gupta, A.; Xue, Guoping; Lee, Sang Heon [Texas A& M Univ., College Station, TX (United States)

    1997-08-01

    The purpose of this paper is two-fold. First, we introduce the use of non-parametric transformations for correlating petrophysical data during reservoir characterization. Such transformations are completely data driven and do not require a priori functional relationship between response and predictor variables which is the case with traditional multiple regression. The transformations are very general, computationally efficient and can easily handle mixed data types for example, continuous variables such as porosity, permeability and categorical variables such as rock type, lithofacies. The power of the non-parametric transformation techniques for data correlation has been illustrated through synthetic and field examples. Second, we utilize these transformations to propose a two-stage approach for data integration during heterogeneity characterization. The principal advantages of our approach over traditional cokriging or cosimulation methods are: (1) it does not require a linear relationship between primary and secondary data, (2) it exploits the secondary information to its fullest potential by maximizing the correlation between the primary and secondary data, (3) it can be easily applied to cases where several types of secondary or soft data are involved, and (4) it significantly reduces variance function calculations and thus, greatly facilitates non-Gaussian cosimulation. We demonstrate the data integration procedure using synthetic and field examples. The field example involves estimation of pore-footage distribution using well data and multiple seismic attributes.

  7. Reciprocally coupled residues crucial for protein kinase Pak2 activity calculated by statistical coupling analysis.

    Directory of Open Access Journals (Sweden)

    Yuan-Hao Hsu

    2010-03-01

    Full Text Available Regulation of Pak2 activity involves at least two mechanisms: (i) phosphorylation of the conserved Thr(402) in the activation loop and (ii) interaction of the autoinhibitory domain (AID) with the catalytic domain. We collected 482 human protein kinase sequences from the kinome database and globally mapped the evolutionary interactions of the residues in the catalytic domain with Thr(402) by sequence-based statistical coupling analysis (SCA). Perturbation of Thr(402) (34.6%) suggests a communication pathway between Thr(402) in the activation loop, and Phe(387) (ΔΔE(387F,402T) = 2.80) in the magnesium positioning loop, Trp(427) (ΔΔE(427W,402T) = 3.12) in the F-helix, and Val(404) (ΔΔE(404V,402T) = 4.43) and Gly(405) (ΔΔE(405G,402T) = 2.95) in the peptide positioning loop. When compared to the cAMP-dependent protein kinase (PKA) and Src, the perturbation pattern of threonine phosphorylation in the activation loop of Pak2 is similar to that of PKA, and different from the tyrosine phosphorylation pattern of Src. Reciprocal coupling analysis by SCA showed that the residues perturbed by Thr(402) and the reciprocal coupling pairs formed a network centered at Trp(427) in the F-helix. Nine pairs of reciprocal coupling residues crucial for enzymatic activity and structural stabilization were identified. Pak2, PKA and Src share four pairs. Reciprocal coupling residues exposed to the solvent line up as an activation groove. This is the inhibitor (PKI) binding region in PKA and the activation groove for Pak2. This indicates these evolutionarily conserved residues are crucial for the catalytic activity of PKA and Pak2.

  8. Experimental stress analysis for determination of residual stresses and integrity monitoring of components and systems

    International Nuclear Information System (INIS)

    1993-01-01

    For an analysis of the safety-related significance of residual stresses, mechanical, magnetic as well as ultrasonic and diffraction methods can be applied as testing methods. The results of an interlaboratory test concerning the experimental determination of residual stresses in a railway track are included. Further, questions are analyzed concerning the in-service inspections of components and systems with regard to their operational safety and life. Measurement methods are explained by examples from power plant engineering, nuclear power plant engineering, construction and traffic engineering as well as aeronautics. (DG) [de]

  9. Chemometric classification of gunshot residues based on energy dispersive X-ray microanalysis and inductively coupled plasma analysis with mass-spectrometric detection

    International Nuclear Information System (INIS)

    Steffen, S.; Otto, M.; Niewoehner, L.; Barth, M.; Brozek-Mucha, Z.; Biegstraaten, J.; Horvath, R.

    2007-01-01

    A gunshot residue sample collected from an object or a suspected person is automatically searched for gunshot-residue-relevant particles. Particle data (such as size, morphology, and position on the sample for manual relocation) as well as the corresponding X-ray spectra and images are stored. According to these data, particles are classified by the analysis software into different groups: 'gunshot residue characteristic', 'consistent with gunshot residue' and environmental particles, respectively. Potential gunshot residue particles are manually checked and, if necessary, confirmed by the operating forensic scientist. As there are continuing developments on the ammunition market worldwide, it becomes more and more difficult to assign a detected particle to a particular ammunition brand, and differentiation from environmental particles similar to gunshot residue is becoming more complex. To keep external conditions unchanged, gunshot residue particles were collected using a specially designed shooting device that ensured defined shooting distances between the weapon's muzzle and the target for the test shots. The data obtained as X-ray spectra of a number of particles (3000 per ammunition brand) were reduced by Fast Fourier Transformation and subjected to chemometric evaluation by means of regularized discriminant analysis. In addition to the scanning electron microscopy with energy dispersive X-ray microanalysis results, isotope ratio measurements based on inductively coupled plasma analysis with mass-spectrometric detection were carried out to provide a supplementary feature for an even lower risk of misclassification.
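
    A rough sketch of the chemometric workflow described above (not the authors' code): each X-ray spectrum is reduced to a handful of Fourier coefficients and fed to a shrinkage-regularized discriminant classifier. scikit-learn's shrinkage LDA is used here as a stand-in for regularized discriminant analysis, and the spectra and ammunition labels are random placeholders.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        def fft_features(spectra, n_coeff=32):
            """Reduce each spectrum to the magnitudes of its leading FFT coefficients."""
            return np.abs(np.fft.rfft(spectra, axis=1))[:, :n_coeff]

        # placeholders: rows of `spectra` are per-particle EDX spectra,
        # `ammo_brand` is the ammunition label of each particle
        rng = np.random.default_rng(1)
        spectra = rng.random((300, 1024))
        ammo_brand = rng.integers(0, 3, 300)

        X = fft_features(spectra)
        clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")  # regularized DA stand-in
        print(cross_val_score(clf, X, ammo_brand, cv=5).mean())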

  10. Chemometric classification of gunshot residues based on energy dispersive X-ray microanalysis and inductively coupled plasma analysis with mass-spectrometric detection

    Energy Technology Data Exchange (ETDEWEB)

    Steffen, S. [Bundeskriminalamt (BKA), Forensic Science Institute KT23, Thaerstr. 11, D - 65193 Wiesbaden (Germany); Otto, M. [TU Bergakademie Freiberg (TU BAF), Institute for Analytical Chemistry, Leipziger Str. 29, D - 09599 Freiberg (Germany)], E-mail: matthias.otto@chemie.tu-freiberg.de; Niewoehner, L.; Barth, M. [Bundeskriminalamt (BKA), Forensic Science Institute KT23, Thaerstr. 11, D - 65193 Wiesbaden (Germany); Brozek-Mucha, Z. [Instytut Ekspertyz Sadowych (IES), Westerplatte St. 9, PL - 31-033 Krakow (Poland); Biegstraaten, J. [Nederlands Forensisch Instituut (NFI), Fysische Technologie, Laan van Ypenburg 6, NL-2497 GB Den Haag (Netherlands); Horvath, R. [Kriminalisticky a Expertizny Ustav (KEU PZ), Institute of Forensic Science, Sklabinska 1, SK - 812 72 Bratislava (Slovakia)

    2007-09-15

    A gunshot residue sample collected from an object or a suspected person is automatically searched for gunshot-residue-relevant particles. Particle data (such as size, morphology, and position on the sample for manual relocation) as well as the corresponding X-ray spectra and images are stored. According to these data, particles are classified by the analysis software into different groups: 'gunshot residue characteristic', 'consistent with gunshot residue' and environmental particles, respectively. Potential gunshot residue particles are manually checked and, if necessary, confirmed by the operating forensic scientist. As there are continuing developments on the ammunition market worldwide, it becomes more and more difficult to assign a detected particle to a particular ammunition brand, and differentiation from environmental particles similar to gunshot residue is becoming more complex. To keep external conditions unchanged, gunshot residue particles were collected using a specially designed shooting device that ensured defined shooting distances between the weapon's muzzle and the target for the test shots. The data obtained as X-ray spectra of a number of particles (3000 per ammunition brand) were reduced by Fast Fourier Transformation and subjected to chemometric evaluation by means of regularized discriminant analysis. In addition to the scanning electron microscopy with energy dispersive X-ray microanalysis results, isotope ratio measurements based on inductively coupled plasma analysis with mass-spectrometric detection were carried out to provide a supplementary feature for an even lower risk of misclassification.

  11. A spatio-temporal nonparametric Bayesian variable selection model of fMRI data for clustering correlated time courses.

    Science.gov (United States)

    Zhang, Linlin; Guindani, Michele; Versace, Francesco; Vannucci, Marina

    2014-07-15

    In this paper we present a novel wavelet-based Bayesian nonparametric regression model for the analysis of functional magnetic resonance imaging (fMRI) data. Our goal is to provide a joint analytical framework that allows us to detect regions of the brain which exhibit neuronal activity in response to a stimulus and, simultaneously, to infer the association, or clustering, of spatially remote voxels that exhibit fMRI time series with similar characteristics. We start by modeling the data with a hemodynamic response function (HRF) with a voxel-dependent shape parameter. We detect regions of the brain activated in response to a given stimulus by using mixture priors with a spike at zero on the coefficients of the regression model. We account for the complex spatial correlation structure of the brain by using a Markov random field (MRF) prior on the parameters guiding the selection of the activated voxels, thereby capturing correlation among nearby voxels. In order to infer association of the voxel time courses, we assume correlated errors, in particular long memory, and exploit the whitening properties of discrete wavelet transforms. Furthermore, we achieve clustering of the voxels by imposing a Dirichlet process (DP) prior on the parameters of the long memory process. For inference, we use Markov Chain Monte Carlo (MCMC) sampling techniques that combine Metropolis-Hastings schemes employed in Bayesian variable selection with sampling algorithms for nonparametric DP models. We explore the performance of the proposed model on simulated data, with both block- and event-related design, and on real fMRI data. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. The interaction of fatigue cracks with a residual stress field using thermoelastic stress analysis and synchrotron X-ray diffraction experiments

    Science.gov (United States)

    Amjad, Khurram; Asquith, David; Sebastian, Christopher M.; Wang, Wei-Chung

    2017-01-01

    This article presents an experimental study on the fatigue behaviour of cracks emanating from cold-expanded holes utilizing thermoelastic stress analysis (TSA) and synchrotron X-ray diffraction (SXRD) techniques, with the aim of resolving the long-standing ambiguity in the literature regarding potential relaxation, or modification, of beneficial compressive residual stresses as a result of fatigue crack propagation. The crack growth rates were found to be substantially lower as the crack tip moved through the residual stress zone induced by cold expansion. The TSA results demonstrated that the crack tip plastic zones were reduced in size by the presence of the residual compressive stresses induced by cold expansion. The crack tip plastic zones were found to be insignificant in size in comparison to the residual stress zone resulting from cold expansion, which implied that they were unlikely to have had a notable impact on the surrounding residual stresses induced by cold expansion. The residual stress distributions measured along the direction of crack growth, using SXRD, showed no signs of any significant stress relaxation or redistribution, which validates the conclusions drawn from the TSA data. Fractographic analysis qualitatively confirmed the influence of the residual stresses induced by cold expansion on crack initiation. It was found that the application of a single compressive overload caused a relaxation, or reduction, in the residual stresses, which has wider implications for improving the fatigue life. PMID:29291095

  13. Reactivity of Athabasca residue and of its SARA fractions during residue hydroconversion

    Energy Technology Data Exchange (ETDEWEB)

    Verstraete, J.; Danial-Fortain, P.; Gauthier, T.; Merdrignac, I. [IFP-Lyon, Vermaison (France); Budzinski, H. [Bordeaux Univ. (France). ISM-LPTC, UMR CNRS

    2009-07-01

    Residue conversion processes are becoming increasingly important because of the declining market for residual fuel oil and a greater demand for middle distillates. Ebullated-bed hydroconversion is a commercially proven technology for converting heavy feedstocks with high amounts of impurities. The process enables the conversion of atmospheric or vacuum residues at temperatures up to 440 degrees C, and at liquid hourly space velocity (LHSV) conditions in the range of 0.15 to 0.5 per hour. A 540 degrees C conversion of up to 80 weight per cent can be achieved under these conditions. This paper reported on a research study conducted at IFP Lyon in which the residue hydroconversion in a large-scale ebullated bed bench unit was investigated to determine the impact of operating conditions and feed properties on yield and product qualities. Hydrogen was added to the feed in the bench units to keep a high hydrogen partial pressure and favour the catalytic hydroconversion reactions. In a typical test, the reactor was fed with 50 g of feedstock and 0.45 g of crushed equilibrium industrial NiMo catalyst, pressurized hydrogen and quickly heated at the reaction temperature. This paper also discussed the conversion of Athabasca bitumen residue in the large-scale pilot plant and also in the small scale batch reactor. The effect of operating temperature and space velocity was examined. The reactivity of the saturates, aromatics, resins and asphaltenes (SARA) fractions of the bitumen was studied separately in order to better understand the conversion mechanisms and reactivities. The Athabasca bitumen feed and SARA fractions were also analyzed in terms of standard petroleum analysis, SARA fractionation, elemental analysis, size exclusion chromatography (SEC) and 13C NMR. Hydroconversion experiments were conducted in the batch unit at different reaction temperatures and reaction times. A comparison of small-scale batch results with those obtained with the continuous large-scale bench

  14. Obtention of ceramic pigments with residue from electroplating

    International Nuclear Information System (INIS)

    Boss, A.; Kniess, C.T.; Aguiar, B.M. de; Prates, P.B.; Milanez, K.

    2011-01-01

    The incorporation of industrial residues in industrial processes opens up new business opportunities and reduces the volume of extraction of raw materials, preserving natural resources, which are limited. An important residue is the mud from galvanic industry, consisting of alkali and transition metals. According to NBR 10004/2004, this residue can be classified as Class I (hazardous), depending on the concentration of metals present in the mud. This paper proposes a method for reusing the residue from electroplating in ceramic pigments. The characterization of residual plating was obtained by chemical analysis, mineralogical analysis and pH measurements. The electroplating waste was incorporated in different percentages on a standard pigment formula of industrial ceramic, consisting mainly of Zn, Fe and Cr. The obtained pigments were applied in ceramic glazes to colorimetric and visual analysis, which showed good results with the addition of up to 15% of industrial waste. (author)

  15. Forensic Analysis of High Explosive Residues from Selected Cloth

    International Nuclear Information System (INIS)

    Mohamad Afiq Mohamed Huri; Umi Kalthom Ahmad

    2014-01-01

    Increased terrorist activities around the Asian region have resulted in the need for improved analytical techniques in forensic analysis. High explosive residues from post-blast clothing are often encountered as physical evidence submitted to a forensic laboratory. Therefore, this study was initiated to detect high explosives residues of cyclotrimethylenetrinitramine (RDX) and pentaerythritol tetranitrate (PETN) on selected cloth in this study. Cotton swabbing technique was employed as a simple and rapid method in recovering analytes from the sample matrix. Analytes were analyzed using Griess spot test, TLC and HPLC. TLC separation employed toluene-ethyl acetate (9:1) as a good solvent system. Reversed phase HPLC separation employed acetonitrile-water (65:35) as the mobile phase and analytes detected using a programmed wavelength. RDX was detected at 235 nm for the first 3.5 min and then switched to 215 nm for PETN. Limits of detection (LODs) of analytes were in the low ppm range (0.05 ppm for RDX and 0.25 ppm for PETN). Analyte recovery studies revealed that the type of cloth has a profound effect on the extraction efficiency. Analytes were recovered better for nylon as compared to cotton cloth. However, no analytes could be recovered from denim cloth. For post-blast samples, only RDX was detected in low concentration for both nylon and cotton cloth. (author)

  16. Nonparametric Identification of Glucose-Insulin Process in IDDM Patient with Multi-meal Disturbance

    Science.gov (United States)

    Bhattacharjee, A.; Sutradhar, A.

    2012-12-01

    Modern closed-loop control of the blood glucose level in a diabetic patient necessarily uses an explicit model of the process. A fixed-parameter full-order or reduced-order model does not characterize the inter-patient and intra-patient parameter variability. This paper deals with a frequency-domain nonparametric identification of the nonlinear glucose-insulin process in an insulin-dependent diabetes mellitus patient that captures the process dynamics in the presence of uncertainties and parameter variations. An online frequency-domain kernel estimation method is proposed that uses input-output data from a 19th-order first-principles model of the patient via the intravenous route. Volterra equations up to second-order kernels, with an extended input vector for a Hammerstein model, are solved online by an adaptive recursive least squares (ARLS) algorithm. The frequency-domain kernels are estimated using harmonic excitation input data sequences from the virtual patient model. A short filter memory length of M = 2 was found sufficient to yield acceptable accuracy with less computation time. The nonparametric models are useful for closed-loop control, where the frequency-domain kernels can be directly used as the transfer function. The validation results show good fit in both frequency- and time-domain responses with the nominal patient as well as with parameter variations.
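
    A minimal sketch of the exponentially weighted recursive least squares update that an ARLS-type estimator applies to each new sample; the construction of the extended Volterra/Hammerstein regressor vector is only hinted at, and the dimensions and toy system used here are assumptions.

        import numpy as np

        class RecursiveLeastSquares:
            """Standard exponentially weighted RLS, a sketch of the ARLS step."""

            def __init__(self, n_params, forgetting=0.99, delta=1e3):
                self.theta = np.zeros(n_params)       # kernel coefficients
                self.P = np.eye(n_params) * delta     # inverse correlation matrix
                self.lam = forgetting

            def update(self, x, y):
                x = np.asarray(x, dtype=float)
                Px = self.P @ x
                gain = Px / (self.lam + x @ Px)
                self.theta += gain * (y - x @ self.theta)
                self.P = (self.P - np.outer(gain, Px)) / self.lam
                return self.theta

        # toy use: identify a 3-parameter linear-in-parameters model from streaming data
        rng = np.random.default_rng(0)
        true_theta = np.array([0.5, -1.2, 2.0])
        rls = RecursiveLeastSquares(n_params=3)
        for _ in range(500):
            x = rng.normal(size=3)        # e.g. an extended input vector of model terms
            y = x @ true_theta + rng.normal(scale=0.01)
            rls.update(x, y)
        print(rls.theta)                  # close to true_theta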

  17. Evaluation of residue-residue contact prediction in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-08-31

    We present the results of the assessment of the intramolecular residue-residue contact predictions from 26 prediction groups participating in the 10th round of the CASP experiment. The most recently developed direct coupling analysis methods did not take part in the experiment likely because they require a very deep sequence alignment not available for any of the 114 CASP10 targets. The performance of contact prediction methods was evaluated with the measures used in previous CASPs (i.e., prediction accuracy and the difference between the distribution of the predicted contacts and that of all pairs of residues in the target protein), as well as new measures, such as the Matthews correlation coefficient, the area under the precision-recall curve and the ranks of the first correctly and incorrectly predicted contact. We also evaluated the ability to detect interdomain contacts and tested whether the difficulty of predicting contacts depends upon the protein length and the depth of the family sequence alignment. The analyses were carried out on the target domains for which structural homologs did not exist or were difficult to identify. The evaluation was performed for all types of contacts (short, medium, and long-range), with emphasis placed on long-range contacts, i.e. those involving residues separated by at least 24 residues along the sequence. The assessment suggests that the best CASP10 contact prediction methods perform at approximately the same level, and comparably to those participating in CASP9.
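
    For illustration only (not the assessors' scripts), the newer measures mentioned above can be computed from a vector of predicted contact probabilities and the corresponding true contact labels roughly as follows; the toy arrays and the 0.5 probability threshold are placeholders.

        import numpy as np
        from sklearn.metrics import matthews_corrcoef, precision_recall_curve, auc

        rng = np.random.default_rng(0)
        true_contacts = rng.integers(0, 2, 1000)                      # 1 = residue pair in contact
        pred_prob = np.clip(0.3 * true_contacts + rng.random(1000) * 0.7, 0, 1)

        # Matthews correlation coefficient at a fixed probability threshold
        mcc = matthews_corrcoef(true_contacts, (pred_prob > 0.5).astype(int))

        # area under the precision-recall curve
        precision, recall, _ = precision_recall_curve(true_contacts, pred_prob)
        pr_auc = auc(recall, precision)
        print(mcc, pr_auc)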

  18. Bayesian Analysis for Penalized Spline Regression Using WinBUGS

    Directory of Open Access Journals (Sweden)

    Ciprian M. Crainiceanu

    2005-09-01

    Full Text Available Penalized splines can be viewed as BLUPs in a mixed model framework, which allows the use of mixed model software for smoothing. Thus, software originally developed for Bayesian analysis of mixed models can be used for penalized spline regression. Bayesian inference for nonparametric models enjoys the flexibility of nonparametric models and the exact inference provided by the Bayesian inferential machinery. This paper provides a simple, yet comprehensive, set of programs for the implementation of nonparametric Bayesian analysis in WinBUGS. Good mixing properties of the MCMC chains are obtained by using low-rank thin-plate splines, while simulation times per iteration are reduced employing WinBUGS specific computational tricks.
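
    A compact numpy sketch of the low-rank penalized spline idea, written here as ridge regression on a truncated-line basis with a fixed smoothing parameter rather than the mixed-model/MCMC estimation described in the paper; the knot count, penalty and toy data are assumptions of this sketch.

        import numpy as np

        def pspline_fit(x, y, n_knots=20, lam=1.0):
            """Penalized spline with a truncated-line basis and ridge penalty (sketch)."""
            knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
            X = np.column_stack([np.ones_like(x), x,                    # unpenalized part
                                 np.maximum(x[:, None] - knots, 0.0)])  # penalized spline part
            D = np.diag([0.0, 0.0] + [1.0] * n_knots)                   # penalize only spline coefs
            beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
            return X @ beta

        rng = np.random.default_rng(0)
        x = np.sort(rng.uniform(0, 1, 200))
        y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 200)
        fit = pspline_fit(x, y)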

  19. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...
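
    As a small illustration of the countable (stick-breaking) mixture representation that several of these processes exploit, the following sketch draws a truncated set of Dirichlet process weights; the truncation level and concentration parameter are arbitrary choices, not values from the book.

        import numpy as np

        def stick_breaking_weights(alpha, n_atoms, rng=np.random.default_rng()):
            """Truncated stick-breaking construction of Dirichlet process weights."""
            v = rng.beta(1.0, alpha, size=n_atoms)
            remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
            return v * remaining

        w = stick_breaking_weights(alpha=2.0, n_atoms=50)
        print(w.sum())   # close to 1 for a large enough truncation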

  20. Residual energy applications program systems analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Yngve, P.W.

    1980-10-01

    Current DOE plans call for building an Energy Applied Systems Test (EAST) Facility at the Savannah River Plant in close proximity to the 140 to 150°F waste heat from one of several operating nuclear reactors. The waste water flow from each reactor, approximately 165,000 gpm, provides a unique opportunity to test the performance and operating characteristics of large-scale waste heat power generation and heat pump system concepts. This report provides a preliminary description of the potential end-use market, parametric data on heat pump and power generation system technology, a preliminary listing of EAST Facility requirements, and an example of an integrated industrial park utilizing the technology to maximize economic payback. The parametric heat pump analysis concluded that dual-fluid Rankine cycle heat pumps with capacities as high as 400 × 10⁶ Btu/h can utilize large sources of low temperature residual heat to provide 300°F saturated steam for an industrial park. The before-tax return on investment for this concept is 36.2%. The analysis also concluded that smaller modular heat pumps could fulfill the same objective while sacrificing only a moderate rate of return. The parametric power generation analysis concluded that multi-pressure Rankine cycle systems are not only superior to single-pressure systems, but can also be developed for large systems (approximately 17 MWe). This same technology is applicable to smaller systems at the sacrifice of higher investment per unit output.

  1. Bayesian nonparametric areal wombling for small-scale maps with an application to urinary bladder cancer data from Connecticut.

    Science.gov (United States)

    Guhaniyogi, Rajarshi

    2017-11-10

    With increasingly abundant spatial data in the form of case counts or rates combined over areal regions (e.g., ZIP codes, census tracts, or counties), interest turns to formal identification of difference "boundaries," or barriers on the map, in addition to the estimated statistical map itself. "Boundary" refers to a border that describes vastly disparate outcomes in the adjacent areal units, perhaps caused by latent risk factors. This article focuses on developing a model-based statistical tool, equipped to identify difference boundaries in maps with a small number of areal units, also referred to as small-scale maps. This article proposes a novel and robust nonparametric boundary detection rule based on nonparametric Dirichlet processes, later referred to as the Dirichlet process wombling (DPW) rule, by employing Dirichlet process-based mixture models for small-scale maps. Unlike the recently proposed nonparametric boundary detection rules based on false discovery rates, the DPW rule is free of ad hoc parameters, computationally simple, and readily implementable in freely available software for public health practitioners such as JAGS and OpenBUGS, and yet provides statistically interpretable boundary detection in small-scale wombling. We offer a detailed simulation study and an application of our proposed approach to a urinary bladder cancer incidence rates dataset between 1990 and 2012 in the 8 counties in Connecticut. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Extending the linear model with R generalized linear, mixed effects and nonparametric regression models

    CERN Document Server

    Faraway, Julian J

    2005-01-01

    Linear models are central to the practice of statistics and form the foundation of a vast range of statistical methodologies. Julian J. Faraway's critically acclaimed Linear Models with R examined regression and analysis of variance, demonstrated the different methods available, and showed in which situations each one applies. Following in those footsteps, Extending the Linear Model with R surveys the techniques that grow from the regression model, presenting three extensions to that framework: generalized linear models (GLMs), mixed effect models, and nonparametric regression models. The author's treatment is thoroughly modern and covers topics that include GLM diagnostics, generalized linear mixed models, trees, and even the use of neural networks in statistics. To demonstrate the interplay of theory and practice, throughout the book the author weaves the use of the R software environment to analyze the data of real examples, providing all of the R commands necessary to reproduce the analyses. All of the ...

  3. Nonparametric estimation for censored mixture data with application to the Cooperative Huntington's Observational Research Trial.

    Science.gov (United States)

    Wang, Yuanjia; Garcia, Tanya P; Ma, Yanyuan

    2012-01-01

    This work presents methods for estimating genotype-specific distributions from genetic epidemiology studies where the event times are subject to right censoring, the genotypes are not directly observed, and the data arise from a mixture of scientifically meaningful subpopulations. Examples of such studies include kin-cohort studies and quantitative trait locus (QTL) studies. Current methods for analyzing censored mixture data include two types of nonparametric maximum likelihood estimators (NPMLEs) which do not make parametric assumptions on the genotype-specific density functions. Although both NPMLEs are commonly used, we show that one is inefficient and the other inconsistent. To overcome these deficiencies, we propose three classes of consistent nonparametric estimators which do not assume parametric density models and are easy to implement. They are based on the inverse probability weighting (IPW), augmented IPW (AIPW), and nonparametric imputation (IMP). The AIPW achieves the efficiency bound without additional modeling assumptions. Extensive simulation experiments demonstrate satisfactory performance of these estimators even when the data are heavily censored. We apply these estimators to the Cooperative Huntington's Observational Research Trial (COHORT), and provide age-specific estimates of the effect of mutation in the Huntington gene on mortality using a sample of family members. The close approximation of the estimated non-carrier survival rates to that of the U.S. population indicates small ascertainment bias in the COHORT family sample. Our analyses underscore an elevated risk of death in Huntington gene mutation carriers compared to non-carriers for a wide age range, and suggest that the mutation equally affects survival rates in both genders. The estimated survival rates are useful in genetic counseling for providing guidelines on interpreting the risk of death associated with a positive genetic testing, and in facilitating future subjects at risk

  4. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  5. Analysing the length of care episode after hip fracture: a nonparametric and a parametric Bayesian approach.

    Science.gov (United States)

    Riihimäki, Jaakko; Sund, Reijo; Vehtari, Aki

    2010-06-01

    Effective utilisation of limited resources is a challenge for health care providers. Accurate and relevant information extracted from the length of stay distributions is useful for management purposes. Patient care episodes can be reconstructed from the comprehensive health registers, and in this paper we develop a Bayesian approach to analyse the length of care episode after a fractured hip. We model the large scale data with a flexible nonparametric multilayer perceptron network and with a parametric Weibull mixture model. To assess the performances of the models, we estimate expected utilities using predictive density as a utility measure. Since the model parameters cannot be directly compared, we focus on observables, and estimate the relevances of patient explanatory variables in predicting the length of stay. To demonstrate how the use of the nonparametric flexible model is advantageous for this complex health care data, we also study joint effects of variables in predictions, and visualise nonlinearities and interactions found in the data.

  6. Triangles in ROC space: History and theory of "nonparametric" measures of sensitivity and response bias.

    Science.gov (United States)

    Macmillan, N A; Creelman, C D

    1996-06-01

    Can accuracy and response bias in two-stimulus, two-response recognition or detection experiments be measured nonparametrically? Pollack and Norman (1964) answered this question affirmatively for sensitivity, Hodos (1970) for bias: Both proposed measures based on triangular areas in receiver-operating characteristic space. Their papers, and especially a paper by Grier (1971) that provided computing formulas for the measures, continue to be heavily cited in a wide range of content areas. In our sample of articles, most authors described triangle-based measures as making fewer assumptions than measures associated with detection theory. However, we show that statistics based on products or ratios of right triangle areas, including a recently proposed bias index and a not-yet-proposed but apparently plausible sensitivity index, are consistent with a decision process based on logistic distributions. Even the Pollack and Norman measure, which is based on non-right triangles, is approximately logistic for low values of sensitivity. Simple geometric models for sensitivity and bias are not nonparametric, even if their implications are not acknowledged in the defining publications.
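
    For reference, the triangle-based indices discussed here are usually computed from the hit rate H and false-alarm rate F as sketched below (shown only for the common case H ≥ F); this is a sketch of the textbook formulas for Pollack and Norman's A' and Grier's B'', not code from the article.

        def a_prime(hit, fa):
            """Pollack & Norman (1964) sensitivity index A' (case hit >= fa)."""
            return 0.5 + (hit - fa) * (1.0 + hit - fa) / (4.0 * hit * (1.0 - fa))

        def b_double_prime(hit, fa):
            """Grier (1971) bias index B'' (case hit >= fa)."""
            return (hit * (1.0 - hit) - fa * (1.0 - fa)) / (hit * (1.0 - hit) + fa * (1.0 - fa))

        print(a_prime(0.8, 0.2), b_double_prime(0.8, 0.2))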

  7. Residual nilpotence and residual solubility of groups

    International Nuclear Information System (INIS)

    Mikhailov, R V

    2005-01-01

    The properties of the residual nilpotence and the residual solubility of groups are studied. The main objects under investigation are the class of residually nilpotent groups such that each central extension of these groups is also residually nilpotent and the class of residually soluble groups such that each Abelian extension of these groups is residually soluble. Various examples of groups not belonging to these classes are constructed by homological methods and methods of the theory of modules over group rings. Several applications of the theory under consideration are presented and problems concerning the residual nilpotence of one-relator groups are considered.

  8. Impact-disrupted gunshot residue: A sub-micron analysis using a novel collection protocol

    Directory of Open Access Journals (Sweden)

    V. Spathis

    2017-06-01

    Full Text Available The analysis of gunshot residue (GSR) has played an integral role within the legal system in relation to shooting cases. With a characteristic elemental composition of lead, antimony and barium, and a typically discriminative spheroidal morphology, the presence and distribution of GSR can aid in firearm investigations. In this experiment, three shots of low-velocity rim-fire ammunition were fired over polished silicon collection substrates placed at six intervals over a 100 cm range. The samples were analysed using a Field Emission Gun Scanning Electron Microscope (FEG-SEM) in conjunction with an X-flash Energy Dispersive X-ray (EDX) detector, allowing for GSR particle analyses of composition and structure at the sub-micron level. The results of this experiment indicate that although classic spheroidal particles are present consistently throughout the entire range of samples, their sizes vary significantly, and at certain distances from the firearm particles with an irregular morphology were discerned, forming "impact-disrupted" GSR particles, henceforth colloquially referred to as "splats". Upon further analysis, trends with regard to the formation of these splat particles were distinguished. An increase in splat frequency was observed starting at 10 cm from the firearm, with a splat density of 147 mm⁻², reaching a maximal flux at 40 cm (451 mm⁻²), followed by a gradual decrease to the maximum range sampled. Moreover, the structural morphology of the splats changes throughout the sampling range. At the distances closest to the firearm, molten-looking particles were formed, demonstrating that the metallic residues were in a liquid state when their flight path was disrupted. However, at increased distances, primarily where the discharge plume was at maximum dispersion and moving away from the firearm, the residues had time to cool in flight, resulting in semi-congealed and solid particles that subsequently disrupted upon impact, forming more

  9. Single particle analysis of ice crystal residuals observed in orographic wave clouds over Scandinavia during INTACC experiment

    Directory of Open Access Journals (Sweden)

    A. C. Targino

    2006-01-01

    Full Text Available Individual ice crystal residual particles collected over Scandinavia during the INTACC (INTeraction of Aerosol and Cold Clouds) experiment in October 1999 were analyzed by Scanning Electron Microscopy (SEM) equipped with Energy-Dispersive X-ray Analysis (EDX). Samples were collected onboard the British Met Office Hercules C-130 aircraft using a Counterflow Virtual Impactor (CVI). This study is based on six samples collected in orographic clouds. The main aim of this study is to characterize cloud residual elemental composition in conditions affected by different airmasses. In total 609 particles larger than 0.1 μm diameter were analyzed and their elemental composition and morphology were determined. Thereafter a hierarchical cluster analysis was performed on the signal detected with SEM-EDX in order to identify the major particle classes and their abundance. A cluster containing mineral dust, represented by aluminosilicates, Fe-rich and Si-rich particles, was the dominating class of particles, accounting for about 57.5% of the particles analyzed, followed by low-Z particles, 23.3% (presumably organic material), and sea salt (6.7%). Sulfur was detected often across all groups, indicating ageing and in-cloud processing of particles. A detailed inspection of individual samples unveiled a relationship between ice crystal residual composition and airmass origin. Cloud residual samples from clean airmasses (that is, trajectories confined to the Atlantic and Arctic Oceans and/or with source altitude in the free troposphere) were dominated primarily by low-Z and sea salt particles, while continentally-influenced airmasses (with trajectories that originated or traveled over continental areas and with source altitude in the continental boundary layer) contained mainly mineral dust residuals. Comparison of residual composition for similar cloud ambient temperatures around –27°C revealed that supercooled clouds are more likely to persist in conditions where
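
    A rough sketch of the hierarchical clustering step applied to per-particle elemental compositions; the placeholder data, Ward linkage and the choice of four clusters are assumptions of this sketch, not settings reported in the study.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # placeholder: rows = particles, columns = relative EDX signal per element
        rng = np.random.default_rng(0)
        compositions = rng.random((609, 8))
        compositions /= compositions.sum(axis=1, keepdims=True)

        Z = linkage(compositions, method="ward")
        labels = fcluster(Z, t=4, criterion="maxclust")   # e.g. dust, low-Z, sea salt, other
        counts = np.bincount(labels)[1:]
        print(counts / counts.sum())                      # relative abundance of each class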

  10. Nonparametric Tree-Based Predictive Modeling of Storm Outages on an Electric Distribution Network.

    Science.gov (United States)

    He, Jichao; Wanik, David W; Hartman, Brian M; Anagnostou, Emmanouil N; Astitha, Marina; Frediani, Maria E B

    2017-03-01

    This article compares two nonparametric tree-based models, quantile regression forests (QRF) and Bayesian additive regression trees (BART), for predicting storm outages on an electric distribution network in Connecticut, USA. We evaluated point estimates and prediction intervals of outage predictions for both models using high-resolution weather, infrastructure, and land use data for 89 storm events (including hurricanes, blizzards, and thunderstorms). We found that spatially BART predicted more accurate point estimates than QRF. However, QRF produced better prediction intervals for high spatial resolutions (2-km grid cells and towns), while BART predictions aggregated to coarser resolutions (divisions and service territory) more effectively. We also found that the predictive accuracy was dependent on the season (e.g., tree-leaf condition, storm characteristics), and that the predictions were most accurate for winter storms. Given the merits of each individual model, we suggest that BART and QRF be implemented together to show the complete picture of a storm's potential impact on the electric distribution network, which would allow for a utility to make better decisions about allocating prestorm resources. © 2016 Society for Risk Analysis.

  11. A comparison of parametric and nonparametric methods for normalising cDNA microarray data.

    Science.gov (United States)

    Khondoker, Mizanur R; Glasbey, Chris A; Worton, Bruce J

    2007-12-01

    Normalisation is an essential first step in the analysis of most cDNA microarray data, to correct for effects arising from imperfections in the technology. Loess smoothing is commonly used to correct for trends in log-ratio data. However, parametric models, such as the additive plus multiplicative variance model, have been preferred for scale normalisation, though the variance structure of microarray data may be of a more complex nature than can be accommodated by a parametric model. We propose a new nonparametric approach that incorporates location and scale normalisation simultaneously using a Generalised Additive Model for Location, Scale and Shape (GAMLSS, Rigby and Stasinopoulos, 2005, Applied Statistics, 54, 507-554). We compare its performance in inferring differential expression with Huber et al.'s (2002, Bioinformatics, 18, 96-104) arsinh variance stabilising transformation (AVST) using real and simulated data. We show GAMLSS to be as powerful as AVST when the parametric model is correct, and more powerful when the model is wrong. (c) 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
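
    For context, the arsinh variance-stabilising transformation referred to above has the generalised-log form sketched below; in practice its offset and scale parameters are estimated from the data, whereas here they are placeholders.

        import numpy as np

        def avst(intensity, a=0.0, b=1.0):
            """Arsinh (generalised log) variance-stabilising transformation.
            `a` and `b` are per-array offset/scale parameters, placeholders here."""
            return np.arcsinh((intensity - a) / b)

        raw = np.array([10.0, 100.0, 1000.0, 10000.0])
        print(avst(raw, a=5.0, b=50.0))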

  12. Evaluation of parametric and nonparametric models to predict water flow; Avaliacao entre modelos parametricos e nao parametricos para previsao de vazoes afluentes

    Energy Technology Data Exchange (ETDEWEB)

    Marques, T.C.; Cruz Junior, G.; Vinhal, C. [Universidade Federal de Goias (UFG), Goiania, GO (Brazil). Escola de Engenharia Eletrica e de Computacao], Emails: thyago@eeec.ufg.br, gcruz@eeec.ufg.br, vinhal@eeec.ufg.br

    2009-07-01

    The goal of this paper is to present a methodology for seasonal stream flow forecasting using a database of average monthly inflows of Brazilian hydroelectric plants located on the Grande, Tocantins, Paranaiba, Sao Francisco and Iguacu rivers. The model is based on the Adaptive Network Based Fuzzy Inference System (ANFIS), a non-parametric model. Its performance was compared with that of a periodic autoregressive model, a parametric model. The results show that the forecasting errors of the non-parametric model are significantly lower than those of the parametric model. (author)

  13. Analysis of residual stresses on the transverse beam of a casting stand by means of drilling method

    Directory of Open Access Journals (Sweden)

    P. Frankovský

    2014-10-01

    Full Text Available The presented paper demonstrates the application of the drilling method in the analysis of residual stresses on the transverse beam of a casting stand. In the initial stage of the analysis, strains were determined for the individual drilling steps in the area identified by means of numerical analysis. The drilling was carried out in increments of 0.5 mm up to a depth of 5 mm, while the diameter of the drilled hole was 3.2 mm. The analysis used the drilling device RS-200, strain indicator P3 and SGD 1-RY21-3/120. The paper presents the distribution of residual stresses through the depth of the drilled hole, determined according to standard ASTM E837-01 by means of the integral method and the power series method.

  14. Quantification of Drive-Response Relationships Between Residues During Protein Folding.

    Science.gov (United States)

    Qi, Yifei; Im, Wonpil

    2013-08-13

    Mutual correlation and cooperativity are commonly used to describe residue-residue interactions in protein folding/function. However, these metrics do not provide any information on the causality relationships between residues. Such drive-response relationships are poorly studied in protein folding/function and difficult to measure experimentally due to technical limitations. In this study, using the information theory transfer entropy (TE) that provides a direct measurement of causality between two times series, we have quantified the drive-response relationships between residues in the folding/unfolding processes of four small proteins generated by molecular dynamics simulations. Instead of using a time-averaged single TE value, the time-dependent TE is measured with the Q-scores based on residue-residue contacts and with the statistical significance analysis along the folding/unfolding processes. The TE analysis is able to identify the driving and responding residues that are different from the highly correlated residues revealed by the mutual information analysis. In general, the driving residues have more regular secondary structures, are more buried, and show greater effects on the protein stability as well as folding and unfolding rates. In addition, the dominant driving and responding residues from the TE analysis on the whole trajectory agree with those on a single folding event, demonstrating that the drive-response relationships are preserved in the non-equilibrium process. Our study provides detailed insights into the protein folding process and has potential applications in protein engineering and interpretation of time-dependent residue-based experimental observables for protein function.
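
    A minimal sketch of a transfer entropy computation between two discretised one-dimensional time series (for example, binarised contact Q-scores), with history length one; the binning, lag handling and statistical significance analysis used in the paper are omitted, and the toy series are placeholders.

        import numpy as np

        def transfer_entropy(x, y, n_bins=2):
            """TE from x to y (in bits) for discretised 1-D series, history length 1."""
            x = np.digitize(x, np.histogram_bin_edges(x, n_bins)[1:-1])
            y = np.digitize(y, np.histogram_bin_edges(y, n_bins)[1:-1])
            xt, yt, y1 = x[:-1], y[:-1], y[1:]
            te = 0.0
            for a in np.unique(xt):
                for b in np.unique(yt):
                    for c in np.unique(y1):
                        p_abc = np.mean((xt == a) & (yt == b) & (y1 == c))
                        if p_abc == 0:
                            continue
                        p_c_given_ab = p_abc / np.mean((xt == a) & (yt == b))
                        p_c_given_b = np.mean((yt == b) & (y1 == c)) / np.mean(yt == b)
                        te += p_abc * np.log2(p_c_given_ab / p_c_given_b)
            return te

        rng = np.random.default_rng(0)
        driver = rng.random(5000)
        response = np.roll(driver, 1) + 0.1 * rng.random(5000)   # response lags the driver
        print(transfer_entropy(driver, response), transfer_entropy(response, driver))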

  15. X-ray residual stress analysis on machined and tempered HPSN-ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Immelmann, S.; Welle, E.; Reimers, W. [Hahn-Meitner-Institut Berlin GmbH (Germany)

    1997-11-15

    The residual stress state induced by grinding and tempering of hot pressed silicon nitride (HPSN) samples is studied by X-ray diffraction. The results reveal that the residual stress values at the surface of the samples as well as their gradient within the penetration depth of the X-rays depend on the sintering aid and thus, on the glassy phase content of the HPSN. Tempering of the ground HPSN reduces the residual stress values due to microplastic deformation, whereas an oxidation of the glassy phase leads to the formation of compressive residual stresses. (orig.) 35 refs.

  16. Evaluation of gas chromatography – electron ionization – full scan high resolution Orbitrap mass spectrometry for pesticide residue analysis

    International Nuclear Information System (INIS)

    Mol, Hans G.J.; Tienstra, Marc; Zomer, Paul

    2016-01-01

    Gas chromatography with electron ionization and full scan high resolution mass spectrometry with an Orbitrap mass analyzer (GC-EI-full scan Orbitrap HRMS) was evaluated for residue analysis. Pesticides in fruit and vegetables were taken as an example application. The relevant aspects for GC-MS based residue analysis, including the resolving power (15,000 to 120,000 FWHM at m/z 200), scan rate, dynamic range, selectivity, sensitivity, analyte identification, and utility of existing EI-libraries, are assessed and discussed in detail. The optimum acquisition conditions in full scan mode (m/z 50–500) were a resolving power of 60,000 and an automatic-gain-control target value of 3E6. These conditions provided (i) an optimum mass accuracy: within 2 ppm over a wide concentration range, with/without matrix, enabling the use of ±5 ppm mass extraction windows, (ii) adequate scan speed: minimum 12 scans/peak, (iii) an intra-scan dynamic range sufficient to achieve LOD/LOQs ≤0.5 pg in fruit/vegetable matrices (corresponding to ≤0.5 μg kg⁻¹) for most pesticides. EI-Orbitrap spectra were consistent over a very wide concentration range (5 orders) with good match values against NIST (EI-quadrupole) spectra. The applicability for quantitative residue analysis was verified by validation of 54 pesticides in three matrices (tomato, leek, orange) at 10 and 50 μg/kg. The method involved a QuEChERS-based extraction with a solvent switch into iso-octane, and 1 μL hot splitless injection into the GC-HRMS system. A recovery between 70 and 120% and a repeatability RSD <10% was obtained in most cases. Linearity was demonstrated for the range ≤5–250 μg kg⁻¹. The pesticides could be identified according to the applicable EU criteria for GC-HRMS (SANTE/11945/2015). GC-EI-full scan Orbitrap HRMS was found to be highly suited for quantitative pesticide residue analysis. The potential of qualitative screening to extend the scope makes it an attractive alternative to
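
    As a small numerical aside on the figures quoted above (not code from the paper), a ±5 ppm mass-extraction window and the approximate peak width implied by a given resolving power can be computed as follows.

        def ppm_window(mz, ppm=5.0):
            """Lower/upper bounds of a +/- ppm mass extraction window."""
            delta = mz * ppm * 1e-6
            return mz - delta, mz + delta

        def peak_fwhm(mz, resolving_power=60000):
            """Approximate peak width (FWHM) at a given resolving power."""
            return mz / resolving_power

        print(ppm_window(200.0))   # (199.999, 200.001)
        print(peak_fwhm(200.0))    # about 0.0033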

  17. Evaluation of gas chromatography – electron ionization – full scan high resolution Orbitrap mass spectrometry for pesticide residue analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mol, Hans G.J., E-mail: hans.mol@wur.nl; Tienstra, Marc; Zomer, Paul

    2016-09-07

    Gas chromatography with electron ionization and full scan high resolution mass spectrometry with an Orbitrap mass analyzer (GC-EI-full scan Orbitrap HRMS) was evaluated for residue analysis. Pesticides in fruit and vegetables were taken as an example application. The relevant aspects for GC-MS based residue analysis, including the resolving power (15,000 to 120,000 FWHM at m/z 200), scan rate, dynamic range, selectivity, sensitivity, analyte identification, and utility of existing EI-libraries, are assessed and discussed in detail. The optimum acquisition conditions in full scan mode (m/z 50–500) were a resolving power of 60,000 and an automatic-gain-control target value of 3E6. These conditions provided (i) an optimum mass accuracy: within 2 ppm over a wide concentration range, with/without matrix, enabling the use of ±5 ppm mass extraction windows, (ii) adequate scan speed: minimum 12 scans/peak, (iii) an intra-scan dynamic range sufficient to achieve LOD/LOQs ≤0.5 pg in fruit/vegetable matrices (corresponding to ≤0.5 μg kg⁻¹) for most pesticides. EI-Orbitrap spectra were consistent over a very wide concentration range (5 orders) with good match values against NIST (EI-quadrupole) spectra. The applicability for quantitative residue analysis was verified by validation of 54 pesticides in three matrices (tomato, leek, orange) at 10 and 50 μg/kg. The method involved a QuEChERS-based extraction with a solvent switch into iso-octane, and 1 μL hot splitless injection into the GC-HRMS system. A recovery between 70 and 120% and a repeatability RSD <10% was obtained in most cases. Linearity was demonstrated for the range ≤5–250 μg kg⁻¹. The pesticides could be identified according to the applicable EU criteria for GC-HRMS (SANTE/11945/2015). GC-EI-full scan Orbitrap HRMS was found to be highly suited for quantitative pesticide residue analysis. The potential of qualitative screening to extend the scope makes it an attractive

  18. Development of residual stress prediction model in pipe weldment

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Yun Yong; Lim, Se Young; Choi, Kang Hyeuk; Cho, Young Sam; Lim, Jae Hyuk [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2002-03-15

    When the Leak Before Break (LBB) concept is applied to high energy piping of nuclear power plants, residual weld stress is an important variable. The main purpose of this research is to develop a numerical model which can predict residual weld stresses. First, the basic theories needed for the numerical analysis of welded parts are described. Before the analysis of the pipe, welding of a flat plate was analyzed and compared. Applying the data of the pipes used, thermal and mechanical analyses were carried out and the temperature gradient and residual stress distribution were computed. For the thermal analysis, an appropriate heat flux was taken as the heat source, and convection and radiation heat transfer were considered at the surfaces. The residual stresses were computed from the calculated temperature gradient, and they were compared and verified with results from another study.

  19. A parametric interpretation of Bayesian Nonparametric Inference from Gene Genealogies: Linking ecological, population genetics and evolutionary processes.

    Science.gov (United States)

    Ponciano, José Miguel

    2017-11-22

    Using a nonparametric Bayesian approach, Palacios and Minin (2013) dramatically improved the accuracy and precision of Bayesian inference of population size trajectories from gene genealogies. These authors proposed an extension of a Gaussian Process (GP) nonparametric inferential method for the intensity function of non-homogeneous Poisson processes. They found not only that the statistical properties of the estimators were improved with their method, but also that key aspects of the demographic histories were recovered. The authors' work represents the first Bayesian nonparametric solution to this inferential problem because they specify a convenient prior belief without a particular functional form on the population trajectory. Their approach works so well and provides such a profound understanding of the biological process that the question arises as to how truly "biology-free" their approach really is. Using well-known concepts of stochastic population dynamics, here I demonstrate that, in fact, Palacios and Minin's GP model can be cast as a parametric population growth model with density dependence and environmental stochasticity. Making this link between population genetics and stochastic population dynamics modeling provides novel insights into eliciting biologically meaningful priors for the trajectory of the effective population size. The results presented here also bring novel understanding of GPs as models for the evolution of a trait. Thus, the ecological principles underlying Palacios and Minin's (2013) prior add to the conceptual and scientific value of these authors' inferential approach. I conclude this note by listing a series of insights brought about by this connection with Ecology. Copyright © 2017 The Author. Published by Elsevier Inc. All rights reserved.

  20. Pesticide residues in canned foods, fruits, and vegetables: the application of Supercritical Fluid Extraction and chromatographic techniques in the analysis.

    Science.gov (United States)

    El-Saeid, Mohamed H

    2003-12-11

    Multiple pesticide residues have been observed in some samples of canned foods, frozen vegetables, and fruit jam, which puts the health of consumers at risk of adverse effects. It is quite apparent that such a state of affairs calls for more accurate, cost-effective, and rapid analytical techniques capable of detecting minimum concentrations of multiple pesticide residues. The aims of this paper were, first, to determine the effectiveness of the use of Supercritical Fluid Extraction (SFE) and Supercritical Fluid Chromatography (SFC) techniques in the analysis of the levels of pesticide residues in canned foods, vegetables, and fruits; and second, to contribute to the promotion of consumer safety by excluding pesticide residue contamination from markets. Fifteen different types of imported canned and frozen fruit and vegetable samples obtained from the Houston local food markets were investigated. The major types of pesticides tested were pyrethroids, herbicides, fungicides, and carbamates. Using these techniques, 60.82% of the food samples showed no detectable pesticide residues in this investigation. On the other hand, 39.15% of the food samples were contaminated by four different pyrethroid residues, with concentrations (± RSD%) ranging from 0.03 ± 0.005 to 0.05 ± 0.03 ppm; most of the pyrethroid residues were detected in frozen vegetables and strawberry jam. Herbicide residues in the test samples ranged from 0.03 ± 0.005 to 0.8 ± 0.01 ppm. Five different fungicides, ranging from 0.05 ± 0.02 to 0.8 ± 0.1 ppm, were found in five different frozen vegetable samples. Carbamate residues were not detected in 60% of the investigated food samples. It was concluded that the SFE and SFC techniques were accurate, reliable, less time-consuming, and cost-effective in the analysis of imported canned foods, fruits, and vegetables, and are recommended for the monitoring of pesticide contamination.

  1. Longitudinal Analysis of Residual Feed Intake in Mink using Random Regression with Heterogeneous Residual Variance

    DEFF Research Database (Denmark)

    Shirali, Mahmoud; Nielsen, Vivi Hunnicke; Møller, Steen Henrik

    Heritability of residual feed intake (RFI) increased from low to high over the growing period in male and female mink. The lowest heritability for RFI (male: 0.04 ± 0.01 standard deviation (SD); female: 0.05 ± 0.01 SD) was found early in growth and the highest heritability (male: 0.33 ± 0.02; female: 0.34 ± 0.02 SD) was achieved at the late growth stages. The genetic correlation between different growth stages for RFI showed a high association (0.91 to 0.98) between early and late growing periods. However, phenotypic correlations were lower, from 0.29 to 0.50. The residual variances were substantially higher...

  2. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

    Full Text Available In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts, instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was considerably faster for small to moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditure in the generation of weight matrices via multinomial sampling.
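
    A condensed numpy sketch of the multinomial-weights formulation described above, applied to bootstrap replications of Pearson's correlation; it follows the same idea but is not the authors' R implementation, and the toy data are placeholders.

        import numpy as np

        def bootstrap_correlation(x, y, n_boot=2000, rng=np.random.default_rng(0)):
            """Bootstrap replications of Pearson's r via multinomial weights
            (no explicit resampling of the data)."""
            n = len(x)
            # each row of W is one bootstrap replication's weights, summing to 1
            W = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
            mx, my = W @ x, W @ y                   # weighted first moments
            sxy = W @ (x * y) - mx * my             # weighted mixed moment
            sxx = W @ (x * x) - mx ** 2
            syy = W @ (y * y) - my ** 2
            return sxy / np.sqrt(sxx * syy)

        rng = np.random.default_rng(1)
        x = rng.normal(size=50)
        y = 0.6 * x + rng.normal(size=50)
        reps = bootstrap_correlation(x, y)
        print(np.percentile(reps, [2.5, 97.5]))     # percentile confidence interval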

  3. Estimation of the lifetime distribution of mechatronic systems in the presence of a covariate: A comparison among parametric, semiparametric and nonparametric models

    International Nuclear Information System (INIS)

    Bobrowski, Sebastian; Chen, Hong; Döring, Maik; Jensen, Uwe; Schinköthe, Wolfgang

    2015-01-01

    In practice manufacturers may have lots of failure data of similar products using the same technology basis under different operating conditions. Thus, one can try to derive predictions for the distribution of the lifetime of newly developed components or new application environments through the existing data using regression models based on covariates. Three categories of such regression models are considered: a parametric, a semiparametric and a nonparametric approach. First, we assume that the lifetime is Weibull distributed, where its parameters are modelled as linear functions of the covariate. Second, the Cox proportional hazards model, well-known in Survival Analysis, is applied. Finally, a kernel estimator is used to interpolate between empirical distribution functions. In particular the last case is new in the context of reliability analysis. We propose a goodness of fit measure (GoF), which can be applied to all three types of regression models. Using this GoF measure we discuss a new model selection procedure. To illustrate this method of reliability prediction, the three classes of regression models are applied to real test data of motor experiments. Further the performance of the approaches is investigated by Monte Carlo simulations. - Highlights: • We estimate the lifetime distribution in the presence of a covariate. • Three types of regression models are considered and compared. • A new nonparametric estimator based on our particular data structure is introduced. • We propose a goodness of fit measure and show a new model selection procedure. • A case study with real data and Monte Carlo simulations are performed

  4. Void analysis of target residues at SPS energy -evidence of correlation with fractal behaviour

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Das, Rupa. E-mail: dipakghosh_in@yahoo.com

    2007-01-01

    This paper presents an analysis of the target residues in 32S-AgBr and 16O-AgBr interactions at 200 AGeV and 60 AGeV, respectively, in terms of fractal moments by the Takagi method and void probability scaling. The study reveals an interesting feature of the production process. In 16O-AgBr interactions, multifractal behaviour is present in both hemispheres and the void probability does not show scaling behaviour, but at high energy the situation changes. In 32S-AgBr interactions, monofractal behaviour is indicated by the data in both hemispheres and the void probability also shows good scaling behaviour. This suggests a possible correlation of void probability with the fractal behaviour of target residues. (author)

  5. Method for the Collection, Gravimetric and Chemical Analysis of Nonvolatile Residue (NVR) on Surfaces

    Science.gov (United States)

    Gordon, Keith; Rutherford, Gugu; Aranda, Denisse

    2017-01-01

    Nonvolatile residue (NVR), sometimes referred to as molecular contamination, is the term used for the total composition of the inorganic and high boiling point organic components in particulates and molecular films deposited on critical surfaces surrounding space structures, with the particulate and NVR contamination originating primarily from pre-launch operations. The term "nonvolatile" in NVR implies that the collected residue will not experience much loss under ambient conditions. NVR has been shown to have a dramatic impact on the ability to perform optical measurements from platforms based in space. Such contaminants can be detected early by the controlled application of various detection techniques and contamination analyses. Contamination analyses are the techniques used to determine if materials, components, and subsystems can be expected to meet the performance requirements of a system. Of particular concern is the quantity of NVR contaminants that might be deposited on critical payload surfaces from these sources. Subsequent chemical analysis of the contaminant samples by infrared spectroscopy and gas chromatography mass spectrometry identifies the components, gives semi-quantitative estimates of contaminant thickness, indicates possible sources of the NVR, and provides guidance for effective cleanup procedures. In this report, a method for the collection and determination of the mass of NVR was generated by the authors at NASA Langley Research Center. This report describes the method developed and implemented for collecting NVR contaminants, and procedures for gravimetric and chemical analysis of the residue obtained. The result of this NVR analysis collaboration will help pave the way for Langley's ability to certify flight hardware outgassing requirements in support of flight projects such as Stratospheric Aerosol and Gas Experiment III (SAGE III), Clouds and the Earth's Radiant Energy System (CERES), Materials International

  6. Least-Squares Linear Regression and Schrodinger's Cat: Perspectives on the Analysis of Regression Residuals.

    Science.gov (United States)

    Hecht, Jeffrey B.

    The analysis of regression residuals and detection of outliers are discussed, with emphasis on determining how deviant an individual data point must be to be considered an outlier and the impact that multiple suspected outlier data points have on the process of outlier determination and treatment. Only bivariate (one dependent and one independent)…

  7. Nonparametric Methods in Astronomy: Think, Regress, Observe—Pick Any Three

    Science.gov (United States)

    Steinhardt, Charles L.; Jermyn, Adam S.

    2018-02-01

    Telescopes are much more expensive than astronomers, so it is essential to minimize required sample sizes by using the most data-efficient statistical methods possible. However, the most commonly used model-independent techniques for finding the relationship between two variables in astronomy are flawed. In the worst case they can lead without warning to subtly yet catastrophically wrong results, and even in the best case they require more data than necessary. Unfortunately, there is no single best technique for nonparametric regression. Instead, we provide a guide for how astronomers can choose the best method for their specific problem and provide a python library with both wrappers for the most useful existing algorithms and implementations of two new algorithms developed here.

  8. Evaluation of model-based versus non-parametric monaural noise-reduction approaches for hearing aids.

    Science.gov (United States)

    Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker

    2012-08-01

    Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the usage of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms perform better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. Data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.
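
    Since the comparison deliberately uses a similar Wiener-based gain rule in all algorithms, that rule is worth writing down. The snippet below is a generic textbook-style sketch with hypothetical variable names and a simple power-ratio SNR estimate; it is not the specific gain computation used in the study.

      import numpy as np

      def wiener_gain(noisy_power, noise_power, gain_floor=0.1):
          """Wiener-type spectral gain G = SNR / (1 + SNR), applied per frequency bin.
          The a-priori SNR is approximated from noisy- and noise-power estimates; a gain
          floor limits over-suppression and musical noise."""
          snr = np.maximum(noisy_power / np.maximum(noise_power, 1e-12) - 1.0, 0.0)
          return np.maximum(snr / (1.0 + snr), gain_floor)

      # Usage: attenuate one frame of a noisy power spectrum bin by bin.
      noisy = np.array([4.0, 1.2, 0.9, 10.0])
      noise = np.array([1.0, 1.0, 1.0, 1.0])
      print(wiener_gain(noisy, noise))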

  9. Bayesian Nonparametric Measurement of Factor Betas and Clustering with Application to Hedge Fund Returns

    Directory of Open Access Journals (Sweden)

    Urbi Garay

    2016-03-01

    We define a dynamic and self-adjusting mixture of Gaussian Graphical Models to cluster financial returns, and provide a new method for extraction of nonparametric estimates of dynamic alphas (excess return) and betas (to a choice set of explanatory factors) in a multivariate setting. This approach, as well as its outputs, has a dynamic, nonstationary and nonparametric form, which circumvents the problem of model risk and parametric assumptions that the Kalman filter and other widely used approaches rely on. The by-product of clusters, used for shrinkage and information borrowing, can be of use in determining relationships around specific events. This approach exhibits a smaller Root Mean Squared Error than traditionally used benchmarks in financial settings, which we illustrate through simulation. As an illustration, we use hedge fund index data, and find that our estimated alphas are, on average, 0.13% per month higher (1.6% per year) than alphas estimated through Ordinary Least Squares. The approach exhibits fast adaptation to abrupt changes in the parameters, as seen in our estimated alphas and betas, which exhibit high volatility, especially in periods which can be identified as times of stressful market events, a reflection of the dynamic positioning of hedge fund portfolio managers.

  10. Analysis of residual stress relief mechanisms in post-weld heat treatment

    International Nuclear Information System (INIS)

    Dong, Pingsha; Song, Shaopin; Zhang, Jinmiao

    2014-01-01

    This paper presents a recent study on weld residual stress relief mechanisms associated with furnace-based uniform post-weld heat treatment (PWHT). Both finite element and analytical methods are used to quantitatively examine how plastic deformation and creep relaxation contribute to the residual stress relief process at different stages of the PWHT process. The key contribution of this work to an improved understanding of furnace-based uniform PWHT can be summarized as follows: (1) Plastic-deformation-induced stress relief during PWHT can be analytically expressed by the change in material elastic deformation capacity (or elastic deformation limit) measured in terms of the material yield strength to Young's modulus ratio, which has a rather limited role in overall residual stress relief during furnace-based uniform PWHT. (2) The most dominant stress relief mechanism is creep-strain-induced stress relaxation, as expected. However, a rapid creep strain development accompanied by a rapid residual stress reduction during the heating stage, before reaching the PWHT temperature, is shown to contribute to most of the stress relief seen in the overall PWHT process, suggesting that the PWHT hold time can be significantly reduced as far as residual stress relief is concerned. (3) A simple engineering scheme for estimating residual stress reduction is proposed based on this study by relating material type, PWHT temperature, and component wall thickness. - Highlights: • The paper clarified effects of plastic deformation and creep relaxation on weld residual stress relief during uniform PWHT. • Creep strain development is far more important than plastic strain, mostly completed even before hold time starts. • Plastic strain development is insignificant and can be analytically described by a material elastic deformation capacity parameter. • An engineering estimation scheme is proposed for determining residual stress reduction resulting from furnace-based PWHT

  11. Bayesian nonparametric adaptive control using Gaussian processes.

    Science.gov (United States)

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is the radial basis function network (RBFN), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become ineffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.

  12. A Unified Discussion on the Concept of Score Functions Used in the Context of Nonparametric Linkage Analysis

    Directory of Open Access Journals (Sweden)

    Lars Ängquist

    2008-01-01

    In this article we try to discuss nonparametric linkage (NPL) score functions within a broad and quite general framework. The main focus of the paper is the structure, derivation principles and interpretations of the score function entity itself. We define and discuss several families of one-locus score function definitions, i.e. the implicit, explicit and optimal ones. Some generalizations and comments on the two-locus, unconditional and conditional, cases are included as well. Although this article mainly aims at serving as an overview, where the concept of score functions is put into a covering context, we generalize the noncentrality parameter (NCP)-optimal score functions in Ängquist et al. (2007) to facilitate, through weighting, the incorporation of several plausible distinct genetic models. Since the genetic model itself is most often to some extent unknown, this facilitates weaker prior assumptions with respect to plausible true disease models without losing the property of NCP-optimality. Moreover, we discuss general assumptions and properties of score functions in the above sense. For instance, the concept of identical-by-descent (IBD) sharing structures and score function equivalence are discussed in some detail.

  13. Parametric and Nonparametric EEG Analysis for the Evaluation of EEG Activity in Young Children with Controlled Epilepsy

    Directory of Open Access Journals (Sweden)

    Vangelis Sakkalis

    2008-01-01

    There is important evidence of differences in the EEG frequency spectrum of control subjects as compared to epileptic subjects. In particular, the study of children presents difficulties due to the early stages of brain development and the various forms of epilepsy indications. In this study, we consider children that developed epileptic crises in the past but without any other clinical, psychological, or visible neurophysiological findings. The aim of the paper is to develop reliable techniques for testing whether such controlled epilepsy induces related spectral differences in the EEG. Spectral features extracted by using nonparametric signal representation techniques (Fourier and wavelet transform) and a parametric signal modeling technique (ARMA) are compared, and their effect on the classification of the two groups is analyzed. The subjects performed two different tasks: a control (rest) task and a relatively difficult math task. The results show that spectral features extracted by modeling the EEG signals recorded from individual channels by an ARMA model give a higher discrimination between the two subject groups for the control task, where classification scores of up to 100% were obtained with a linear discriminant classifier.

  14. Analysis of core-periphery organization in protein contact networks reveals groups of structurally and functionally critical residues.

    Science.gov (United States)

    Isaac, Arnold Emerson; Sinha, Sitabhra

    2015-10-01

    The representation of proteins as networks of interacting amino acids, referred to as protein contact networks (PCN), and their subsequent analyses using graph theoretic tools, can provide novel insights into the key functional roles of specific groups of residues. We have characterized the networks corresponding to the native states of 66 proteins (belonging to different families) in terms of their core-periphery organization. The resulting hierarchical classification of the amino acid constituents of a protein arranges the residues into successive layers - having higher core order - with increasing connection density, ranging from a sparsely linked periphery to a densely intra-connected core (distinct from the earlier concept of protein core defined in terms of the three-dimensional geometry of the native state, which has least solvent accessibility). Our results show that residues in the inner cores are more conserved than those at the periphery. Underlining the functional importance of the network core, we see that the receptor sites for known ligand molecules of most proteins occur in the innermost core. Furthermore, the association of residues with structural pockets and cavities in binding or active sites increases with the core order. From mutation sensitivity analysis, we show that the probability of deleterious or intolerant mutations also increases with the core order. We also show that stabilization centre residues are in the innermost cores, suggesting that the network core is critically important in maintaining the structural stability of the protein. A publicly available Web resource for performing core-periphery analysis of any protein whose native state is known has been made available by us at http://www.imsc.res.in/~sitabhra/proteinKcore/index.html.
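
    The layered core-periphery decomposition described here is closely related to the classical k-core decomposition of a graph. Purely as an illustration of that idea (toy contacts, hypothetical distance cutoff; not the authors' web tool), networkx can assign each residue a core order in a few lines.

      import networkx as nx

      def residue_core_order(contact_pairs):
          """Return a dict mapping each residue to its k-core order in the contact network.
          contact_pairs: iterable of (residue_i, residue_j) contacts, e.g. residue pairs whose
          C-alpha atoms lie within some distance cutoff."""
          G = nx.Graph(contact_pairs)
          return nx.core_number(G)  # higher core order = deeper in the densely connected core

      # Toy example: a 5-residue clique (the "core") with three peripheral attachments.
      core_edges = [(i, j) for i in range(5) for j in range(i + 1, 5)]
      periphery_edges = [(0, 10), (1, 11), (2, 12)]
      print(residue_core_order(core_edges + periphery_edges))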

  15. Nonparametric estimation for censored mixture data with application to the Cooperative Huntington’s Observational Research Trial

    Science.gov (United States)

    Wang, Yuanjia; Garcia, Tanya P.; Ma, Yanyuan

    2012-01-01

    This work presents methods for estimating genotype-specific distributions from genetic epidemiology studies where the event times are subject to right censoring, the genotypes are not directly observed, and the data arise from a mixture of scientifically meaningful subpopulations. Examples of such studies include kin-cohort studies and quantitative trait locus (QTL) studies. Current methods for analyzing censored mixture data include two types of nonparametric maximum likelihood estimators (NPMLEs) which do not make parametric assumptions on the genotype-specific density functions. Although both NPMLEs are commonly used, we show that one is inefficient and the other inconsistent. To overcome these deficiencies, we propose three classes of consistent nonparametric estimators which do not assume parametric density models and are easy to implement. They are based on inverse probability weighting (IPW), augmented IPW (AIPW), and nonparametric imputation (IMP). The AIPW achieves the efficiency bound without additional modeling assumptions. Extensive simulation experiments demonstrate satisfactory performance of these estimators even when the data are heavily censored. We apply these estimators to the Cooperative Huntington’s Observational Research Trial (COHORT), and provide age-specific estimates of the effect of mutation in the Huntington gene on mortality using a sample of family members. The close approximation of the estimated non-carrier survival rates to those of the U.S. population indicates small ascertainment bias in the COHORT family sample. Our analyses underscore an elevated risk of death in Huntington gene mutation carriers compared to non-carriers for a wide age range, and suggest that the mutation equally affects survival rates in both genders. The estimated survival rates are useful in genetic counseling for providing guidelines on interpreting the risk of death associated with a positive genetic test, and in facilitating future subjects at risk

  16. [Nonparametric method of estimating survival functions containing right-censored and interval-censored data].

    Science.gov (United States)

    Xu, Yonghong; Gao, Xiaohuan; Wang, Zhengxi

    2014-04-01

    Missing data represent a general problem in many scientific fields, especially in medical survival analysis. When dealing with censored data, interpolation is one of the important approaches. However, most interpolation methods replace the censored data with exact data, which distorts the real distribution of the censored data and reduces the probability of the real data falling into the interpolation data. In order to solve this problem, we propose in this paper a nonparametric method of estimating the survival function from right-censored and interval-censored data and compare its performance to the SC (self-consistent) algorithm. Compared to the average interpolation and the nearest neighbor interpolation methods, the proposed method replaces the right-censored data with interval-censored data, which greatly improves the probability of the real data falling into the imputation interval. It then uses empirical distribution theory to estimate the survival function from right-censored and interval-censored data. The results of numerical examples and a real breast cancer data set demonstrated that the proposed method had higher accuracy and better robustness for different proportions of censored data. This paper provides a good method for comparing the performance of clinical treatments from estimates of the patients' survival data, and offers some help to medical survival data analysis.
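
    The SC (self-consistent) algorithm used as the comparator can be sketched compactly. The snippet below is a bare-bones Turnbull-style implementation for (L, R] censoring intervals on hypothetical data; it is the baseline method, not the article's proposed estimator.

      import numpy as np

      def turnbull_sc(intervals, n_iter=500, tol=1e-8):
          """Self-consistent (Turnbull) survival estimate for interval-censored data.
          intervals: list of (L, R] censoring intervals; use R = np.inf for right censoring."""
          L = np.array([a for a, _ in intervals], dtype=float)
          R = np.array([b for _, b in intervals], dtype=float)
          s = np.unique(np.concatenate([L, R[np.isfinite(R)]]))          # candidate support points
          A = ((s[None, :] > L[:, None]) & (s[None, :] <= R[:, None])).astype(float)
          p = np.full(len(s), 1.0 / len(s))
          for _ in range(n_iter):
              denom = A @ p                                              # probability of each observed interval
              p_new = (A * p).T @ (1.0 / denom) / len(L)                 # self-consistency (EM) update
              if np.max(np.abs(p_new - p)) < tol:
                  p = p_new
                  break
              p = p_new
          return s, 1.0 - np.cumsum(p)                                   # support points and survival values

      # Hypothetical mix of interval-censored and right-censored observations.
      obs = [(2, 5), (1, 3), (4, np.inf), (0, 2), (3, 6), (5, np.inf)]
      print(turnbull_sc(obs))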

  17. LC-MS/MS analysis of neonicotinoid insecticides: Residue findings in Chilean honeys

    Directory of Open Access Journals (Sweden)

    Raquel Bridi

    Neonicotinoids are a relatively new generation of insecticides that have been used for control of pests such as aphids, leafhoppers and whiteflies. This paper presents for the first time a determination of residues of four neonicotinoid insecticides (acetamiprid, thiamethoxam, thiacloprid and imidacloprid) in Chilean honey using QuEChERS extraction and UHPLC-MS/MS analysis. The limits of detection and quantification found for all analytes ranged from 0.34 to 1.43 μg kg-1 and from 0.30 to 4.76 μg kg-1, respectively. The extraction using the QuEChERS method provided recoveries over 79%, and the precision showed coefficients of variation lower than 20%. These data are in agreement with the international criteria that recommend general recovery limits of 70-120%. Of the 16 samples analyzed, neonicotinoid pesticides were detected in three honey samples. These three samples were collected from the same geographical area (Rengo). Fruit and grain production characterize the province of Rengo. The analysis of the botanical origin of these honeys showed the absence of pollen grains of crops and the majority presence of pollen grains of weeds such as Medicago sativa, Galega officinalis and Brassica rapa, which could be associated with crops. Although the residue levels found were low, the results also confirm the actual occurrence of a transfer of neonicotinoid insecticides from exposed honeybees into honey.

  18. DETERMINATION OF RESIDUAL VALUE WITHIN THE COST BENEFIT ANALYSIS FOR THE PROJECTS FINANCED BY THE EUROPEAN UNION

    Directory of Open Access Journals (Sweden)

    Droj Laurentiu

    2011-12-01

    This paper will later be used within the doctoral thesis The Mechanism of Financing Investment Projects by Usage of European Structural Funds, currently under development at the University Babes Bolyai Cluj Napoca, Faculty of Economics and Business Management, under the coordination of prof. univ. dr. Ioan Trenca. A debate has recently been growing among the academic community, the business community, private lending institutions (banks, investment funds, etc.) and officials of the Romanian Government and of the European Union regarding the proposed method for calculating the residual value in European-financed investment projects. Several methods of calculating the residual value were considered and contested by different parties in preparing and submitting financial analysis studies for investment projects proposed for financing within the European Regional Development Fund (ERDF). In this context, the present paper addresses the three main methods of calculating the residual value and then studies their impact on the indicators obtained in the financial analysis, especially the Internal Rate of Return, for an investment project proposed by a Romanian medium-sized company. In order to establish the proper method for selecting and calculating the residual value, previously published studies and official documentation were analyzed. The main methods for calculating the residual value were identified as the following: A. the residual market value of fixed assets, as if they were to be sold; B. the accounting economic depreciation formula; and C. the net present value of the cash flows. Based on these methods the research model was elaborated, and a case study was created using the financial data of the proposed infrastructure investment. According to the study, a pattern was established for the proper determination of residual value

  19. Performance of non-parametric algorithms for spatial mapping of tropical forest structure

    Directory of Open Access Journals (Sweden)

    Liang Xu

    2016-08-01

    Background: Mapping tropical forest structure is a critical requirement for accurate estimation of emissions and removals from land use activities. With the availability of a wide range of remote sensing imagery of vegetation characteristics from space, development of finer resolution and more accurate maps has advanced in recent years. However, the mapping accuracy relies heavily on the quality of input layers, the algorithm chosen, and the size and quality of inventory samples for calibration and validation. Results: By using airborne lidar data as the “truth” and focusing on the mean canopy height (MCH) as a key structural parameter, we test two commonly used non-parametric techniques, maximum entropy (ME) and random forest (RF), for developing maps over a study site in Central Gabon. Results of mapping show that both approaches have improved accuracy with more input layers in mapping canopy height at 100 m (1-ha) pixels. The bias-corrected spatial models further improve estimates for small and large trees across the tails of height distributions, with a trade-off in increasing overall mean squared error that can be readily compensated by increasing the sample size. Conclusions: A significant improvement in tropical forest mapping can be achieved by weighting the number of inventory samples against the choice of image layers and the non-parametric algorithms. Without future satellite observations with better sensitivity to forest biomass, the maps based on existing data will remain slightly biased towards the mean of the distribution and will under- and over-estimate the upper and lower tails of the distribution.
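
    For the random forest branch of such a comparison, a minimal supervised-mapping sketch looks as follows. The data here are synthetic stand-ins; a real pipeline would use co-registered satellite layers as predictors, lidar-derived MCH at 1-ha pixels as the response, and the bias correction described above.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import cross_val_score

      # Hypothetical design matrix: one row per 1-ha pixel, columns = remote-sensing layers
      # (e.g. radar backscatter, optical bands, texture metrics); y = lidar MCH used as "truth".
      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 6))
      y = 10 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=2, size=500)

      rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
      # Five-fold cross-validated RMSE (negative by sklearn convention).
      print(cross_val_score(rf, X, y, cv=5, scoring="neg_root_mean_squared_error").mean())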

  20. Doubly Nonparametric Sparse Nonnegative Matrix Factorization Based on Dependent Indian Buffet Processes.

    Science.gov (United States)

    Xuan, Junyu; Lu, Jie; Zhang, Guangquan; Xu, Richard Yi Da; Luo, Xiangfeng

    2018-05-01

    Sparse nonnegative matrix factorization (SNMF) aims to factorize a data matrix into two optimized nonnegative sparse factor matrices, which could benefit many tasks, such as document-word co-clustering. However, the traditional SNMF typically assumes the number of latent factors (i.e., dimensionality of the factor matrices) to be fixed. This assumption makes it inflexible in practice. In this paper, we propose a doubly sparse nonparametric NMF framework to mitigate this issue by using dependent Indian buffet processes (dIBP). We apply a correlation function for the generation of two stick weights associated with each column pair of the factor matrices while still maintaining their respective marginal distributions specified by the IBP. As a consequence, the generation of the two factor matrices will be columnwise correlated. Under this framework, two classes of correlation function are proposed: 1) using a bivariate Beta distribution and 2) using a Copula function. Compared with the single IBP-based NMF, the proposed framework jointly makes the two factor matrices nonparametric and sparse, which could be applied to broader scenarios, such as co-clustering. It is seen to be much more flexible than Gaussian process-based and hierarchical Beta process-based dIBPs in terms of allowing the two corresponding binary matrix columns to have greater variations in their nonzero entries. Our experiments on synthetic data show the merits of the proposed framework compared with state-of-the-art models with respect to factorization efficiency, sparsity, and flexibility. Experiments on real-world data sets demonstrate its efficiency in document-word co-clustering tasks.

  1. Nonparametric modeling of US interest rate term structure dynamics and implications on the prices of derivative securities

    NARCIS (Netherlands)

    Jiang, GJ

    1998-01-01

    This paper develops a nonparametric model of interest rate term structure dynamics based on a spot rate process that permits only positive interest rates and a market price of interest rate risk that precludes arbitrage opportunities. Both the spot rate process and the market price of interest rate

  2. Isolation of residuals using trend surface analysis to magnetic data ...

    African Journals Online (AJOL)

    Polynomial surfaces of various degrees are fitted to magnetic data from the Awo area, southwestern Nigeria, with the aim of isolating the residuals of the area associated with mineralogy. The fourth-degree surface correlates better with the magnetic map of the study area. The residualized data were obtained by subtracting the ...

  3. A non-parametric peak calling algorithm for DamID-Seq.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality check and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.

  4. A non-parametric peak calling algorithm for DamID-Seq.

    Science.gov (United States)

    Li, Renhua; Hempel, Leonie U; Jiang, Tingbo

    2015-01-01

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality check and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.
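
    The four listed steps can be mocked up on binned read counts. The sketch below is a loose illustration of that workflow on synthetic counts, with hypothetical scaling and threshold choices; it is not the authors' NPPC code and omits read mapping, quality checks and the IDR analysis.

      import numpy as np

      def call_peaks(treat, ctrl, n_boot=1000, bg_quantile=0.999, seed=0):
          """Sketch of the non-parametric peak-calling steps on binned read counts.
          treat, ctrl: arrays of read counts per genomic bin (Dam-fusion vs Dam-only)."""
          rng = np.random.default_rng(seed)
          # 1) resample control bins to characterise the background distribution
          boot = rng.choice(ctrl, size=(n_boot, len(ctrl)), replace=True)
          # 2) scale libraries to comparable depth and compute signal-to-noise fold changes
          scale = treat.sum() / ctrl.sum()
          fold = np.log2((treat + 1) / (scale * ctrl + 1))
          # 3)-4) filter/call bins whose fold change exceeds a background quantile
          null_fold = np.log2((boot + 1) / (boot.mean(axis=0, keepdims=True) + 1))
          threshold = np.quantile(null_fold, bg_quantile)
          return np.flatnonzero(fold > threshold)

      # Hypothetical toy counts: a flat background with one enriched bin.
      ctrl = np.random.default_rng(1).poisson(20, size=1000)
      treat = np.random.default_rng(2).poisson(20, size=1000)
      treat[500] = 120
      print(call_peaks(treat, ctrl))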

  5. Theoretical conformational analysis of the bovine adrenal medulla 12 residue peptide molecule

    Science.gov (United States)

    Akhmedov, N. A.; Tagiyev, Z. H.; Hasanov, E. M.; Akverdieva, G. A.

    2003-02-01

    The spatial structure and conformational properties of the bovine adrenal medulla 12-residue peptide Tyr1-Gly2-Gly3-Phe4-Met5-Arg6-Arg7-Val8-Gly9-Arg10-Pro11-Glu12 (BAM-12P) molecule were studied by theoretical conformational analysis. It is revealed that this molecule can exist in several stable states. The energy and geometrical parameters for the low-energy conformations are obtained. The conformationally rigid and labile segments of the molecule were also identified.

  6. Comparative Analysis of Processes for Recovery of Rare Earths from Bauxite Residue

    Science.gov (United States)

    Borra, Chenna Rao; Blanpain, Bart; Pontikes, Yiannis; Binnemans, Koen; Van Gerven, Tom

    2016-11-01

    Environmental concerns and lack of space suggest that the management of bauxite residue needs to be re-addressed. The utilization of the residue has thus become a topic high on the agenda for both academia and industry, yet, to date, the residue is only rarely used. Nonetheless, recovery of rare earth elements (REEs), with or without other metals, from bauxite residue, and utilization of the left-over residue in other applications like building materials, may be a viable alternative to storage. Hence, different processes developed by the authors for recovery of REEs and other metals from bauxite residue were compared. In this study, preliminary energy and cost analyses were carried out to assess the feasibility of the processes. These analyses show that the combination of alkali roasting-smelting-quenching-leaching is a promising process for the treatment of bauxite residue and that it is justified to study this process at a pilot scale.

  7. Mid-infrared spectroscopy and multivariate analysis for determination of tetracycline residues in cow's milk

    Directory of Open Access Journals (Sweden)

    Lizeth Mariel Casarrubias-Torres

    2018-01-01

    Mid-infrared spectroscopy and chemometric analysis were tested to determine tetracycline residues in cow's milk. Cow's milk samples (n = 30) were spiked with tetracycline, chlortetracycline, and oxytetracycline in the range of 10-400 µg/l. Chemometric models to quantify each of the tetracycline residues were developed by applying Principal Component Regression and Partial Least Squares algorithms. The Soft Independent Modeling of Class Analogy model was used to differentiate between pure milk and milk samples with tetracycline residues. The best models for predicting the levels of these antibiotics were obtained using the Partial Least Squares 1 algorithm (coefficients of determination between 0.997 and 0.999 and standard errors of calibration from 1.81 to 2.95). The Soft Independent Modeling of Class Analogy model showed well-separated groups, allowing classification of milk samples and milk samples with antibiotics. The obtained results demonstrate the great analytical potential of chemometrics coupled with mid-infrared spectroscopy for the prediction of antibiotics in cow's milk at concentrations of micrograms per litre (µg/l). This technique can be used to verify the safety of milk rapidly and reliably.
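
    A PLS calibration of the kind described can be set up in a few lines. The sketch below uses synthetic spectra and concentrations purely as placeholders; the band position, number of latent variables and sample size are assumptions, not values from the study.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      # Hypothetical data: rows = milk samples, columns = mid-IR absorbances;
      # conc = spiked tetracycline concentration in micrograms per litre.
      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(30, 200))
      conc = rng.uniform(10, 400, size=30)
      spectra[:, 50] += 0.002 * conc        # inject a concentration-dependent band

      pls = PLSRegression(n_components=5)
      pred = cross_val_predict(pls, spectra, conc, cv=5).ravel()
      rmse = np.sqrt(np.mean((pred - conc) ** 2))
      print(f"cross-validated RMSE: {rmse:.1f} ug/L")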

  8. Analysis of martensitic transformation and residual stress in a 304L stainless steel; Analise da transformacao martensitica e tensao residual em um aco inoxidavel 304L

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Juciane Maria

    2014-07-01

    The relationship between plastic deformation and strain-induced phase transformation, which provides a practical route to the development of new engineering materials with excellent mechanical properties, characterizes the TRIP ('Transformation Induced Plasticity') effect. Among the stainless steels, the metastable 304 L austenitic steel is susceptible to austenite-to-martensite phase transformation during tensile tests at room temperature under increments of plastic deformation. Knowledge of the evolution of the phase transformation and of the residual stress arising from different levels and rates of plastic deformation imposed on the material is of great technological and scientific interest. It is also important to evaluate the interference of metallographic preparation in quantitative analyses of this steel. The main techniques used in this study were X-ray diffraction (XRD) and ferritoscopy for phase quantification, with XRD also used for residual stress analysis. As observed, the phase transformation quantification was not significantly influenced by the metallographic preparation and evolved with increments of plastic deformation imposed through different stop loads and strain rates, leading to further strengthening of the austenite matrix. The evaluation of the residual stress resulting from the martensitic transformation was sensitive to the metallographic preparation, giving higher values than in the sample without metallographic preparation. It was also observed that the residual stress decreased as the fraction of transformed martensite increased. (author)

  9. Quadratic residues and non-residues selected topics

    CERN Document Server

    Wright, Steve

    2016-01-01

    This book offers an account of the classical theory of quadratic residues and non-residues with the goal of using that theory as a lens through which to view the development of some of the fundamental methods employed in modern elementary, algebraic, and analytic number theory. The first three chapters present some basic facts and the history of quadratic residues and non-residues and discuss various proofs of the Law of Quadratic Reciprocity in depth, with an emphasis on the six proofs that Gauss published. The remaining seven chapters explore some interesting applications of the Law of Quadratic Reciprocity, prove some results concerning the distribution and arithmetic structure of quadratic residues and non-residues, provide a detailed proof of Dirichlet’s Class-Number Formula, and discuss the question of whether quadratic residues are randomly distributed. The text is a valuable resource for graduate and advanced undergraduate students as well as for mathematicians interested in number theory.

  10. Pesticide Residues in Canned Foods, Fruits, and Vegetables: The Application of Supercritical Fluid Extraction and Chromatographic Techniques in the Analysis

    Directory of Open Access Journals (Sweden)

    Mohamed H. EL-Saeid

    2003-01-01

    Multiple pesticide residues have been observed in some samples of canned foods, frozen vegetables, and fruit jam, which put the health of the consumers at risk of adverse effects. It is quite apparent that such a state of affairs calls for the need of more accurate, cost-effective, and rapid analytical techniques capable of detecting the minimum concentrations of the multiple pesticide residues. The aims of this paper were, first, to determine the effectiveness of the use of Supercritical Fluid Extraction (SFE) and Supercritical Fluid Chromatography (SFC) techniques in the analysis of the levels of pesticide residues in canned foods, vegetables, and fruits; and second, to contribute to the promotion of consumer safety by excluding pesticide residue contamination from markets. Fifteen different types of imported canned and frozen fruit and vegetable samples obtained from local food markets in Houston were investigated. The major types of pesticides tested were pyrethroids, herbicides, fungicides, and carbamates. By using these techniques, the overall data showed that 60.82% of the food samples had no detectable residues of the pesticides under investigation. On the other hand, 39.15% of the food samples were contaminated by four different pyrethroid residues, with levels (± RSD%) ranging from 0.03 ± 0.005 to 0.05 ± 0.03 ppm, of which most of the pyrethroid residues were detected in frozen vegetables and strawberry jam. Herbicide residues in test samples ranged from 0.03 ± 0.005 to 0.8 ± 0.01 ppm. Five different fungicides, ranging from 0.05 ± 0.02 to 0.8 ± 0.1 ppm, were found in five different frozen vegetable samples. Carbamate residues were not detected in 60% of the investigated food samples. It was concluded that SFE and SFC techniques were accurate, reliable, less time consuming, and cost effective in the analysis of imported canned foods, fruits, and vegetables and are recommended for the monitoring of pesticide contaminations.

  11. Residues in food derived from animals

    International Nuclear Information System (INIS)

    Grossklaus, D.

    1989-01-01

    The first chapter presents a survey of fundamentals and methods of the detection and analysis of residues in food derived from animals, also referring to the resulting health hazards to man, and to the relevant legal provisions. The subsequent chapters have been written by experts of the Federal Health Office, each dealing with particular types of residues such as those of veterinary drugs, additives to animal feeds, pesticide residues, and with environmental pollutants and the contamination of animal products with radionuclides. (MG) With 35 figs., 61 tabs [de

  12. Effect of residual stress on the integrity of a branch connection

    International Nuclear Information System (INIS)

    Law, M.; Kirstein, O.; Luzin, V.

    2012-01-01

    A new connection to an existing gas pipeline was made by hot-tapping, welding directly onto a pressurised pipeline. The welds were not post-weld heat treated, causing significant residual stresses. The critical weld had residual stresses determined by neutron diffraction using ANSTO's residual stress diffractometer, Kowari. The maximum measured residual stress (290 MPa) was 60% of the yield strength. The magnitudes of errors from a number of sources were estimated. An integrity assessment of the welded branch connection was performed with the measured residual stress values and with residual stress distributions from the BS 7910 and API 579 analysis codes. Analysis using estimates of residual stress from API 579 overestimated the critical crack size. Highlights: ► Residual stresses were measured by neutron diffraction in a thick section, non post-weld heat treated ferritic weld. ► There is little published data on these welds. ► The work compares the measured residual stresses with code-based residual stress distributions.

  13. [Do we always correctly interpret the results of statistical nonparametric tests].

    Science.gov (United States)

    Moczko, Jerzy A

    2014-01-01

    The Mann-Whitney, Wilcoxon, Kruskal-Wallis and Friedman tests form a group of tests commonly used to analyze clinical and laboratory data. These tests are considered to be extremely flexible and their asymptotic relative efficiency exceeds 95 percent. Compared with the corresponding parametric tests, they do not require checking conditions such as normality of the data distribution, homogeneity of variance, the lack of correlation between means and standard deviations, etc. They can be used with both interval and ordinal scales. Using the Mann-Whitney test as an example, the article shows that treating the choice of these four nonparametric tests as a kind of gold standard does not in every case lead to correct inference.
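
    A concrete reminder of what such a test does and does not tell us: the Mann-Whitney U test compares whole distributions, so reading a significant result as a difference in medians (let alone means) requires additional assumptions about the shapes of the two groups. A small SciPy sketch with hypothetical data:

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(0)
      a = rng.exponential(scale=1.0, size=40)        # skewed group A
      b = rng.exponential(scale=1.0, size=40) + 0.5  # group B shifted upwards
      stat, p = mannwhitneyu(a, b, alternative="two-sided")
      # A small p-value indicates that one group tends to yield larger values
      # (stochastic dominance); it is a statement about medians or means only
      # under additional assumptions about the two distribution shapes.
      print(stat, p)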

  14. Multi-pesticides residue analysis of grains using modified magnetic nanoparticle adsorbent for facile and efficient cleanup.

    Science.gov (United States)

    Liu, Zhenzhen; Qi, Peipei; Wang, Xiangyun; Wang, Zhiwei; Xu, Xiahong; Chen, Wenxue; Wu, Liyu; Zhang, Hu; Wang, Qiang; Wang, Xinquan

    2017-09-01

    A facile, rapid sample pretreatment method was developed based on magnetic nanoparticles for multi-pesticide residue analysis of grains. Magnetite (Fe3O4) nanoparticles modified with 3-(N,N-diethylamino)propyltrimethoxysilane (Fe3O4-PSA) and commercial C18 were selected as the cleanup adsorbents to remove the target interferences of the matrix, such as fatty acids and non-polar compounds. Rice was used as the representative grain sample for method optimization. The amounts of Fe3O4-PSA and C18 were systematically investigated to select suitable purification conditions, and the simultaneous determination of 50 pesticides and 8 related metabolites in rice was established by liquid chromatography-tandem mass spectrometry. Under the optimal conditions, the method validation was performed including linearity, sensitivity, matrix effect, recovery and precision, which all satisfied the requirements for pesticide residue analysis. Compared to the conventional QuEChERS method with a non-magnetic material as cleanup adsorbent, the present method can save 30% of the pretreatment time, making high-throughput analysis possible. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Nonparametric predictive inference for combined competing risks data

    International Nuclear Information System (INIS)

    Coolen-Maturi, Tahani; Coolen, Frank P.A.

    2014-01-01

    The nonparametric predictive inference (NPI) approach for competing risks data has recently been presented, in particular addressing the question due to which of the competing risks the next unit will fail, and also considering the effects of unobserved, re-defined, unknown or removed competing risks. In this paper, we introduce how the NPI approach can be used to deal with situations where units are not all at risk from all competing risks. This may typically occur if one combines information from multiple samples, which can, e.g. be related to further aspects of units that define the samples or groups to which the units belong or to different applications where the circumstances under which the units operate can vary. We study the effect of combining the additional information from these multiple samples, so effectively borrowing information on specific competing risks from other units, on the inferences. Such combination of information can be relevant to competing risks scenarios in a variety of application areas, including engineering and medical studies

  16. Identifying Plant Part Composition of Forest Logging Residue Using Infrared Spectral Data and Linear Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Gifty E. Acquah

    2016-08-01

    As new markets, technologies and economies evolve in the low-carbon bioeconomy, forest logging residue, a largely untapped renewable resource, will play a vital role. The feedstock can however be variable depending on plant species and plant part component. This heterogeneity can influence the physical, chemical and thermochemical properties of the material, and thus the final yield and quality of products. Although it is challenging to control the compositional variability of a batch of feedstock, it is feasible to monitor this heterogeneity and make the necessary changes in process parameters. Such a system would be a first step towards optimization, quality assurance and cost-effectiveness of processes in the emerging biofuel/chemical industry. The objective of this study was therefore to qualitatively classify forest logging residue made up of different plant parts using both near infrared spectroscopy (NIRS) and Fourier transform infrared spectroscopy (FTIRS) together with linear discriminant analysis (LDA). Forest logging residue harvested from several Pinus taeda (loblolly pine) plantations in Alabama, USA, was classified into three plant part components: clean wood, wood and bark, and slash (i.e., limbs and foliage). Five-fold cross-validated linear discriminant functions had classification accuracies of over 96% for both NIRS- and FTIRS-based models. An extra factor/principal component (PC) was however needed to achieve this in FTIRS modeling. Analysis of factor loadings of both NIR and FTIR spectra showed that the statistically different amounts of cellulose in the three plant part components of logging residue contributed to their initial separation. This study demonstrated that NIR or FTIR spectroscopy coupled with PCA and LDA has the potential to be used as a high-throughput tool in classifying the plant part makeup of a batch of forest logging residue feedstock. Thus, NIR/FTIR could be employed as a tool to rapidly probe/monitor the variability
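
    The classification step itself is standard and easy to reproduce in outline. The snippet below is a schematic PCA-plus-LDA pipeline on synthetic "spectra"; the dimensions, number of components and injected class effect are placeholders, not values from the study.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      # Hypothetical spectra (rows = samples, columns = wavenumbers) and labels
      # 0 = clean wood, 1 = wood and bark, 2 = slash.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(150, 300))
      y = np.repeat([0, 1, 2], 50)
      X += y[:, None] * 0.15   # class-dependent offset standing in for cellulose differences

      model = make_pipeline(PCA(n_components=8), LinearDiscriminantAnalysis())
      print(cross_val_score(model, X, y, cv=5).mean())   # five-fold cross-validated accuracy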

  17. Two non-parametric methods for derivation of constraints from radiotherapy dose–histogram data

    International Nuclear Information System (INIS)

    Ebert, M A; Kennedy, A; Joseph, D J; Gulliford, S L; Buettner, F; Foo, K; Haworth, A; Denham, J W

    2014-01-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose–histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization. (note)
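
    In outline, the ROC-based derivation reduces to scanning candidate cut-points and keeping the one that best separates patients with and without the complication. The sketch below uses Youden's J on synthetic data and omits the resampling-based multiple-test correction described in the note; all names and parameters are hypothetical.

      import numpy as np
      from sklearn.metrics import roc_curve

      def roc_cut_point(dvh_values, complication):
          """Pick the DVH value (e.g. % volume above a dose level) that best separates
          patients with and without a complication, via the point on the ROC curve
          maximising Youden's J = sensitivity + specificity - 1."""
          fpr, tpr, thresholds = roc_curve(complication, dvh_values)
          return thresholds[np.argmax(tpr - fpr)]

      # Hypothetical cohort: complication risk increases with % rectal volume above some dose.
      rng = np.random.default_rng(0)
      vol = rng.uniform(0, 60, size=300)
      complication = (rng.uniform(size=300) < 1 / (1 + np.exp(-(vol - 35) / 5))).astype(int)
      print(roc_cut_point(vol, complication))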

  18. [Situation analysis and standard formulation of pesticide residues in traditional Chinese medicines].

    Science.gov (United States)

    Yang, Wan-Zhen; Kang, Chuan-Zhi; Ji, Rui-Feng; Zhou, L I; Wang, Sheng; Li, Zhen-Hao; Ma, Zhong-Hua; Guo, Lan-Ping

    2017-06-01

    The Chinese Pharmacopoeia provides nine pesticide Maximum Residue Limits (MRLs) for traditional Chinese medicines (TCMs), but the number of pesticides used in production is far larger than the number listed in the pharmacopoeia. This lack of standards makes it hard to reflect the real situation of pesticide residues in TCMs correctly. This paper analyzes pesticide residue data for TCMs from 7 089 items in 140 reports, and judges the exceedance rate of pesticides in TCMs using the MRLs of the European Pharmacopoeia, which are widely accepted in many countries. The results show that: ① pesticide residues in 18 kinds of TCMs are higher than the MRLs, while those in 137 kinds are below the MRLs, such as Atractylodis Macrocephalae Rhizoma, Menthae Haplocalycis Herba and Fritillariae Thunbergii Bulbus; the average exceedance rate over all TCMs is 1.72%, and the average exceedance rates for organochlorines, organophosphates and pyrethroids are 2.26%, 1.51% and 0.37%, respectively. ② The average exceedance rate across pesticides is 2.00%; for 8.33% of pesticides the exceedance rate is more than 5%, for 18.75% it is between 1% and 5%, and for another 18.75% it is between 0% and 1%; the remaining 29 kinds of pesticides, accounting for 60.42%, show no exceedance. Some reports, such as those of the Greenpeace organization, exaggerated the pesticide residues in TCMs, but the pesticide residue question is still worthy of attention. We therefore propose to amend the Chinese Pharmacopoeia pesticide residue standards, to increase the number of pesticides covered for TCMs in production while retaining the existing types of pesticide residues, and to strengthen systematic research on pesticide residues in TCMs, providing a basis for standard setting and promoting import and export trade in TCMs. Copyright© by the Chinese Pharmaceutical Association.

  19. Bayesian nonparametric generative models for causal inference with missing at random covariates.

    Science.gov (United States)

    Roy, Jason; Lum, Kirsten J; Zeldow, Bret; Dworkin, Jordan D; Re, Vincent Lo; Daniels, Michael J

    2018-03-26

    We propose a general Bayesian nonparametric (BNP) approach to causal inference in the point treatment setting. The joint distribution of the observed data (outcome, treatment, and confounders) is modeled using an enriched Dirichlet process. The combination of the observed data model and causal assumptions allows us to identify any type of causal effect-differences, ratios, or quantile effects, either marginally or for subpopulations of interest. The proposed BNP model is well-suited for causal inference problems, as it does not require parametric assumptions about the distribution of confounders and naturally leads to a computationally efficient Gibbs sampling algorithm. By flexibly modeling the joint distribution, we are also able to impute (via data augmentation) values for missing covariates within the algorithm under an assumption of ignorable missingness, obviating the need to create separate imputed data sets. This approach for imputing the missing covariates has the additional advantage of guaranteeing congeniality between the imputation model and the analysis model, and because we use a BNP approach, parametric models are avoided for imputation. The performance of the method is assessed using simulation studies. The method is applied to data from a cohort study of human immunodeficiency virus/hepatitis C virus co-infected patients. © 2018, The International Biometric Society.

  20. Pesticide residue analysis of soil, water, and grain of IPM basmati rice.

    Science.gov (United States)

    Arora, Sumitra; Mukherji, Irani; Kumar, Aman; Tanwar, R K

    2014-12-01

    The main aim of the present investigations was to compare the pesticide load in integrated pest management (IPM) and non-IPM rice crops. Harvest samples of Basmati rice grain, soil, and irrigation water from IPM and non-IPM field trials at villages in northern India were analyzed using a multi-pesticide residue method. The field experiments were conducted for three consecutive years (2008-2011) for the successful validation of the modules synthesized for Basmati rice at these locations. Residues of tricyclazole, propiconazole, hexaconazole, lambda-cyhalothrin, pretilachlor, chlorpyrifos, DDVP, carbendazim, and imidacloprid were analyzed from two locations, Dudhli village of Dehradun, Uttarakhand, and Saboli and Aterna villages of Sonepat, Haryana. The pesticide residues were observed below detectable limit (BDL) (water samples (2008-09). Residues of tricyclazole and carbendazim, analyzed from the same locations, revealed pesticide residues as BDL (water samples (2009-2010). The residues of tricyclazole, propiconazole, chlorpyrifos, hexaconazole, pretilachlor, and λ-cyhalothrin were also found as BDL (water samples (<0.001-0.05 μg/L) (2010-2011).