WorldWideScience

Sample records for nonparametric empirical analysis

  1. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  2. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers

    Directory of Open Access Journals (Sweden)

    Stochl Jan

    2012-06-01

    Abstract Background Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than, for example, the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Methods Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12-item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. Results and conclusions After an initial analysis example in which we select items by phrasing (six positively versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12), when binary scored, were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech’s “well-being” and “distress” clinical scales). An illustration of ordinal item analysis

  3. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers.

    Science.gov (United States)

    Stochl, Jan; Jones, Peter B; Croudace, Tim J

    2012-06-11

    Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12-item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. After an initial analysis example in which we select items by phrasing (six positively versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12), when binary scored, were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis confirmed that all 14 positively worded items of the Warwick-Edinburgh Mental
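
    Mokken scalability is usually summarized by Loevinger's H coefficient, which compares observed Guttman errors with the errors expected under item independence. The following is a minimal Python sketch of the scale-level H for binary items; the function name and the simulated GHQ-12-like data are illustrative assumptions, not taken from the paper, and dedicated software (such as the R package mokken) implements the full procedure including monotonicity checks.

```python
import numpy as np

def mokken_scale_H(X):
    """Loevinger's scale H for a binary item matrix X (persons x items):
    1 - (observed Guttman errors) / (errors expected under independence).
    H >= 0.3 is the conventional threshold for a (weak) Mokken scale."""
    X = np.asarray(X)
    n, k = X.shape
    p = X.mean(axis=0)                  # item popularities
    F_sum = E_sum = 0.0
    for i in range(k):
        for j in range(i + 1, k):
            easy, hard = (i, j) if p[i] >= p[j] else (j, i)
            # Guttman error: the harder item endorsed but the easier one not
            F_sum += np.sum((X[:, easy] == 0) & (X[:, hard] == 1))
            E_sum += n * (1 - p[easy]) * p[hard]
    return 1.0 - F_sum / E_sum

# toy usage: simulated binary responses driven by one latent trait
rng = np.random.default_rng(0)
theta = rng.normal(size=500)
X = (theta[:, None] - np.linspace(-1, 1, 12) + rng.logistic(size=(500, 12)) > 0).astype(int)
print(round(mokken_scale_H(X), 2))
```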

  4. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.
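
    As a hedged illustration of how a Bayesian nonparametric prior relaxes the normality assumption, the sketch below draws study effect sizes from a truncated stick-breaking (Dirichlet process) mixture of normals. All hyperparameter values are illustrative assumptions, and this is a prior simulation for intuition, not the authors' estimation procedure.

```python
import numpy as np

def draw_dp_mixture_effects(n_studies, alpha=1.0, trunc=50, seed=0):
    """Draw study effect sizes from a truncated stick-breaking (Dirichlet
    process) mixture of normals; hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=trunc)                     # stick breaks
    w = betas * np.concatenate(([1.0], np.cumprod(1 - betas)[:-1]))
    atoms = rng.normal(0.0, 1.0, size=trunc)                     # cluster means
    z = rng.choice(trunc, size=n_studies, p=w / w.sum())         # cluster labels
    return rng.normal(atoms[z], 0.2)                             # within-cluster spread

effects = draw_dp_mixture_effects(200, alpha=2.0)  # typically multimodal/skewed
```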

  5. Nonparametric Bayes analysis of social science data

    Science.gov (United States)

    Kunihama, Tsuyoshi

    Social science data often contain complex characteristics that standard statistical methods fail to capture. Social surveys assign many questions to respondents, which often consist of mixed-scale variables. Each of the variables can follow a complex distribution outside parametric families and associations among variables may have more complicated structures than standard linear dependence. Therefore, it is not straightforward to develop a statistical model which can approximate structures well in the social science data. In addition, many social surveys have collected data over time and therefore we need to incorporate dynamic dependence into the models. Also, it is standard to observe a massive number of missing values in the social science data. To address these challenging problems, this thesis develops flexible nonparametric Bayesian methods for the analysis of social science data. Chapter 1 briefly explains the background and motivation of the projects in the following chapters. Chapter 2 develops a nonparametric Bayesian modeling of temporal dependence in large sparse contingency tables, relying on a probabilistic factorization of the joint pmf. Chapter 3 proposes nonparametric Bayes inference on conditional independence with conditional mutual information used as a measure of the strength of conditional dependence. Chapter 4 proposes a novel Bayesian density estimation method in social surveys with complex designs where there is a gap between sample and population. We correct for the bias by adjusting mixture weights in Bayesian mixture models. Chapter 5 develops a nonparametric model for mixed-scale longitudinal surveys, in which various types of variables can be induced through latent continuous variables and dynamic latent factors lead to flexibly time-varying associations among variables.

  6. Local Component Analysis for Nonparametric Bayes Classifier

    CERN Document Server

    Khademi, Mahmoud; Safayani, Mehran

    2010-01-01

    The decision boundaries of the Bayes classifier are optimal because they lead to the maximum probability of correct decision. This means that if we knew the prior probabilities and the class-conditional densities, we could design a classifier which gives the lowest probability of error. However, in classification based on nonparametric density estimation methods such as Parzen windows, the decision regions depend on the choice of parameters such as the window width. Moreover, these methods suffer from the curse of dimensionality of the feature space and the small sample size problem, which severely restrict their practical applications. In this paper, we address these problems by introducing a novel dimension reduction and classification method based on local component analysis. In this method, by adopting an iterative cross-validation algorithm, we simultaneously estimate the optimal transformation matrices (for dimension reduction) and classifier parameters based on local information. The proposed method can classify the data with co...
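
    To make the window-width dependence concrete, here is a minimal Parzen-window plug-in Bayes classifier in Python. This is the generic textbook construction the abstract refers to, not the paper's local component analysis method; the data and bandwidth h are illustrative.

```python
import numpy as np

def parzen_density(x, data, h):
    """Gaussian Parzen-window density estimate at a single point x."""
    n, d = data.shape
    diffs = (data - x) / h
    kernels = np.exp(-0.5 * np.sum(diffs**2, axis=1)) / ((2 * np.pi) ** (d / 2) * h**d)
    return kernels.mean()

def parzen_bayes_classify(x, class_data, priors, h=0.5):
    """Plug-in Bayes rule: argmax of prior times Parzen-estimated density.
    The resulting decision regions change with the window width h."""
    scores = [p * parzen_density(x, Xc, h) for Xc, p in zip(class_data, priors)]
    return int(np.argmax(scores))

# toy usage: two Gaussian classes in 2-D (illustrative data)
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(100, 2))
X1 = rng.normal(2.0, 1.0, size=(100, 2))
print(parzen_bayes_classify(np.array([1.8, 2.1]), [X0, X1], [0.5, 0.5]))
```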

  7. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric Toolbox. Empirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN. The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric
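
    For intuition on the machinery the book builds on, here is a sketch of Owen's empirical likelihood ratio for a mean in the uncensored case; the censored-data versions treated in the book constrain Kaplan-Meier-type weights and are substantially more involved. The sketch assumes mu lies strictly inside the sample range, so the constraint set is non-empty.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_logratio_mean(x, mu):
    """-2 log empirical likelihood ratio for H0: E[X] = mu (Owen's EL,
    no censoring). Requires min(x) < mu < max(x)."""
    x = np.asarray(x, float)
    n = len(x)
    d = x - mu
    def score(lam):                      # derivative of the log-EL in lambda
        return np.sum(d / (1 + lam * d))
    lo = (1 / n - 1) / d.max() + 1e-10   # bracket keeping all weights in (0, 1)
    hi = (1 / n - 1) / d.min() - 1e-10
    lam = brentq(score, lo, hi)          # Lagrange multiplier
    return 2 * np.sum(np.log1p(lam * d))

x = np.random.default_rng(0).exponential(size=50)
stat = el_logratio_mean(x, mu=1.0)
print(stat, stat < chi2.ppf(0.95, df=1))  # Wilks: ~chi-square(1) under H0
```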

  8. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the
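
    Data envelopment analysis reduces to one linear programme per production unit. Below is a hedged sketch of the standard input-oriented, constant-returns (CCR) programme using SciPy; it is a generic DEA formulation with toy numbers, not necessarily the exact model specified in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented, constant-returns DEA efficiency of unit o.
    X: (n_units x n_inputs) inputs, Y: (n_units x n_outputs) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]               # minimise theta
    A_in = np.c_[-X[o], X.T]                  # sum_j lam_j x_ij <= theta * x_io
    A_out = np.c_[np.zeros(s), -Y.T]          # sum_j lam_j y_rj >= y_ro
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]                           # 1.0 means technically efficient

# toy usage: 4 farms, 2 inputs (e.g. land, pesticides), 1 output
X = np.array([[2.0, 1.0], [3.0, 3.0], [1.5, 2.0], [4.0, 1.5]])
Y = np.array([[1.0], [1.2], [0.9], [1.4]])
print([round(dea_ccr_input(X, Y, o), 3) for o in range(4)])
```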

  9. A Nonparametric Analogy of Analysis of Covariance

    Science.gov (United States)

    Burnett, Thomas D.; Barr, Donald R.

    1977-01-01

    A nonparametric test of the hypothesis of no treatment effect is suggested for a situation where measures of the severity of the condition treated can be obtained and ranked both pre- and post-treatment. The test allows the pre-treatment rank to be used as a concomitant variable. (Author/JKS)
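
    A simple way to realize this idea, in the spirit of Quade's rank analysis of covariance rather than necessarily the authors' exact statistic: rank both measures, remove the linear effect of the pre-treatment rank from the post-treatment ranks, and test the residuals across treatment groups. Function name and data are illustrative.

```python
import numpy as np
from scipy import stats

def rank_ancova(pre, post, groups):
    """Rank-based analysis of covariance: regress post-treatment ranks on
    pre-treatment ranks, then test whether residuals differ across groups."""
    r_pre = stats.rankdata(pre)
    r_post = stats.rankdata(post)
    slope, intercept, *_ = stats.linregress(r_pre, r_post)
    resid = r_post - (intercept + slope * r_pre)   # covariate-adjusted ranks
    groups = np.asarray(groups)
    samples = [resid[groups == g] for g in np.unique(groups)]
    return stats.kruskal(*samples)

rng = np.random.default_rng(5)
pre = rng.normal(size=30)
grp = np.repeat([0, 1, 2], 10)
post = pre + 0.8 * grp + rng.normal(0, 0.5, 30)    # toy treatment effect
print(rank_ancova(pre, post, grp))
```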

  10. Lottery spending: a non-parametric analysis.

    Science.gov (United States)

    Garibaldi, Skip; Frisoli, Kayla; Ke, Li; Lim, Melody

    2015-01-01

    We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  12. Poverty and life cycle effects: A nonparametric analysis for Germany

    OpenAIRE

    Stich, Andreas

    1996-01-01

    Most empirical studies on poverty consider the extent of poverty either for the entire society or for separate groups like elderly people. However, these papers do not show what the situation looks like for persons of a certain age. In this paper poverty measures depending on age are derived using the joint density of income and age. The density is nonparametrically estimated by weighted Gaussian kernel density estimation. Applying the conditional density of income to several poverty measures ...

  13. Investigating the cultural patterns of corruption: A nonparametric analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2011-01-01

    By using a sample of 77 countries our analysis applies several nonparametric techniques in order to reveal the link between national culture and corruption. Based on Hofstede’s cultural dimensions and the corruption perception index, the results reveal that countries with higher levels of corruption tend to have higher power distance and collectivism values in their society.

  14. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression … to avoid this problem. The main objective is to investigate the applicability of the nonparametric kernel regression method in applied production analysis. The focus of the empirical analyses included in this thesis is the agricultural sector in Poland. Data on Polish farms are used to investigate … practically and politically relevant problems and to illustrate how nonparametric regression methods can be used in applied microeconomic production analysis both in panel data and cross-section data settings. The thesis consists of four papers. The first paper addresses problems of parametric …

  15. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  16. Nonparametric inference procedures for multistate life table analysis.

    Science.gov (United States)

    Dow, M M

    1985-01-01

    Recent generalizations of the classical single state life table procedures to the multistate case provide the means to analyze simultaneously the mobility and mortality experience of 1 or more cohorts. This paper examines fairly general nonparametric combinatorial matrix procedures, known as quadratic assignment, as an analysis technique for various transitional patterns commonly generated by cohorts over the life cycle course. To some degree, the output from a multistate life table analysis suggests inference procedures. In his discussion of multistate life table construction features, the author focuses on the matrix formulation of the problem. He then presents several examples of the proposed nonparametric procedures. Data for the mobility and life expectancies at birth matrices come from the 458 member Cayo Santiago rhesus monkey colony. The author's matrix combinatorial approach to hypothesis testing may prove to be a useful inferential strategy in several multidimensional demographic areas.
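
    Quadratic assignment tests permute the rows and columns of one matrix jointly to build a null distribution for a matrix-level statistic. A minimal sketch of that permutation logic follows; it is a generic QAP correlation test for illustration, not the author's exact procedure, and all names are assumptions.

```python
import numpy as np

def qap_test(A, B, n_perm=2000, seed=0):
    """QAP permutation test for association between two square matrices:
    correlate off-diagonal entries, then re-correlate under joint
    row/column permutations of B to build the null distribution."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    off = ~np.eye(n, dtype=bool)
    corr = lambda M1, M2: np.corrcoef(M1[off], M2[off])[0, 1]
    observed = corr(A, B)
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        if abs(corr(A, B[np.ix_(p, p)])) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)   # two-sided p-value
```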

  17. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb … parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used … by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true" …

  18. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb … results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used … neither the Cobb-Douglas function nor the Translog function are consistent with the “true” relationship between the inputs and the output in our data set. We solve this problem by using non-parametric regression. This approach delivers reasonable results, which are on average not too different from the results of the parametric …

  19. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  20. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    Science.gov (United States)

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  1. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb … parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used … to estimate production functions without the specification of a functional form. Therefore, they avoid possible misspecification errors due to the use of an unsuitable functional form. In this paper, we use parametric and non-parametric methods to identify the optimal size of Polish crop farms …

  2. Glaucoma Monitoring in a Clinical Setting Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  3. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb-Douglas or the Translog production function is used. However, the specification of a functional form for the production function involves the risk of specifying a functional form that is not similar to the “true” relationship between the inputs and the output. This misspecification might result in biased estimation results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used …

  4. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for assessment of agricultural efficiency and employ them to Lithuanian family farms. In particular, we focus on three objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend … relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques … of stochasticity associated with Lithuanian family farm performance. The former technique showed that the farms differed in terms of the mean values and variance of the efficiency scores over time with some clear patterns prevailing throughout the whole research period. The fuzzy Free Disposal Hull showed …

  5. Two-step change point estimation in nonparametric regression model and the empirical analysis

    Institute of Scientific and Technical Information of China (English)

    Zhao, Wenzhi; Xia, Zhiming; He, Feiyue

    2016-01-01

    Two-step estimators for a change point in nonparametric regression are proposed. In the first step, an initial estimator is obtained by the local linear smoothing method. In the second step, the final estimator is obtained by the CUSUM method on a closed neighborhood of the initial estimator. A simulation study shows that the proposed estimator is efficient. An estimator for the jump size is also obtained. Furthermore, experimental results using historical data on Nile river discharges, exchange rate data of USD against RMB, and global temperature data for the northern hemisphere show that the proposed method is also practical in applications.
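
    A minimal sketch of the second step (CUSUM localization of a mean change point) follows. In the paper this is applied only in a neighborhood of a local linear pilot estimate; that first step is omitted here, and the data are illustrative.

```python
import numpy as np

def cusum_changepoint(x):
    """Locate a single mean change point with the CUSUM statistic and
    estimate the jump size (step 2 of a two-step scheme)."""
    x = np.asarray(x, float)
    n = len(x)
    k = np.arange(1, n)
    stat = np.abs(np.cumsum(x)[:-1] - k / n * x.sum())  # CUSUM at each split
    tau = int(np.argmax(stat)) + 1                      # size of first segment
    jump = x[tau:].mean() - x[:tau].mean()              # estimated jump size
    return tau, jump

# toy usage: mean shifts from 0 to 1 at index 60
x = np.r_[np.zeros(60), np.ones(40)] + np.random.default_rng(2).normal(0, 0.5, 100)
print(cusum_changepoint(x))
```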

  6. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlight the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  7. Nonparametric Cointegration Analysis of Fractional Systems With Unknown Integration Orders

    DEFF Research Database (Denmark)

    Nielsen, Morten Ørregaard

    2009-01-01

    In this paper a nonparametric variance ratio testing approach is proposed for determining the number of cointegrating relations in fractionally integrated systems. The test statistic is easily calculated without prior knowledge of the integration order of the data, the strength of the cointegrating...

  8. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move, but that this dependence vanishes after 2-3 years.

  10. ANALYSIS OF TIED DATA: AN ALTERNATIVE NON-PARAMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    I. C. A. OYEKA

    2012-02-01

    This paper presents a non-parametric statistical method of analyzing two-sample data that makes provision for the possibility of ties in the data. A test statistic is developed and shown to be free of the effect of any possible ties in the data. An illustrative example is provided and the method is shown to compare favourably with its competitor, the Mann-Whitney test, and to be more powerful than the latter when there are ties.

  11. A Bayesian nonparametric method for prediction in EST analysis

    Directory of Open Access Journals (Sweden)

    Prünster Igor

    2007-09-01

    Abstract Background Expressed sequence tag (EST) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed with sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; b) the number of new unique genes to be observed in a future sample; c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample.
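
    For contrast with the Bayesian nonparametric estimator, the classical frequentist predictor of new-gene discoveries is the Good-Toulmin alternating series, whose instability for future samples larger than the observed one (t > 1) is exactly the drawback the abstract mentions. A sketch with toy frequency counts:

```python
import numpy as np

def good_toulmin_new_genes(freqs, t):
    """Good-Toulmin estimate of the number of new genes appearing in a
    future sample t times the size of the observed one. freqs[k-1] is the
    number of genes seen exactly k times; the alternating series grows
    unstable for t > 1, the frequentist drawback noted in the abstract."""
    freqs = np.asarray(freqs, float)
    k = np.arange(1, len(freqs) + 1)
    return np.sum((-1.0) ** (k + 1) * t**k * freqs)

print(good_toulmin_new_genes([120, 45, 20, 9, 4], t=0.5))  # toy counts
```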

  12. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  13. Local kernel nonparametric discriminant analysis for adaptive extraction of complex structures

    Science.gov (United States)

    Li, Quanbao; Wei, Fajie; Zhou, Shenghan

    2017-05-01

    Linear discriminant analysis (LDA) is one of the most popular methods for linear feature extraction. It usually performs well when the global data structure is consistent with the local data structure. Other frequently used approaches to feature extraction usually require linearity, independence, or large-sample conditions. However, in real-world applications, these assumptions are not always satisfied or cannot be tested. In this paper, we introduce an adaptive method, local kernel nonparametric discriminant analysis (LKNDA), which integrates conventional discriminant analysis with nonparametric statistics. LKNDA is adept at identifying both complex nonlinear structures and the ad hoc rule. Six simulation cases demonstrate that LKNDA has the advantages of both parametric and nonparametric algorithms and higher classification accuracy. A quartic unilateral kernel function may provide better prediction robustness than other functions. LKNDA gives an alternative solution for discriminant cases with complex nonlinear feature extraction or unknown feature extraction. Finally, the application of LKNDA to the complex feature extraction of financial market activities is proposed.

  14. Comparison of Rank Analysis of Covariance and Nonparametric Randomized Blocks Analysis.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    The relative power of three possible experimental designs under the condition that data are to be analyzed by nonparametric techniques; the comparison of the power of each nonparametric technique to its parametric analogue; and the comparison of relative powers using nonparametric and parametric techniques are discussed. The three nonparametric…

  15. Spline Nonparametric Regression Analysis of Stress-Strain Curve of Confined Concrete

    Directory of Open Access Journals (Sweden)

    Tavio Tavio

    2008-01-01

    Due to enormous uncertainties in confinement models associated with the maximum compressive strength and ductility of concrete confined by rectilinear ties, the implementation of spline nonparametric regression analysis is proposed herein as an alternative approach. The statistical evaluation is carried out based on 128 large-scale column specimens of either normal- or high-strength concrete tested under uniaxial compression. The main advantage of this kind of analysis is that it can be applied when the trend of the relation between predictor and response variables is not obvious. The error in the analysis can, therefore, be minimized so that it does not depend on the assumption of a particular shape of the curve. This provides higher flexibility in the application. The results of the statistical analysis indicate that the stress-strain curves of confined concrete obtained from the spline nonparametric regression analysis prove to be in good agreement with the experimental curves available in the literature
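
    A hedged sketch of fitting a smoothing spline to stress-strain-like data with SciPy follows; the synthetic curve shape, noise level, and smoothing parameter are illustrative assumptions, and the paper's spline regression may differ in basis and penalty.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
strain = np.linspace(1e-4, 0.02, 60)                  # increasing abscissa
stress = 30e3 * strain * np.exp(-strain / 0.006)      # illustrative rise-and-fall shape
stress += rng.normal(0, 0.3, strain.size)             # measurement noise

fit = UnivariateSpline(strain, stress, s=strain.size * 0.3**2)  # s ~ n * sigma^2
peak_strain = strain[np.argmax(fit(strain))]          # strain at peak stress
print(round(peak_strain, 4))
```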

  16. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  17. Multilevel Latent Class Analysis: Parametric and Nonparametric Models

    Science.gov (United States)

    Finch, W. Holmes; French, Brian F.

    2014-01-01

    Latent class analysis is an analytic technique often used in educational and psychological research to identify meaningful groups of individuals within a larger heterogeneous population based on a set of variables. This technique is flexible, encompassing not only a static set of variables but also longitudinal data in the form of growth mixture…

  18. Applications of non-parametric statistics and analysis of variance on sample variances

    Science.gov (United States)

    Myers, R. H.

    1981-01-01

    Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made to survey what can be used, to offer recommendations as to when each would be applicable, and to compare the methods, when possible, with the usual normal-theory procedures that are available for the Gaussian analog. It is important here to point out the hypotheses that are being tested, the assumptions that are being made, and the limitations of the nonparametric procedures. The appropriateness of doing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects. On the surface this would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.

  19. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression function. However, the a priori specification of a functional form involves the risk of choosing one that is not similar to the “true” but unknown relationship between the regressors and the dependent variable. This problem, known as parametric misspecification, can result in biased parameter estimates … and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences …

  20. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    … the Multi-Directional Efficiency Analysis approach, (iii) to account for uncertainties via the use of probabilistic and fuzzy measures. Therefore, the thesis encompasses six papers dedicated to (the combinations of) these objectives. One of the main contributions of this thesis is a number of extensions … relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques …

  1. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Alkier Gildberg, Frederik; Bradley, Stephen; Tingleff, Ellen Boldrup

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers as to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirical Testing Thematic Analysis, a step-by-step approach to thematic text analysis, discussing strengths and weaknesses, so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  2. Empirical analysis of consumer behavior

    NARCIS (Netherlands)

    Huang, Yufeng

    2015-01-01

    This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. In the first essay, he investigates the role of a consumer's skill of product usage, and its imperfect transferability across brands, in her product choice. It shows that

  5. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data.

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-04-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in their data. To verify this conjecture, we compare the fit of these models to the Social Problem Solving Inventory-Revised, whose scales were designed to be unidimensional. A calibration and a cross-validation sample of new observations were used. We also included the following parametric models in the comparison: Bock's nominal model, Masters' partial credit model, and Thissen and Steinberg's extension of the latter. All models were estimated using full information maximum likelihood. We also included in the comparison a normal ogive model version of Samejima's model estimated using limited information estimation. We found that for all scales Samejima's model outperformed all other parametric IRT models in both samples, regardless of the estimation method employed. The non-parametric model outperformed all parametric models in the calibration sample. However, the graded model outperformed MFS in the cross-validation sample in some of the scales. We advocate employing the graded model estimated using limited information methods in modeling Likert-type data, as these methods are more versatile than full information methods to capture the multidimensionality that is generally present in personality data.

  6. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the generalized correlation analysis, being non-parametric, could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  7. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    Modeling complex time-course patterns is a challenging issue in microarray studies due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the generalized correlation analysis, being non-parametric, could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  9. Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images

    Science.gov (United States)

    2010-04-01

    [Indexed text for this record consists only of table and figure residue: PSNR tables comparing KSVD and BPFA for image denoising (patch size 8×8) and RGB image interpolation (patch size 7×7) on standard test images (Cameraman, House, Peppers, Lena, Barbara, Boats, Fingerprint, Couple, Hill), plus a reference fragment: T. Ferguson, A Bayesian analysis of some nonparametric problems, Annals of Statistics, 1:209-230.]

  10. An adaptive nonparametric method in benchmark analysis for bioassay and environmental studies.

    Science.gov (United States)

    Bhattacharya, Rabi; Lin, Lizhen

    2010-12-01

    We present a novel nonparametric method for bioassay and benchmark analysis in risk assessment, which averages isotonic MLEs based on disjoint subgroups of dosages. The asymptotic theory for the methodology is derived, showing that the MISEs (mean integrated squared errors) of the estimates of both the dose-response curve F and its inverse F^(-1) achieve the optimal rate O(N^(-4/5)). Also, we compute the asymptotic distribution of the estimate of the effective dosage zeta_p = F^(-1)(p), which is shown to have an optimally small asymptotic variance.
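
    The building block being averaged is the isotonic (pool-adjacent-violators) MLE of a monotone dose-response curve. A minimal sketch with scikit-learn follows, inverting the fitted curve for an effective dose; the numbers are toy data, and the paper's subgroup averaging step is not included.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# toy dose-response data (doses, group sizes, event counts are illustrative)
dose = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
n = np.array([50, 50, 50, 50, 50])
events = np.array([2, 5, 11, 20, 34])

iso = IsotonicRegression(increasing=True)
F_hat = iso.fit_transform(dose, events / n, sample_weight=n)  # isotonic MLE of F

p = 0.5
ed50 = np.interp(p, F_hat, dose)  # invert the monotone fit for the effective dose
print(F_hat, round(ed50, 2))
```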

  11. Evolution of the CMB Power Spectrum Across WMAP Data Releases: A Nonparametric Analysis

    CERN Document Server

    Aghamousa, Amir; Souradeep, Tarun

    2011-01-01

    We present a comparative analysis of the WMAP 1-, 3-, 5-, and 7-year data releases for the CMB angular power spectrum, with respect to the following three key questions: (a) How well is the angular power spectrum determined by the data alone? (b) How well is the Lambda-CDM model supported by a model-independent, data-driven analysis? (c) What are the realistic uncertainties on peak/dip locations and heights? Our analysis is based on a nonparametric function estimation methodology [1,2]. Our results show that the height of the power spectrum is well determined by data alone for multipole index l approximately less than 600 (1-year), 800 (3-year), and 900 (5- and 7-year data realizations). We also show that parametric fits based on the Lambda-CDM model are remarkably close to our nonparametric fit in l-regions where the data are sufficiently precise. A contrasting example is provided by an H-Lambda-CDM model: As the data become precise with successive data realizations, the H-Lambda-CDM angular power spectrum g...

  12. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  13. Bayesian nonparametric regression analysis of data with random effects covariates from longitudinal measurements.

    Science.gov (United States)

    Ryu, Duchwan; Li, Erning; Mallick, Bani K

    2011-06-01

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves.

  14. Towards Nonstationary, Nonparametric Independent Process Analysis with Unknown Source Component Dimensions

    CERN Document Server

    Szabo, Zoltan

    2010-01-01

    The goal of this paper is to extend independent subspace analysis (ISA) to the case of (i) nonparametric, not strictly stationary source dynamics and (ii) unknown source component dimensions. We make use of functional autoregressive (fAR) processes to model the temporal evolution of the hidden sources. An extension of the ISA separation principle--which states that the ISA problem can be solved by traditional independent component analysis (ICA) and clustering of the ICA elements--is derived for the solution of the defined fAR independent process analysis task (fAR-IPA): applying fAR identification we reduce the problem to ISA. A local averaging approach, the Nadaraya-Watson kernel regression technique is adapted to obtain strongly consistent fAR estimation. We extend the Amari-index to different dimensional components and illustrate the efficiency of the fAR-IPA approach by numerical examples.

  15. A Level Set Analysis and A Nonparametric Regression on S&P 500 Daily Return

    Directory of Open Access Journals (Sweden)

    Yipeng Yang

    2016-02-01

    In this paper, a level set analysis is proposed which aims to analyze the S&P 500 return with a certain magnitude. It is found that the process of large jumps/drops of return tends to have negative serial correlation, and the volatility clustering phenomenon can easily be seen. Then, a nonparametric analysis is performed and new patterns are discovered. An ARCH model is constructed based on the patterns we discovered, and it is capable of manifesting the volatility skew in option pricing. A comparison of our model with the GARCH(1,1) model is carried out. An explanation of the validity of our model through prospect theory is provided and, as a novelty, we link the volatility skew phenomenon to prospect theory in behavioral finance.

  16. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    Science.gov (United States)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than a design level of flood control. Traditional parametric frequency analysis methods for extreme values include the following steps: Step 1: collecting and checking extreme-value data; Step 2: enumerating probability distributions that would fit the data well; Step 3: parameter estimation; Step 4: testing goodness of fit; Step 5: checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. The bootstrap resampling can correct bias for the NPM and can also give the estimation accuracy as the bootstrap standard error. The NPM has the advantage of avoiding various difficulties in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable. Probable maximum precipitation (PMP) and probable maximum flood (PMF) can be a new parameter value combined with the NPM. An idea of how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians, and encourages practitioners to consider the worst cases of disasters in their disaster management planning and practices.
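
    The resampling step is straightforward to sketch: bootstrap the empirical T-year quantile and report its standard error. This is a generic illustration of the CIS verification the abstract describes; the probable-maximum upper-bound adjustment is not included, and the Gumbel toy series is an assumption.

```python
import numpy as np

def bootstrap_quantile_se(sample, T=100, B=2000, seed=0):
    """Empirical T-year quantile (non-exceedance prob 1 - 1/T) and its
    bootstrap standard error."""
    rng = np.random.default_rng(seed)
    q = 1 - 1 / T
    est = np.quantile(sample, q)
    reps = [np.quantile(rng.choice(sample, size=len(sample), replace=True), q)
            for _ in range(B)]
    return est, np.std(reps, ddof=1)

annual_max = np.random.default_rng(3).gumbel(loc=100, scale=30, size=120)  # toy series
print(bootstrap_quantile_se(annual_max))
```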

  17. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Science.gov (United States)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.

  18. Sparse Empirical Bayes Analysis (SEBA)

    CERN Document Server

    Bochkina, Natalia

    2009-01-01

    We consider a joint processing of $n$ independent sparse regression problems. Each is based on a sample $(y_{i1},x_{i1}),\ldots,(y_{im},x_{im})$ of $m$ i.i.d. observations from $y_{i1}=x_{i1}^{\top}\beta_i+\varepsilon_{i1}$, $y_{i1}\in\mathbb{R}$, $x_{i1}\in\mathbb{R}^p$, $i=1,\ldots,n$, and $\varepsilon_{i1}\sim N(0,\sigma^2)$, say. $p$ is large enough that the empirical risk minimizer is not consistent. We consider three possible extensions of the lasso estimator to deal with this problem, the lassoes, the group lasso and the RING lasso, each utilizing a different assumption about how these problems are related. For each estimator we give a Bayesian interpretation, and we present both persistency analysis and non-asymptotic error bounds based on restricted eigenvalue-type assumptions.
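
    For a single one of the $n$ problems, the ordinary lasso baseline can be sketched in R with the glmnet package (the lassoes, group lasso and RING lasso variants of the paper are not implemented here; data and dimensions are simulated):

        library(glmnet)
        set.seed(1)
        m <- 50; p <- 200                        # m << p, so the empirical risk minimizer fails
        x <- matrix(rnorm(m * p), m, p)
        beta <- c(rep(2, 5), rep(0, p - 5))      # sparse true coefficient vector
        y <- drop(x %*% beta + rnorm(m))
        cvfit <- cv.glmnet(x, y)                 # lasso path with cross-validated penalty
        coef(cvfit, s = "lambda.min")[1:8, ]     # leading recovered coefficients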

  19. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test...

  20. Trend Analysis of Golestan's Rivers Discharges Using Parametric and Non-parametric Methods

    Science.gov (United States)

    Mosaedi, Abolfazl; Kouhestani, Nasrin

    2010-05-01

    One of the major problems in human life is climate change and its consequences. Climate change will cause changes in river discharges. The aim of this research is the trend analysis of seasonal and yearly river discharges of Golestan province (Iran). In this research four trend analysis methods, including conjunction point, linear regression, Wald-Wolfowitz and Mann-Kendall, were applied to river discharges in seasonal and annual periods at significance levels of 95% and 99%. First, daily discharge data of 12 hydrometric stations with a length of 42 years (1965-2007) were selected; after some common statistical tests, such as homogeneity tests (the G-B and M-W tests), the four mentioned trend analysis tests were applied. Results show that in all stations, for the summer time series, there are decreasing trends at a significance level of 99% according to the Mann-Kendall (M-K) test. For the autumn time series, all four methods give similar results. For the other periods, the results of the four tests were more or less similar, although for some stations the results differed. Keywords: Trend Analysis, Discharge, Non-parametric methods, Wald-Wolfowitz, The Mann-Kendall test, Golestan Province.
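
    The Mann-Kendall test is equivalent to testing Kendall's tau between the series and time, so it can be sketched in base R without extra packages (the discharge series below is simulated with an imposed downward trend, not station data):

        set.seed(1)
        yrs <- 1965:2007
        q_summer <- 100 - 0.8 * (yrs - 1965) + rnorm(length(yrs), sd = 8)  # simulated discharges
        cor.test(q_summer, yrs, method = "kendall")  # two-sided Mann-Kendall trend test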

  1. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of the data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of the related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
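
    The K-M approach for left-censored concentrations can be sketched in R by negating the data so that standard right-censored survival machinery applies (the survival package is used here as a stand-in for the S-language routines described; the concentrations and detection limits are hypothetical):

        library(survival)
        conc     <- c(0.5, 1.2, 0.8, 3.0, 0.3, 2.1)           # measured value or detection limit
        censored <- c(TRUE, FALSE, TRUE, FALSE, TRUE, FALSE)  # TRUE = below detection limit
        fit <- survfit(Surv(-conc, !censored) ~ 1)  # negation turns left- into right-censoring
        summary(fit)                                # K-M estimate on -conc; negate back for the ECDF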

  2. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test rejects both the Cobb-Douglas and the Translog functional form, while a recently developed nonparametric kernel regression method with a fully nonparametric panel data specification delivers plausible results. On average, the nonparametric regression results are similar to results that are obtained from...

  3. APPLICATION OF PARAMETRIC AND NON-PARAMETRIC BENCHMARKING METHODS IN COST EFFICIENCY ANALYSIS OF THE ELECTRICITY DISTRIBUTION SECTOR

    Directory of Open Access Journals (Sweden)

    Andrea Furková

    2007-06-01

    Full Text Available This paper explores the application of parametric and non-parametric benchmarking methods in measuring the cost efficiency of Slovak and Czech electricity distribution companies. We compare the relative cost efficiency of Slovak and Czech distribution companies using two benchmarking methods: the non-parametric Data Envelopment Analysis (DEA) and, as the parametric approach, Stochastic Frontier Analysis (SFA). The first part of the analysis was based on DEA models; traditional cross-section CCR and BCC models were modified for cost efficiency estimation. In further analysis we focus on two versions of the stochastic frontier cost function using panel data: the MLE model and the GLS model. These models have been applied to an unbalanced panel of 11 regional electricity distribution utilities (3 in Slovakia and 8 in the Czech Republic) over the period from 2000 to 2004. The differences in estimated scores, parameters and rankings of utilities were analyzed. We observed significant differences between the parametric methods and the DEA approach.

  4. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    Science.gov (United States)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results support the fact that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.

  5. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distribution, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study aims to introduce an imputation procedure for completing earthquake catalog data that will allow the catalog to be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate earthquake magnitude distribution in Tehran, the capital city of Iran.

  6. Analysis of intravenous glucose tolerance test data using parametric and nonparametric modeling: application to a population at risk for diabetes.

    Science.gov (United States)

    Marmarelis, Vasilis Z; Shin, Dae C; Zhang, Yaping; Kautzky-Willer, Alexandra; Pacini, Giovanni; D'Argenio, David Z

    2013-07-01

    Modeling studies of the insulin-glucose relationship have mainly utilized parametric models, most notably the minimal model (MM) of glucose disappearance. This article presents results from the comparative analysis of the parametric MM and a nonparametric Laguerre-based Volterra Model (LVM) applied to the analysis of insulin-modified (IM) intravenous glucose tolerance test (IVGTT) data from a clinical study of gestational diabetes mellitus (GDM). An IM IVGTT study was performed 8 to 10 weeks postpartum in 125 women who were diagnosed with GDM during their pregnancy [population at risk of developing diabetes (PRD)] and in 39 control women with normal pregnancies (control subjects). The measured plasma glucose and insulin from the IM IVGTT in each group were analyzed via a population analysis approach to estimate the insulin sensitivity parameter of the parametric MM. In the nonparametric LVM analysis, the glucose and insulin data were used to calculate the first-order kernel, from which a diagnostic scalar index representing the integrated effect of insulin on glucose was derived. Both the parametric MM and the nonparametric LVM describe the glucose concentration data in each group with good fidelity, with an improved measured versus predicted r² value for the LVM of 0.99 versus 0.97 for the MM analysis in the PRD. However, application of the respective diagnostic indices of the two methods does result in a different classification of 20% of the individuals in the PRD. It was found that the data-based nonparametric LVM revealed additional insights about the manner in which infused insulin affects blood glucose concentration. © 2013 Diabetes Technology Society.

  7. Semi- and Nonparametric ARCH Processes

    Directory of Open Access Journals (Sweden)

    Oliver B. Linton

    2011-01-01

    Full Text Available ARCH/GARCH modelling has been successfully applied in empirical finance for many years. This paper surveys the semiparametric and nonparametric methods in univariate and multivariate ARCH/GARCH models. First, we introduce some specific semiparametric models and investigate the semiparametric and nonparametric estimation techniques applied to: the error density, the functional form of the volatility function, the relationship between mean and variance, long memory processes, locally stationary processes, continuous time processes and multivariate models. The second part of the paper covers the general properties of such processes, including stationarity, ergodicity and mixing conditions. The last part is on estimation methods in ARCH/GARCH processes.

  8. An exact predictive recursion for Bayesian nonparametric analysis of incomplete data

    OpenAIRE

    Garibaldi, Ubaldo; Viarengo, Paolo

    2010-01-01

    This paper presents a new derivation of nonparametric distribution estimation with right-censored data. It is based on an extension of the predictive inferences to compound evidence. The estimate is recursive and exact, and no stochastic approximation is needed: it simply requires that the censored data are processed in decreasing order. Only in this case does the recursion provide exact posterior predictive distributions for subsequent samples under a Dirichlet process prior. The resulting estim...

  9. Empirical direction in design and analysis

    CERN Document Server

    Anderson, Norman H

    2001-01-01

    The goal of Norman H. Anderson's new book is to help students develop skills of scientific inference. To accomplish this he organized the book around the ""Experimental Pyramid""--six levels that represent a hierarchy of considerations in empirical investigation--conceptual framework, phenomena, behavior, measurement, design, and statistical inference. To facilitate conceptual and empirical understanding, Anderson de-emphasizes computational formulas and null hypothesis testing. Other features include: *emphasis on visual inspection as a basic skill in experimental analysis to help student

  10. CURRENT STATUS OF NONPARAMETRIC STATISTICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-02-01

    Full Text Available Nonparametric statistics is one of the five growth points of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undeveloped. The purpose of this article is to consider its division into areas based on the existing practice of scientific activity, to delimit nonparametric statistics, and to classify investigations on nonparametric statistical methods. Nonparametric statistics allows statistical inference, in particular estimating the characteristics of a distribution and testing statistical hypotheses, without weakly justified assumptions that the distribution function of the samples belongs to a particular parametric family. For example, there is a widespread belief that statistical data often follow a normal distribution; meanwhile, analysis of observational results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from the normal one. Uncritical use of the hypothesis of normality often leads to significant errors in areas such as the rejection of outlying observations (outliers), statistical quality control, and other cases. Therefore, it is advisable to use nonparametric methods, in which only weak requirements are imposed on the distribution functions of the observations; usually only their continuity is assumed. On the basis of a generalization of numerous studies it can be stated that, to date, nonparametric methods can solve almost the same range of tasks previously solved by parametric methods. Certain statements in the literature, claiming that nonparametric methods have less power or require larger sample sizes than parametric methods, are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, a number of unresolved problems remain.
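
    The power claim can be checked with a small simulation in base R: for skewed (here lognormal) data with a multiplicative shift, the Wilcoxon rank-sum test can match or beat the t-test (all settings are hypothetical):

        set.seed(1)
        pw <- replicate(2000, {
          x <- rlnorm(25); y <- rlnorm(25) * 1.5   # skewed samples, shifted alternative
          c(t = t.test(x, y)$p.value, w = wilcox.test(x, y)$p.value)
        })
        rowMeans(pw < 0.05)   # empirical power of the t-test and the Wilcoxon test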

  11. Empirical Bayes analysis of single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ickstadt Katja

    2008-03-01

    Full Text Available Abstract Background An important goal of whole-genome studies concerned with single nucleotide polymorphisms (SNPs) is the identification of SNPs associated with a covariate of interest such as the case-control status or the type of cancer. Since these studies often comprise the genotypes of hundreds of thousands of SNPs, methods are required that can cope with the corresponding multiple testing problem. For the analysis of gene expression data, approaches such as the empirical Bayes analysis of microarrays have been developed, particularly for the detection of genes associated with the response. However, the empirical Bayes analysis of microarrays has only been suggested for binary responses when considering expression values, i.e. continuous predictors. Results In this paper, we propose a modification of this empirical Bayes analysis that can be used to analyze high-dimensional categorical SNP data. This approach, along with a generalized version of the original empirical Bayes method, is available in the R package siggenes version 1.10.0 and later that can be downloaded from http://www.bioconductor.org. Conclusion As applications to two subsets of the HapMap data show, the empirical Bayes analysis of microarrays can not only be used to analyze continuous gene expression data, but can also be applied to categorical SNP data, where the response is not restricted to be binary. In association studies in which typically several ten to a few hundred SNPs are considered, our approach can furthermore be employed to test interactions of SNPs. Moreover, the posterior probabilities resulting from the empirical Bayes analysis of (prespecified) interactions/genotypes can also be used to quantify the importance of these interactions.

  12. Nonparametric variance estimation in the analysis of microarray data: a measurement error approach.

    Science.gov (United States)

    Carroll, Raymond J; Wang, Yuedong

    2008-01-01

    This article investigates the effects of measurement error on the estimation of nonparametric variance functions. We show that either ignoring measurement error or direct application of the simulation extrapolation, SIMEX, method leads to inconsistent estimators. Nevertheless, the direct SIMEX method can reduce bias relative to a naive estimator. We further propose a permutation SIMEX method which leads to consistent estimators in theory. The performance of both SIMEX methods depends on approximations to the exact extrapolants. Simulations show that both SIMEX methods perform better than ignoring measurement error. The methodology is illustrated using microarray data from colon cancer patients.
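
    The SIMEX idea of the article can be sketched in R for the simpler case of a regression slope attenuated by measurement error in x: refit under increasing amounts of added noise and extrapolate back to zero error (all data are simulated; the article's target is variance functions, and its permutation variant is not shown):

        set.seed(1)
        n <- 500; sig_u <- 0.5
        x <- rnorm(n); w <- x + rnorm(n, sd = sig_u)   # w is the error-prone measurement of x
        y <- 2 * x + rnorm(n)
        lam <- c(0, 0.5, 1, 1.5, 2)                    # multiples of extra error variance
        slopes <- sapply(lam, function(l)
          mean(replicate(50, coef(lm(y ~ I(w + rnorm(n, sd = sqrt(l) * sig_u))))[2])))
        ex <- lm(slopes ~ lam + I(lam^2))              # quadratic extrapolant in lambda
        sum(coef(ex) * c(1, -1, 1))                    # SIMEX estimate at lambda = -1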

  13. Non-parametric trend analysis of water quality data of rivers in Kansas

    Science.gov (United States)

    Yu, Y.-S.; Zou, S.; Whittemore, D.

    1993-01-01

    Surface water quality data for 15 sampling stations in the Arkansas, Verdigris, Neosho, and Walnut river basins inside the state of Kansas were analyzed to detect trends (or lack of trends) in 17 major constituents by using four different non-parametric methods. The results show that concentrations of specific conductance, total dissolved solids, calcium, total hardness, sodium, potassium, alkalinity, sulfate, chloride, total phosphorus, ammonia plus organic nitrogen, and suspended sediment generally have downward trends. Some of the downward trends are related to increases in discharge, while others could be caused by decreases in pollution sources. Homogeneity tests show that both station-wide trends and basinwide trends are non-homogeneous. © 1993.

  14. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population.

  15. NParCov3: A SAS/IML Macro for Nonparametric Randomization-Based Analysis of Covariance

    Directory of Open Access Journals (Sweden)

    Richard C. Zink

    2012-07-01

    Full Text Available Analysis of covariance serves two important purposes in a randomized clinical trial. First, there is a reduction of variance for the treatment effect, which provides more powerful statistical tests and more precise confidence intervals. Second, it provides estimates of the treatment effect which are adjusted for random imbalances of covariates between the treatment groups. The nonparametric analysis of covariance method of Koch, Tangen, Jung, and Amara (1998) defines a very general methodology using weighted least squares to generate covariate-adjusted treatment effects with minimal assumptions. This methodology is general in its applicability to a variety of outcomes, whether continuous, binary, ordinal, incidence density or time-to-event. Further, its use has been illustrated in many clinical trial settings, such as multi-center, dose-response and non-inferiority trials. NParCov3 is a SAS/IML macro written to conduct the nonparametric randomization-based covariance analyses of Koch et al. (1998). The software can analyze a variety of outcomes and can account for stratification. Data from multiple clinical trials are used for illustration.

  16. The problem analysis for empirical studies

    NARCIS (Netherlands)

    Groenland, E.A.G.

    2014-01-01

    This article proposes a systematic methodology for the development of a problem analysis for cross-sectional, empirical research. This methodology is referred to as the 'Annabel approach'. It is suitable both for academic studies and applied (business) studies. In addition it can be used for both

  18. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  19. Nonparametric analysis of competing risks data with event category missing at random.

    Science.gov (United States)

    Gouskova, Natalia A; Lin, Feng-Chang; Fine, Jason P

    2017-03-01

    In the competing risks setup, the data for each subject consist of the event time, censoring indicator, and event category. However, sometimes the information about the event category can be missing, as, for example, in a case when the date of death is known but the cause of death is not available. In such situations, treating subjects with missing event category as censored leads to the underestimation of the hazard functions. We suggest nonparametric estimators for the cumulative cause-specific hazards and the cumulative incidence functions which use the Nadaraya-Watson estimator to obtain the contribution of an event with missing category to each of the cause-specific hazards. We derive the properties of the proposed estimators. The optimal bandwidth is determined, which minimizes the mean integrated squared error of the proposed estimators over time. The methodology is illustrated using data on lung infections in patients from the United States Cystic Fibrosis Foundation Patient Registry. © 2016, The International Biometric Society.
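
    The Nadaraya-Watson step can be sketched with base R's ksmooth: among complete cases, smooth the cause indicator against event time to estimate the probability that an event with missing category belongs to a given cause (the data, bandwidth and evaluation points below are hypothetical):

        set.seed(1)
        t_obs  <- runif(300, 0, 10)                        # event times with known category
        cause1 <- rbinom(300, 1, plogis(0.5 * t_obs - 2))  # indicator of cause 1
        p_hat <- ksmooth(t_obs, cause1, kernel = "normal",
                         bandwidth = 1.5, x.points = c(2, 5, 8))
        p_hat$y   # estimated P(cause 1 | time) at times of missing-category events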

  20. Nonparametric Signal Extraction and Measurement Error in the Analysis of Electroencephalographic Activity During Sleep.

    Science.gov (United States)

    Crainiceanu, Ciprian M; Caffo, Brian S; Di, Chong-Zhi; Punjabi, Naresh M

    2009-06-01

    We introduce methods for signal and associated variability estimation based on hierarchical nonparametric smoothing with application to the Sleep Heart Health Study (SHHS). SHHS is the largest electroencephalographic (EEG) collection of sleep-related data, which contains, at each visit, two quasi-continuous EEG signals for each subject. The signal features extracted from EEG data are then used in second level analyses to investigate the relation between health, behavioral, or biometric outcomes and sleep. Using subject specific signals estimated with known variability in a second level regression becomes a nonstandard measurement error problem. We propose and implement methods that take into account cross-sectional and longitudinal measurement error. The research presented here forms the basis for EEG signal processing for the SHHS.

  1. Nonparametric analysis of the time structure of seismicity in a geographic region

    Directory of Open Access Journals (Sweden)

    A. Quintela-del-Río

    2002-06-01

    Full Text Available As an alternative to traditional parametric approaches, we suggest nonparametric methods for analyzing temporal data on earthquake occurrences. In particular, kernel methods for estimating the hazard function and the intensity function are presented. One novelty of our approach is that we take into account the possible dependence of the data when estimating the distribution of time intervals between earthquakes, which has not been considered in most statistical studies on seismicity. Kernel estimation of the hazard function has been used to study the occurrence process of cluster centers (main shocks). Kernel intensity estimation, on the other hand, has helped to describe the occurrence process of cluster members (aftershocks). Similar studies in two geographic areas of Spain (Granada and Galicia) have been carried out to illustrate the suggested estimation methods.
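
    Kernel intensity estimation for event times can be sketched in base R by scaling a kernel density of the occurrence times by the number of events (the catalog below is simulated, and the bandwidth is a default rule of thumb rather than a seismologically motivated choice):

        set.seed(1)
        times <- sort(runif(200, 0, 365))     # simulated event days within one year
        d <- density(times, bw = "nrd0")      # kernel density of occurrence times
        intensity <- d$y * length(times)      # estimated events per day along d$x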

  2. Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates

    Directory of Open Access Journals (Sweden)

    Saeed Banihashemi

    2015-12-01

    Full Text Available In line with the growing global trend toward energy efficiency in buildings, this paper aims, first, to investigate the energy performance of double-glazed windows in different climates and, second, to analyze the most commonly used parametric and non-parametric tests for dimension reduction when simulating this component. A four-story building representing the conventional type of residential apartment was selected for simulation in four climates: cold, temperate, hot-arid and hot-humid. Ten variables constituted the parameters considered in the calculation of the cooling and heating loads of the case: U-factor, SHGC, emissivity, visible transmittance, monthly average dry bulb temperature, monthly average percent humidity, monthly average wind speed, monthly average direct solar radiation, monthly average diffuse solar radiation, and orientation. Design of Experiments and Principal Component Analysis methods were applied to find the most significant factors and to reduce the dimension of the initial variables. It was observed that in the temperate and hot-arid climates, using double-glazed windows was beneficial in both cold and hot months, whereas in the cold and hot-humid climates, where heating and cooling loads respectively dominate, they were advantageous only in those dominant months. Furthermore, an inconsistency was revealed between the parametric and non-parametric tests in terms of identifying the most significant variables.

  3. 'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.

    Science.gov (United States)

    Blume, Christine; Santhi, Nayantara; Schabus, Manuel

    2016-01-01

    For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R [6] that allows computing several non-parametric measures from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span, termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN). • The package 'nparACT' for R serves the non-parametric analysis of actigraphy data. • Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA), as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).

  4. Bootstrap Estimation for Nonparametric Efficiency Estimates

    OpenAIRE

    1995-01-01

    This paper develops a consistent bootstrap estimation procedure to obtain confidence intervals for nonparametric measures of productive efficiency. Although the methodology is illustrated in terms of technical efficiency measured by output distance functions, the technique can be easily extended to other consistent nonparametric frontier models. Variation in estimated efficiency scores is assumed to result from variation in empirical approximations to the true boundary of the production set. ...

  5. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency

  6. Nonparametric randomization-based covariate adjustment for stratified analysis of time-to-event or dichotomous outcomes.

    Science.gov (United States)

    Hussey, Michael A; Koch, Gary G; Preisser, John S; Saville, Benjamin R

    2016-01-01

    Time-to-event or dichotomous outcomes in randomized clinical trials often have analyses using the Cox proportional hazards model or conditional logistic regression, respectively, to obtain covariate-adjusted log hazard (or odds) ratios. Nonparametric Randomization-Based Analysis of Covariance (NPANCOVA) can be applied to unadjusted log hazard (or odds) ratios estimated from a model containing treatment as the only explanatory variable. These adjusted estimates are stratified population-averaged treatment effects and only require a valid randomization to the two treatment groups and avoid key modeling assumptions (e.g., proportional hazards in the case of a Cox model) for the adjustment variables. The methodology has application in the regulatory environment where such assumptions cannot be verified a priori. Application of the methodology is illustrated through three examples on real data from two randomized trials.

  7. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    CERN Document Server

    Ford, Eric B; Steffen, Jason H; Carter, Joshua A; Fressin, Francois; Holman, Matthew J; Lissauer, Jack J; Moorhead, Althea V; Morehead, Robert C; Ragozzine, Darin; Rowe, Jason F; Welsh, William F; Allen, Christopher; Batalha, Natalie M; Borucki, William J; Bryson, Stephen T; Buchhave, Lars A; Burke, Christopher J; Caldwell, Douglas A; Charbonneau, David; Clarke, Bruce D; Cochran, William D; Désert, Jean-Michel; Endl, Michael; Everett, Mark E; Fischer, Debra A; Gautier, Thomas N; Gilliland, Ron L; Jenkins, Jon M; Haas, Michael R; Horch, Elliott; Howell, Steve B; Ibrahim, Khadeejah A; Isaacson, Howard; Koch, David G; Latham, David W; Li, Jie; Lucas, Philip; MacQueen, Phillip J; Marcy, Geoffrey W; McCauliff, Sean; Mullally, Fergal R; Quinn, Samuel N; Quintana, Elisa; Shporer, Avi; Still, Martin; Tenenbaum, Peter; Thompson, Susan E; Torres, Guillermo; Twicken, Joseph D; Wohler, Bill

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:...

  8. Applying a non-parametric efficiency analysis to measure conversion efficiency in Great Britain

    NARCIS (Netherlands)

    Binder, M.; Broekel, T.

    2011-01-01

    In the literature on Sen's capability approach, studies focusing on the empirical measurement of conversion factors are comparatively rare. We add to this field by adopting a measure of 'conversion efficiency' that captures the efficiency with which individuals convert their resources into achieved

  10. A Non-Parametric and Entropy Based Analysis of the Relationship between the VIX and S&P 500

    Directory of Open Access Journals (Sweden)

    Abhay K. Singh

    2013-10-01

    Full Text Available This paper features an analysis of the relationship between the S&P 500 Index and the VIX using daily data obtained from the CBOE website and SIRCA (The Securities Industry Research Centre of the Asia Pacific). We explore the relationship between the S&P 500 daily return series and a similar series for the VIX in terms of a long sample drawn from the CBOE from 1990 to mid 2011 and a set of returns from SIRCA's TRTH datasets from March 2005 to date. This shorter sample, which captures the behavior of the new VIX introduced in 2003, is divided into four sub-samples which permit exploration of the impact of the Global Financial Crisis. We apply a series of non-parametric tests utilizing entropy-based metrics. These suggest that the PDFs and CDFs of the two return distributions change shape in the various subsample periods. The entropy and MI statistics suggest that the degree of uncertainty attached to these distributions changes through time and that, with the S&P 500 return as the dependent variable, the amount of information obtained from the VIX changes with time, reaching a relative maximum in the most recent period from 2011 to 2012. The entropy-based non-parametric tests of the equivalence of the two distributions and of their symmetry all strongly reject their respective nulls. The results suggest that parametric techniques do not adequately capture the complexities displayed in the behavior of these series. This has practical implications for hedging utilizing derivatives written on the VIX.
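
    A histogram-based sketch of the mutual information between two return series in base R (the series are simulated with a negative dependence standing in for the S&P 500/VIX relationship; the bin count is an arbitrary choice):

        set.seed(1)
        x <- rnorm(2000); y <- -0.7 * x + rnorm(2000, sd = 0.7)   # negatively dependent returns
        p  <- table(cut(x, 10), cut(y, 10)) / length(x)           # joint cell probabilities
        px <- rowSums(p); py <- colSums(p)
        sum(p * log(p / outer(px, py)), na.rm = TRUE)             # mutual information in nats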

  11. Productivity improvement in Korean rice farming: parametric and non-parametric analysis

    OpenAIRE

    Kwon, Oh Sang; Lee, Hyunok

    2004-01-01

    The published empirical literature on frontier production functions is dominated by two broadly defined estimation approaches: parametric and non-parametric. Using panel data on Korean rice production, parametric and non-parametric production frontiers are estimated and the resulting productivity estimates are compared. The non-parametric approach employs two alternative measures based on the Malmquist index and the Luenberger indicator, while the parametric approach is closely related to the time-varian...

  12. An Empirical Analysis of Humanitarian Warehouse Locations

    Directory of Open Access Journals (Sweden)

    Sander de Leeuw

    2016-06-01

    Full Text Available The purpose of this paper is to empirically verify characteristics of current warehouse locations of humanitarian organizations (based on public information) and to relate those to the model developed by Richardson, de Leeuw and Dullaert (2016). This paper is based on desk research. Public data such as (annual) reports and databases are used to determine the features of the locations in empirical terms. We find that a significant proportion of our sample co-locates their products at UNHRD premises. This suggests that organizations prefer to cluster their warehouse activities, particularly when there is no fee involved for using the warehouse (as is the case in the UNHRD network). The geographic map of the current warehouses, together with the quantified location factors, provides an overview of the current warehouse locations. We found that the characteristics of the current warehouse locations are aligned with the literature on location selection factors. Current locations can be characterized by infrastructure characteristics (in particular closeness to an airport and safety concerns) and by the low occurrence of disasters. Other factors that were considered by us but were not supported by empirical evidence were labor quality and availability as well as the political environment. In our study we were only able to use a limited sample of warehouses. We also focused our research on countries where two or more organizations have their warehouses located. We did not account for warehouse sizes or the kinds of products stored in our analysis.

  13. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference. -Eugenia Stoimenova, Journal of Applied Statistics, June 2012. … one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students. -Biometrics, 67, September 2011. This excellently presente...

  14. GPU-accelerated nonparametric kinetic analysis of DCE-MRI data from glioblastoma patients treated with bevacizumab.

    Science.gov (United States)

    Hsu, Yu-Han H; Ferl, Gregory Z; Ng, Chee M

    2013-05-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is often used to examine vascular function in malignant tumors and noninvasively monitor drug efficacy of antivascular therapies in clinical studies. However, complex numerical methods used to derive tumor physiological properties from DCE-MRI images can be time-consuming and computationally challenging. Recent advancement of computing technology in graphics processing unit (GPU) makes it possible to build an energy-efficient and high-power parallel computing platform for solving complex numerical problems. This study develops the first reported fast GPU-based method for nonparametric kinetic analysis of DCE-MRI data using clinical scans of glioblastoma patients treated with bevacizumab (Avastin®). In the method, contrast agent concentration-time profiles in arterial blood and tumor tissue are smoothed using a robust kernel-based regression algorithm in order to remove artifacts due to patient motion and then deconvolved to produce the impulse response function (IRF). The area under the curve (AUC) and mean residence time (MRT) of the IRF are calculated using statistical moment analysis, and two tumor physiological properties that relate to vascular permeability, volume transfer constant between blood plasma and extravascular extracellular space (K(trans)) and fractional interstitial volume (ve) are estimated using the approximations AUC/MRT and AUC. The most significant feature in this method is the use of GPU-computing to analyze data from more than 60,000 voxels in each DCE-MRI image in parallel fashion. All analysis steps have been automated in a single program script that requires only blood and tumor data as the sole input. The GPU-accelerated method produces K(trans) and ve estimates that are comparable to results from previous studies but reduces computational time by more than 80-fold compared to a previously reported central processing unit-based nonparametric method. Furthermore, it is at

  15. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process. We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.
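
    Pseudo-observations and the empirical copula at a point can be computed in a few lines of base R; a naive check of exchange symmetry C(u,v) = C(v,u) is shown on simulated data (the paper's formal test, based on the empirical copula process, is not reproduced here):

        set.seed(1)
        n <- 500
        x <- rnorm(n); y <- 0.6 * x + rnorm(n, sd = 0.8)
        u <- rank(x) / (n + 1); v <- rank(y) / (n + 1)   # pseudo-observations
        C <- function(a, b) mean(u <= a & v <= b)        # empirical copula
        C(0.3, 0.7) - C(0.7, 0.3)                        # approximately 0 under symmetry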

  16. An Empirical Analysis of the Budget Deficit

    Directory of Open Access Journals (Sweden)

    Ioan Talpos

    2007-11-01

    Full Text Available Economic policies and, particularly, fiscal policies are not designed and implemented in an "empty space": the structural characteristics of the economic systems, the institutional architecture of societies, the cultural paradigm and the power relations between different social groups define the borders of these policies. This paper tries to deal with these borders, describing their nature and the implications of their existence for the quality and impact of fiscal policies, at a theoretical level as well as an empirical one. The main results of the proposed analysis support the idea that the mentioned variables matter both for the social mandate entrusted by society to the state, and thus for the role and functions of the state, and for economic growth as supported by the resources collected and distributed by the public authorities.

  17. EMPIRICAL ANALYSIS OF SEASONALITY PATTERNS IN TOURISM

    Directory of Open Access Journals (Sweden)

    Biljana Petrevska

    2013-04-01

    Full Text Available The paper attempts to empirically investigate the presence of seasonality patterns in tourism. For that purpose, the case of Macedonia is elaborated using data on tourist arrivals for the period 1992-2012. The analysis is based on the Gini coefficient, one of the most commonly applied indicators for measuring and expressing inequalities caused by temporary disorders. The computed data reject the research hypothesis and highlight new facts regarding seasonality in tourism demand in Macedonia. Namely, the outcomes point to the conclusion that seasonality is absent, i.e. the concentration of tourism flows is not significant for tourism development. Hence, this study underlines that the modest tourism results to date must not be attributed to seasonality as a strong limiting factor for tourism development in Macedonia, since no such factor is present.

  18. Non-parametric analysis of infrared spectra for recognition of glass and glass ceramic fragments in recycling plants.

    Science.gov (United States)

    Farcomeni, Alessio; Serranti, Silvia; Bonifazi, Giuseppe

    2008-01-01

    Glass ceramic detection in glass recycling plants represents a still unsolved problem, as glass ceramic material looks like normal glass and is usually detected only by specialized personnel. The presence of glass-like contaminants inside waste glass products, resulting from both industrial and differentiated urban waste collection, increases production costs and reduces final product quality. In this paper an innovative approach for glass ceramic recognition, based on the non-parametric analysis of infrared spectra, is proposed and investigated. The work specifically addressed the spectral classification of glass and glass ceramic fragments collected in an actual recycling plant from three different production lines: flat glass, colored container-glass and white container-glass. The analyses, carried out in the near and mid-infrared (NIR-MIR) spectral range (1280-4480 nm), show that glass ceramic and glass fragments can be recognized by applying a wavelet transform, with a small classification error. Moreover, a method for selecting only a small subset of relevant wavelength ratios is suggested, allowing fast recognition of the two classes of materials. The results show how the proposed approach can be utilized to develop a classification engine to be integrated into a hardware and software sorting architecture for fast "on-line" ceramic glass recognition and separation.

  19. nparLD: An R Software Package for the Nonparametric Analysis of Longitudinal Data in Factorial Experiments

    Directory of Open Access Journals (Sweden)

    Kimihiro Noguchi

    2012-09-01

    Full Text Available Longitudinal data from factorial experiments frequently arise in various fields of study, ranging from medicine and biology to public policy and sociology. In most practical situations, the distribution of observed data is unknown and there may exist a number of atypical measurements and outliers. Hence, the use of parametric and semi-parametric procedures that impose restrictive distributional assumptions on observed longitudinal samples becomes questionable. This, in turn, has led to a substantial demand for statistical procedures that enable us to accurately and reliably analyze longitudinal measurements in factorial experiments with minimal conditions on the available data, and robust nonparametric methodology offering such a possibility becomes of particular practical importance. In this article, we introduce a new R package nparLD which provides statisticians and researchers from other disciplines easy and user-friendly access to the most up-to-date robust rank-based methods for the analysis of longitudinal data in factorial settings. We illustrate the implemented procedures by case studies from dentistry, biology, and medicine.

  20. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories.
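
    A pointwise 1D bootstrap CI band for a mean trajectory can be sketched in base R as below (the trajectories are simulated; note the paper's caution that such pointwise, 0D-style bands are biased for 1D inference unless a simultaneous correction such as RFT is applied):

        set.seed(1)
        n <- 20; tt <- seq(0, 1, length.out = 101)
        Y <- t(replicate(n, sin(2 * pi * tt) + rnorm(101, sd = 0.3)))    # n trajectories (n x 101)
        boot_means <- replicate(1000, colMeans(Y[sample(n, replace = TRUE), ]))
        band <- apply(boot_means, 1, quantile, probs = c(0.025, 0.975))  # 2 x 101 pointwise band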

  1. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2016-10-21

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice the analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex disease.

  2. The Efficiency Change of Italian Public Universities in the New Millennium: A Non-Parametric Analysis

    Science.gov (United States)

    Guccio, Calogero; Martorana, Marco Ferdinando; Mazza, Isidoro

    2017-01-01

    The paper assesses the evolution of efficiency of Italian public universities for the period 2000-2010. It aims at investigating whether their levels of efficiency showed signs of convergence, and if the well-known disparity between northern and southern regions decreased. For this purpose, we use a refinement of data envelopment analysis, namely…

  3. Non-parametric group-level statistics for source-resolved ERP analysis.

    Science.gov (United States)

    Lee, Clement; Miyakoshi, Makoto; Delorme, Arnaud; Cauwenberghs, Gert; Makeig, Scott

    2015-01-01

    We have developed a new statistical framework for group-level event-related potential (ERP) analysis in EEGLAB. The framework calculates the variance of scalp channel signals accounted for by the activity of homogeneous clusters of sources found by independent component analysis (ICA). When ICA data decomposition is performed on each subject's data separately, functionally equivalent ICs can be grouped into EEGLAB clusters. Here, we report a new addition (statPvaf) to the EEGLAB plug-in std_envtopo to enable inferential statistics on main effects and interactions in event related potentials (ERPs) of independent component (IC) processes at the group level. We demonstrate the use of the updated plug-in on simulated and actual EEG data.

  4. Nonparametric Bayesian Inference for Mean Residual Life Functions in Survival Analysis

    OpenAIRE

    Poynor, Valerie; Kottas, Athanasios

    2014-01-01

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life function which provides the expected remaining lifetime given that a subject has survived (i.e., is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the mean residual life function characterizes the sur...

  5. Empirical mode decomposition analysis for visual stylometry.

    Science.gov (United States)

    Hughes, James M; Mao, Dong; Rockmore, Daniel N; Wang, Yang; Wu, Qiang

    2012-11-01

    In this paper, we show how the tools of empirical mode decomposition (EMD) analysis can be applied to the problem of “visual stylometry,” generally defined as the development of quantitative tools for the measurement and comparisons of individual style in the visual arts. In particular, we introduce a new form of EMD analysis for images and show that it is possible to use its output as the basis for the construction of effective support vector machine (SVM)-based stylometric classifiers. We present the methodology and then test it on collections of two sets of digital captures of drawings: a set of authentic and well-known imitations of works attributed to the great Flemish artist Pieter Bruegel the Elder (1525-1569) and a set of works attributed to Dutch master Rembrandt van Rijn (1606-1669) and his pupils. Our positive results indicate that EMD-based methods may hold promise generally as a technique for visual stylometry.

  6. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
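
    The paper's exact correlation statistic is not given in this record, so the sketch below shows the generic nonparametric idea: assess the significance of (anti-)correlated TTV series from two candidates by permuting one series to build a null distribution. The matched-epoch arrays and noise level are illustrative assumptions.

      import numpy as np
      from scipy.stats import spearmanr

      def ttv_correlation_pvalue(ttv1, ttv2, n_perm=10000, seed=0):
          # Two-sided permutation p-value for the rank correlation of two TTV series
          rng = np.random.default_rng(seed)
          obs = spearmanr(ttv1, ttv2)[0]
          null = np.array([spearmanr(rng.permutation(ttv1), ttv2)[0]
                           for _ in range(n_perm)])
          return float(np.mean(np.abs(null) >= abs(obs)))

      rng = np.random.default_rng(1)
      signal = np.sin(np.linspace(0, 6 * np.pi, 40))
      ttv_a = signal + 0.3 * rng.normal(size=40)    # interacting planets often show
      ttv_b = -signal + 0.3 * rng.normal(size=40)   # anti-correlated timing variations
      print("p =", ttv_correlation_pvalue(ttv_a, ttv_b))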

  7. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    [Abstract garbled in extraction. Recoverable fragments indicate a report on nonparametric quantal response (QR) modeling: spline-based nonparametric models of the relation between stimulus and probability of response, contrasted with a Generalized Linear Model approach that does not assume a limit distribution but allows arbitrary functional forms; appendices cover the Linear Model and the Generalized Linear Model. Approved for public release; distribution is unlimited.]

  8. A Parcellation Based Nonparametric Algorithm for Independent Component Analysis with Application to fMRI Data

    Directory of Open Access Journals (Sweden)

    Shanshan eLi

    2016-01-01

    Full Text Available Independent Component Analysis (ICA) is a widely used technique for separating signals that have been mixed together. In this manuscript, we propose a novel ICA algorithm using density estimation and maximum likelihood, where the densities of the signals are estimated via p-spline based histogram smoothing and the mixing matrix is simultaneously estimated using an optimization algorithm. The algorithm is exceedingly simple, easy to implement and blind to the underlying distributions of the source signals. To relax the identically distributed assumption in the density function, a modified algorithm is proposed to allow for different density functions on different regions. The performance of the proposed algorithm is evaluated in different simulation settings. For illustration, the algorithm is applied to a research investigation with a large collection of resting state fMRI datasets. The results show that the algorithm successfully recovers the established brain networks.
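
    The p-spline density-estimation algorithm proposed in this record is not reproduced here; as a baseline illustrating the same blind source separation setting, the sketch below runs scikit-learn's standard FastICA on a simulated two-source mixture.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2)
      t = np.linspace(0, 8, 2000)
      sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # two independent sources
      mixed = sources @ rng.normal(size=(2, 2)).T             # unknown linear mixing

      # Standard FastICA baseline: recovers the sources up to order and scale
      recovered = FastICA(n_components=2, random_state=0).fit_transform(mixed)
      print(recovered.shape)  # (2000, 2)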

  9. Adaptive Kernel Canonical Correlation Analysis Algorithms for Nonparametric Identification of Wiener and Hammerstein Systems

    Directory of Open Access Journals (Sweden)

    Ignacio Santamaría

    2008-04-01

    Full Text Available This paper treats the identification of nonlinear systems that consist of a cascade of a linear channel and a nonlinearity, such as the well-known Wiener and Hammerstein systems. In particular, we follow a supervised identification approach that simultaneously identifies both parts of the nonlinear system. Given the correct restrictions on the identification problem, we show how kernel canonical correlation analysis (KCCA) emerges as the logical solution to this problem. We then extend the proposed identification algorithm to an adaptive version allowing it to deal with time-varying systems. In order to avoid overfitting problems, we discuss and compare three possible regularization techniques for both the batch and the adaptive versions of the proposed algorithm. Simulations are included to demonstrate the effectiveness of the presented algorithm.

  10. Introduction to nonparametric statistics for the biological sciences using R

    CERN Document Server

    MacFarland, Thomas W

    2016-01-01

    This book contains a rich set of tools for nonparametric analyses, and the purpose of this supplemental text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences: To introduce when nonparametric approaches to data analysis are appropriate To introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test To introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses a...

  11. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be interpreted with caution, given the complexity of human longevity.

  12. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interest, since the quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on behaviors of Internet users. Six large-scale systems are studied in our experiments, including movie-watching in Netflix and MovieLens, transactions in Ebay, bookmark-collecting in Delicious, and posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of a user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strongly positive correlation between a user's activity and the total number of the user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method can to some extent eliminate the differences in statistics among users caused by their different activity levels, yet the effectiveness depends on the data sets.
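
    Heavy-tailed distributions of the kind reported here are usually inspected through the empirical complementary CDF on log-log axes; the sketch below computes it for simulated Pareto-tailed interevent times (the shape parameter and sample size are illustrative assumptions).

      import numpy as np

      def empirical_ccdf(x):
          # Empirical complementary CDF P(X >= x) for tail inspection
          xs = np.sort(np.asarray(x))
          p = 1.0 - np.arange(xs.size) / xs.size
          return xs, p

      # Pareto-tailed interevent times (shape 1.5), a stand-in for real data
      interevent = np.random.default_rng(3).pareto(1.5, size=5000) + 1.0
      xs, p = empirical_ccdf(interevent)
      # On log-log axes a heavy tail appears as a straight line; its slope
      # estimates the (negative) tail exponent.
      slope = np.polyfit(np.log(xs), np.log(p), 1)[0]
      print("fitted tail slope:", round(slope, 2))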

  13. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Full Text Available Abstract Background With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN to those with normal functioning allograft. Results The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. Conclusion We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been

  14. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy.

    Science.gov (United States)

    Kong, Xiangrong; Mas, Valeria; Archer, Kellie J

    2008-02-26

    With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with normal functioning allograft. The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been reported to be relevant to renal diseases. Further study on the
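
    The pathway-enrichment step mentioned in this record and the previous one is easy to make concrete; the sketch below runs Fisher's exact test on a single 2x2 pathway table. The 309-gene list size comes from the abstract, while every other count is an illustrative assumption.

      from scipy.stats import fisher_exact

      # One pathway's 2x2 table: rows = DE vs background genes,
      # columns = in vs out of the pathway (illustrative counts).
      table = [[12, 297],      # DE genes: in pathway, not in pathway (12 + 297 = 309)
               [88, 19603]]    # background genes: in pathway, not in pathway
      odds, p = fisher_exact(table, alternative="greater")
      print(f"odds ratio {odds:.2f}, enrichment p = {p:.3g}")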

  15. EMPIRICAL-NUMERICAL ANALYSIS OF HEADCUT MIGRATION

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Headcut migration is studied using empirical and numerical modeling approaches. Empirical formulas for headcut migration are established using available measurement data; they consider not only the flow strength but also the properties of the soil. A numerical model for headcut migration is also proposed, accounting for the influences of dynamic pressure gradient, downward flow, and bed slope on sediment entrainment. The local erosion patterns and migration speeds of the headcut calculated by the numerical model agree reasonably well with observed data.

  16. Transit Timing Observations From Kepler: Ii. Confirmation of Two Multiplanet Systems via a Non-Parametric Correlation Analysis

    OpenAIRE

    Ford, Eric B.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew Jon; Lissauer, Jack J.; Moorhead, Althea V.; Morehead, Robert C.; Ragozzine, Darin; Rowe, Jason F.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Borucki, William J.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data se...

  17. Nonparametric Maize TFP Measurement Analysis and Countermeasures

    Institute of Scientific and Technical Information of China (English)

    曲会朋; 李宁; 田玉英

    2014-01-01

    Maize production is constrained and affected by many inputs, such as seed and seedlings, labor, land, pesticide, chemical fertilizer, agricultural film, machinery and equipment, animal power and other material inputs; improving the input-output efficiency of these factors is essential for promoting efficient and sustained growth in maize output. Building on a time-series comparison of maize production costs and yields between China and the United States, this paper applies a DEA-based nonparametric frontier efficiency decomposition to conduct an empirical analysis of maize total factor productivity in the country's main producing regions, studying the evolution of China's maize production efficiency and production resource allocation from both a longitudinal time-series perspective and a cross-regional comparative perspective. Finally, it proposes targeted measures and approaches for allocating China's maize production resources effectively and improving maize production efficiency.
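
    The DEA efficiency scores underlying this kind of decomposition come from small linear programs; below is a hedged sketch of the input-oriented CCR envelopment model using scipy's linprog, with the two-input/one-output data as a purely illustrative assumption (the paper's frontier decomposition involves further steps).

      import numpy as np
      from scipy.optimize import linprog

      def dea_ccr_efficiency(X, Y, j0):
          # Input-oriented CCR efficiency of unit j0 (envelopment form):
          # min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0,  lam >= 0
          m, n = X.shape
          s = Y.shape[0]
          c = np.r_[1.0, np.zeros(n)]                     # decision vector [theta, lam]
          A_ub = np.block([[-X[:, [j0]], X],              # X lam - theta x0 <= 0
                           [np.zeros((s, 1)), -Y]])       # -Y lam <= -y0
          b_ub = np.r_[np.zeros(m), -Y[:, j0]]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(None, None)] + [(0, None)] * n, method="highs")
          return res.fun

      rng = np.random.default_rng(4)
      X = rng.uniform(1, 10, size=(2, 8))   # 2 inputs for 8 producing units
      Y = rng.uniform(1, 10, size=(1, 8))   # 1 output
      print([round(dea_ccr_efficiency(X, Y, j), 2) for j in range(8)])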

  18. Compassion: An Evolutionary Analysis and Empirical Review

    Science.gov (United States)

    Goetz, Jennifer L.; Keltner, Dacher; Simon-Thomas, Emiliana

    2010-01-01

    What is compassion? And how did it evolve? In this review, we integrate 3 evolutionary arguments that converge on the hypothesis that compassion evolved as a distinct affective experience whose primary function is to facilitate cooperation and protection of the weak and those who suffer. Our empirical review reveals compassion to have distinct…

  19. Nonparametric statistical methods

    CERN Document Server

    Hollander, Myles; Chicken, Eric

    2013-01-01

    Praise for the Second Edition"This book should be an essential part of the personal library of every practicing statistician."-Technometrics  Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given sit

  20. Typology of Empirical Attributes: Dissimilarity Linkage Analysis (DLA).

    Science.gov (United States)

    Dubin, Robert; Champoux, Joseph E.

    Dissimilarity Linkage Analysis (DLA) is an extremely simple procedure for developing a typology from empirical attributes that permits the clustering of entities. First the procedure develops a taxonomy of types from empirical attributes possessed by entities in the sample. Second, the procedure assigns entities to one, and only one, type in the…

  1. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge to software engineers. Accurate effort estimation is the state of the art of software engineering; effort estimation of software is the preliminary phase between the client and the business enterprise. The relationship between the client and the business enterprise begins with the estimation of the software, and the credibility of the client to the business enterprise increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects. Generalization from such limited experience is an inherently underconstrained problem. Accurate estimation is a complex process because it can be visualized as software effort prediction, and as the term indicates, a prediction never becomes an actual. This work follows the basics of the empirical software effort estimation models. The goal of this paper is to study the empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...

  2. CATDAT : A Program for Parametric and Nonparametric Categorical Data Analysis : User's Manual Version 1.0, 1998-1999 Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-12-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical response data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.

  3. An Empirical Analysis of Perceptual Judgments

    Directory of Open Access Journals (Sweden)

    Nicholas Ray

    2014-12-01

    Full Text Available This paper is a defense of Reformed Empiricism, especially against those critics who take Reformed Empiricism to be a viable account of empirical rationality only if it avails itself of certain rationalist assumptions that are inconsistent with empiricism. I argue against three broad types of criticism that are found in the current literature, and propose a way of characterising Gupta's constraints for any model of experience as analytic of empiricism itself, avoiding the charge by some (e.g. McDowell, Berker, and Schafer) who think that the constraints are substantive.

  4. Coverage Accuracy of Confidence Intervals in Nonparametric Regression

    Institute of Scientific and Technical Information of China (English)

    Song-xi Chen; Yong-song Qin

    2003-01-01

    Point-wise confidence intervals for a nonparametric regression function with random design points are considered. The confidence intervals are those based on the traditional normal approximation and the empirical likelihood. Their coverage accuracy is assessed by developing the Edgeworth expansions for the coverage probabilities. It is shown that the empirical likelihood confidence intervals are Bartlett correctable.
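
    The empirical likelihood interval discussed here can be computed directly; the following is a minimal numpy/scipy sketch of Owen-style empirical likelihood for a mean, collecting the mu values whose -2 log likelihood ratio stays below the chi-squared(1) cutoff. The exponential sample and search grid are illustrative assumptions, and no Bartlett correction is applied.

      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import chi2

      def el_log_ratio(x, mu):
          # -2 log empirical likelihood ratio for the mean mu (Owen's EL)
          z = x - mu
          if z.min() >= 0 or z.max() <= 0:       # mu outside the convex hull
              return np.inf
          # lambda solves sum z_i / (1 + lambda z_i) = 0 on an open interval
          lo = (-1.0 + 1e-10) / z.max()
          hi = (-1.0 + 1e-10) / z.min()
          lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
          return 2.0 * np.sum(np.log1p(lam * z))

      x = np.random.default_rng(5).exponential(2.0, size=60)
      grid = np.linspace(x.mean() - 1.5, x.mean() + 1.5, 400)
      keep = [m for m in grid if el_log_ratio(x, m) <= chi2.ppf(0.95, df=1)]
      print("95% EL interval ~", round(min(keep), 2), "to", round(max(keep), 2))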

  5. Empirical Analysis of the Online Rating Systems

    CERN Document Server

    Lu, Xin-Yi; Guo, Qiang; Liu, Jian-Guo

    2015-01-01

    This paper analyzes the properties of evolving bipartite networks from four aspects, the growth of networks, the degree distribution, the popularity of objects and the diversity of user behaviours, leading to a deep understanding of the empirical data. By empirical studies of data from the online bookstore Amazon and the question-and-answer site Stack Overflow, which are both rating bipartite networks, we reveal rules for the evolution of bipartite networks. These rules have significant practical implications for maintaining the operation of real systems and preparing for their future development. We find that the degree distribution of users follows a power law with an exponential cutoff. Also, according to the evolution of popularity for objects, we find that the large-degree objects tend to receive more new ratings than expected depending on their current degrees while the small-degree objects receive less ratings in terms of their degrees. Moreover, the user behaviours show such a trend that the l...

  6. Nonparametric Predictive Regression

    OpenAIRE

    Ioannis Kasparis; Elena Andreou; Phillips, Peter C.B.

    2012-01-01

    A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as non-stationary fractional and near unit...

  7. Nonparametric factorial analysis of daily weigh-in-motion traffic: implications for the ozone "weekend effect" in Southern California

    Science.gov (United States)

    Gao, Oliver H.; Holmén, Britt A.; Niemeier, Debbie A.

    The Ozone Weekend Effect (OWE) has become increasingly frequent and widespread in southern California since the mid-1970s. Although a number of hypotheses have been suggested to explain the effect, there remains uncertainty associated with the root factors contributing to elevated weekend ozone concentrations. Targeting the time window of the 1997 Southern California Ozone Study (SCOS97), this paper examines traffic activity data for 14 vehicle classes at 27 weigh-in-motion (WIM) stations in southern California. Nonparametric factorial analyses of light-duty vehicle (LDV) and heavy-duty truck (HDT) traffic volumes indicate significant differences in daily volumes by day of week and between the weekly patterns of daily LDV and HDT volumes. Across WIM stations, the daily LDV volume was highest on Friday and decreased by 10% on weekends compared to that on midweek days. In contrast, daily HDT volumes showed dramatic weekend drops of 53% on Saturday and 64% on Sunday. As a result, LDV to HDT ratios increased by 145% on weekends. Nonparametric tests also suggest that weekly traffic patterns varied significantly between WIM stations located close to (central) and far from (peripheral) the Los Angeles Metro area. Weekend increases in LDV/HDT ratios were more pronounced at central WIM sites due to greater weekend declines of HDT relative to LDV traffic. The implications of these weekly traffic patterns for the OWE in southern California were investigated by estimating daily WIM traffic on-road running exhaust emissions of total organic gas (TOG) and oxides of nitrogen (NOx) using EMFAC2002 emission factors. The results support the California Air Resources Board's (CARB's) NOx reduction hypothesis that greater weekend NOx reductions relative to volatile organic compound (VOC) emissions, in combination with the VOC-limited ozone system, contribute to the OWE observed in the region. The results from this study can be used to develop weekend on-road mobile emission
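
    The day-of-week comparisons described here map naturally onto a rank-based omnibus test; the sketch below applies the Kruskal-Wallis test to simulated daily truck counts grouped by weekday (all counts are illustrative assumptions, not WIM data).

      import numpy as np
      from scipy.stats import kruskal

      rng = np.random.default_rng(6)
      # Daily heavy-duty truck counts grouped by day of week (simulated means
      # mimic the reported weekday/weekend contrast)
      days = {d: rng.normal(loc, 300, size=40)
              for d, loc in zip(["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"],
                                [9000, 9100, 9050, 9000, 8800, 4200, 3200])}
      H, p = kruskal(*days.values())
      print(f"Kruskal-Wallis H = {H:.1f}, p = {p:.2g}")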

  8. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of temporal maximum autocorrelation factor analysis to global monthly mean values of 1996-1997 sea surface temperature (SST) and sea surface height (SSH) data. This type of analysis can be considered as an extension of traditional empirical orthogonal function...

  9. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

    structures evolve on a similar timescale to individual level transmission, we investigated the process of transmission through a model population comprising of social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.

  10. A Multivariate Downscaling Model for Nonparametric Simulation of Daily Flows

    Science.gov (United States)

    Molina, J. M.; Ramirez, J. A.; Raff, D. A.

    2011-12-01

    A multivariate, stochastic nonparametric framework for stepwise disaggregation of seasonal runoff volumes to daily streamflow is presented. The downscaling process is conditional on volumes of spring runoff and large-scale ocean-atmosphere teleconnections and includes a two-level cascade scheme: seasonal-to-monthly disaggregation first followed by monthly-to-daily disaggregation. The non-parametric and assumption-free character of the framework allows consideration of the random nature and nonlinearities of daily flows, which parametric models are unable to account for adequately. This paper examines statistical links between decadal/interannual climatic variations in the Pacific Ocean and hydrologic variability in the US northwest region, and includes a periodicity analysis of climate patterns to detect coherences of their cyclic behavior in the frequency domain. We explore the use of such relationships and selected signals (e.g., north Pacific gyre oscillation, southern oscillation, and Pacific decadal oscillation indices, NPGO, SOI and PDO, respectively) in the proposed data-driven framework by means of a combinatorial approach with the aim of simulating improved streamflow sequences when compared with disaggregated series generated from flows alone. A nearest neighbor time series bootstrapping approach is integrated with principal component analysis to resample from the empirical multivariate distribution. A volume-dependent scaling transformation is implemented to guarantee the summability condition. In addition, we present a new and simple algorithm, based on nonparametric resampling, that overcomes the common limitation of lack of preservation of historical correlation between daily flows across months. The downscaling framework presented here is parsimonious in parameters and model assumptions, does not generate negative values, and produces synthetic series that are statistically indistinguishable from the observations. We present evidence showing that both

  11. Multiatlas segmentation as nonparametric regression.

    Science.gov (United States)

    Awate, Suyash P; Whitaker, Ross T

    2014-09-01

    This paper proposes a novel theoretical framework to model and analyze the statistical characteristics of a wide range of segmentation methods that incorporate a database of label maps or atlases; such methods are termed as label fusion or multiatlas segmentation. We model these multiatlas segmentation problems as nonparametric regression problems in the high-dimensional space of image patches. We analyze the nonparametric estimator's convergence behavior that characterizes expected segmentation error as a function of the size of the multiatlas database. We show that this error has an analytic form involving several parameters that are fundamental to the specific segmentation problem (determined by the chosen anatomical structure, imaging modality, registration algorithm, and label-fusion algorithm). We describe how to estimate these parameters and show that several human anatomical structures exhibit the trends modeled analytically. We use these parameter estimates to optimize the regression estimator. We show that the expected error for large database sizes is well predicted by models learned on small databases. Thus, a few expert segmentations can help predict the database sizes required to keep the expected error below a specified tolerance level. Such cost-benefit analysis is crucial for deploying clinical multiatlas segmentation systems.

  12. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major currencies. Results show the nonparametric model generally dominates the others when evaluating in-sample. However, the semiparametric model is best for out-of-sample analysis.

  13. A Censored Nonparametric Software Reliability Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyses the effect of censoring on the estimation of failure rate, and presents a framework for a censored nonparametric software reliability model. The model is based on nonparametric testing of monotonically decreasing failure rate and weighted kernel failure rate estimation under the constraint that the failure rate is monotonically decreasing. Not only does the model rest on few assumptions and weak constraints, but it also allows the number of residual defects in the software system to be estimated. The numerical experiment and real data analysis show that the model performs well with censored data.

  14. Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study

    Directory of Open Access Journals (Sweden)

    Anestis Antoniadis

    2001-06-01

    Full Text Available Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints of how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis on electrical consumption is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
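
    As a concrete companion to this survey, the sketch below implements one classic estimator it reviews, VisuShrink-style soft thresholding with the universal threshold; it assumes the PyWavelets package (pywt) is available, and the noisy sine signal is an illustrative stand-in for real data.

      import numpy as np
      import pywt  # PyWavelets, assumed installed

      def wavelet_denoise(y, wavelet="sym8", level=5):
          # Universal threshold sigma * sqrt(2 log n), with sigma estimated
          # from the MAD of the finest-scale detail coefficients.
          coeffs = pywt.wavedec(y, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745
          thr = sigma * np.sqrt(2 * np.log(len(y)))
          coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                  for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)

      t = np.linspace(0, 1, 1024)
      noisy = np.sin(8 * np.pi * t) + 0.4 * np.random.default_rng(7).normal(size=t.size)
      smooth = wavelet_denoise(noisy)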

  15. Face Recognition Algorithm of 2DPCA Nonparametric Subspace Analysis

    Institute of Scientific and Technical Information of China (English)

    王美; 梁久祯

    2011-01-01

    This paper proposes a novel face recognition algorithm, 2D Nonparametric Subspace Analysis (2DNSA), based on the 2D Principal Component Analysis (2DPCA) subspace. The original face matrices first undergo feature dimension reduction by 2DPCA, and the reduced feature matrices are used as a new training set for 2D nonparametric subspace analysis. This method not only reduces feature dimensionality via 2DPCA, but also accounts for the impact of boundary samples on classification by exploiting the discriminative capacity of 2DNSA, which avoids the irrationality of using class centers to measure the distances between different classes. Experimental results on two face databases (Yale and LARGE) show the improvements of the new algorithm over traditional subspace methods such as (2D)2PCA, 2DPCA, (2D)2LDA, 2DLDA, 2DPCA+2DLDA, and 2DNSA.

  16. THE LISBON STRATEGY: AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Silvestri Marcello

    2010-07-01

    Full Text Available This paper investigates European economic integration within the framework of the 2000 Lisbon Council, with the aim of studying the dynamics affecting the social and economic life of European countries. This descriptive investigation focuses on certain significant variables of the new theories highlighting the importance of technological innovation and human capital. To this end, the multivariate statistical technique of Principal Component Analysis has been applied in order to classify countries with regard to the investigated phenomenon.

  17. Nonparametric trend estimation in the presence of fractal noise: application to fMRI time-series analysis.

    Science.gov (United States)

    Afshinpour, Babak; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2008-06-30

    Unknown low frequency fluctuations called "trend" are observed in noisy time-series measured for different applications. In some disciplines, they carry primary information while in other fields such as functional magnetic resonance imaging (fMRI) they carry nuisance effects. In all cases, however, it is necessary to estimate them accurately. In this paper, a method for estimating trend in the presence of fractal noise is proposed and applied to fMRI time-series. To this end, a partly linear model (PLM) is fitted to each time-series. The parametric and nonparametric parts of PLM are considered as contributions of hemodynamic response and trend, respectively. Using the whitening property of wavelet transform, the unknown components of the model are estimated in the wavelet domain. The results of the proposed method are compared to those of other parametric trend-removal approaches such as spline and polynomial models. It is shown that the proposed method improves activation detection and decreases variance of the estimated parameters relative to the other methods.

  18. Parametric and Nonparametric EEG Analysis for the Evaluation of EEG Activity in Young Children with Controlled Epilepsy

    Directory of Open Access Journals (Sweden)

    Vangelis Sakkalis

    2008-01-01

    Full Text Available There is important evidence of differences in the EEG frequency spectrum of control subjects as compared to epileptic subjects. In particular, the study of children presents difficulties due to the early stages of brain development and the various forms of epilepsy indications. In this study, we consider children that developed epileptic crises in the past but without any other clinical, psychological, or visible neurophysiological findings. The aim of the paper is to develop reliable techniques for testing whether such controlled epilepsy induces related spectral differences in the EEG. Spectral features extracted by using nonparametric, signal representation techniques (Fourier and wavelet transform) and a parametric, signal modeling technique (ARMA) are compared and their effect on the classification of the two groups is analyzed. The subjects performed two different tasks: a control (rest) task and a relatively difficult math task. The results show that spectral features extracted by modeling the EEG signals recorded from individual channels by an ARMA model give a higher discrimination between the two subject groups for the control task, where classification scores of up to 100% were obtained with a linear discriminant classifier.

  19. A Unified Discussion on the Concept of Score Functions Used in the Context of Nonparametric Linkage Analysis

    Directory of Open Access Journals (Sweden)

    Lars Ängquist

    2008-01-01

    Full Text Available In this article we discuss nonparametric linkage (NPL) score functions within a broad and quite general framework. The main focus of the paper is the structure, derivation principles and interpretations of the score function entity itself. We define and discuss several families of one-locus score function definitions, i.e. the implicit, explicit and optimal ones. Some generalizations of and comments on the two-locus, unconditional and conditional, cases are included as well. Although this article mainly aims at serving as an overview, where the concept of score functions is put into a covering context, we generalize the noncentrality parameter (NCP) optimal score functions in Ängquist et al. (2007) to facilitate, through weighting, the incorporation of several plausible distinct genetic models. Since the genetic model itself is most often to some extent unknown, this facilitates weaker prior assumptions with respect to plausible true disease models without losing the property of NCP-optimality. Moreover, we discuss general assumptions and properties of score functions in the above sense. For instance, the concepts of identical-by-descent (IBD) sharing structures and score function equivalence are discussed in some detail.

  20. Empirical analysis of industrial operations in Montenegro

    Directory of Open Access Journals (Sweden)

    Galić Jelena

    2012-12-01

    Full Text Available Since the start of the transition process, industrial production in Montenegro has faced serious problems and its share in GDP is constantly decreasing. The global financial crisis has, to a large extent, negatively influenced industry. Analysis of financial indicators shows that industry has suffered significant losses, undercapitalisation, and liquidity problems. Looking across industry sectors, the situation is more favourable in the production of electricity, gas and water than in extraction and mining. The paper proposes economic policy measures to improve the situation in industry.

  1. How Are Teachers Teaching? A Nonparametric Approach

    Science.gov (United States)

    De Witte, Kristof; Van Klaveren, Chris

    2014-01-01

    This paper examines which configuration of teaching activities maximizes student performance. For this purpose a nonparametric efficiency model is formulated that accounts for (1) self-selection of students and teachers in better schools and (2) complementary teaching activities. The analysis distinguishes both individual teaching (i.e., a…

  2. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  4. Universality in voting behavior: an empirical analysis

    Science.gov (United States)

    Chatterjee, Arnab; Mitrović, Marija; Fortunato, Santo

    2013-01-01

    Election data represent a precious source of information to study human behavior at a large scale. In proportional elections with open lists, the number of votes received by a candidate, rescaled by the average performance of all competitors in the same party list, has the same distribution regardless of the country and the year of the election. Here we provide the first thorough assessment of this claim. We analyzed election datasets of 15 countries with proportional systems. We confirm that a class of nations with similar election rules fulfill the universality claim. Discrepancies from this trend in other countries with open-lists elections are always associated with peculiar differences in the election rules, which matter more than differences between countries and historical periods. Our analysis shows that the role of parties in the electoral performance of candidates is crucial: alternative scalings not taking into account party affiliations lead to poor results.

  5. Empirical Analysis of Kyrgyz Trade Patterns

    Directory of Open Access Journals (Sweden)

    Elvira KURMANALIEVA

    2008-05-01

    Full Text Available Being naturally located between two big markets in Europe and Asia, Kyrgyzstan, together with the other Central Asian countries, does not have direct access to sea ports. Landlockedness limits volumes of international trade and creates obstacles for economic growth. Results of statistical analysis show that Kyrgyz trade follows neither the Heckscher-Ohlin model nor the intra-industry trade model. Another finding is that the open and liberal trade policy of Kyrgyzstan has a large positive effect on trade volumes, suggesting that bilateral trade will expand markedly if the country continues liberalization of its trade policy with other countries. Quality of infrastructure and transportation costs play a crucial role for landlocked countries, and a free trade agreement with other countries looks like a good opportunity to overcome natural barriers and diversify their trade.

  6. Universality in voting behavior: an empirical analysis

    CERN Document Server

    Chatterjee, Arnab; Fortunato, Santo

    2012-01-01

    Election data represent a precious source of information to study human behavior at a large scale. In proportional elections with open lists, the number of votes received by a candidate, rescaled by the average performance of all competitors in the same party list, has the same distribution regardless of the country and the year of the election. Here we provide the first thorough assessment of this claim. We analyzed election datasets of 15 countries with proportional systems. We confirm that a class of nations with similar election rules fulfill the universality claim. Discrepancies from this trend in other countries with open-lists elections are always associated with peculiar differences in the election rules, which matter more than differences between countries and historical periods. Our analysis shows that the role of parties in the electoral performance of candidates is crucial: alternative scalings not taking into account party affiliations lead to poor results.

  7. Empirical analysis on risk of security investment

    Institute of Scientific and Technical Information of China (English)

    AN Peng; LI Sheng-hong

    2009-01-01

    The paper analyzes the theory and application of the Markowitz Mean-Variance Model and the CAPM. Firstly, it explains the development process and standpoints of the two models and presents their derivations in detail. Then 30 stocks are chosen from the Shangzheng 50 and tested to see whether the prices of Shanghai stocks conform to the two models. Using time series and panel data analysis, stock risk and efficient portfolios are studied with the ORIGIN and MATLAB software. The result shows that the Shanghai stock market conforms to the Markowitz Mean-Variance Model to a certain extent and can give investors reliable suggestions for gaining higher returns, but there is no positive relation between systematic risk and profit ratio, and the CAPM does not function well in China's securities market.

  8. A Theoretical and Empirical Analysis of Expected Sarsa

    NARCIS (Netherlands)

    van Seijen, Harm; van Hasselt, Hado; Whiteson, Shimon; Wiering, Marco

    2009-01-01

    This paper presents a theoretical and empirical analysis of Expected Sarsa, a variation on Sarsa, the classic on-policy temporal-difference method for model-free reinforcement learning. Expected Sarsa exploits knowledge about stochasticity in the behavior policy to perform updates with lower variance.

  9. A theoretical and empirical analysis of expected sarsa

    NARCIS (Netherlands)

    Seijen, H.H. van; Hasselt, H. van; Whiteson, S.; Wiering, M.

    2009-01-01

    This paper presents a theoretical and empirical analysis of Expected Sarsa, a variation on Sarsa, the classic on-policy temporal-difference method for model-free reinforcement learning. Expected Sarsa exploits knowledge about stochasticity in the behavior policy to perform updates with lower variance
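
    Neither record spells out the update itself, so here is a minimal sketch of the Expected Sarsa rule under an epsilon-greedy behavior policy; the state/action counts, step size and discount factor are illustrative assumptions.

      import numpy as np

      def expected_sarsa_update(Q, s, a, r, s_next, policy, alpha=0.1, gamma=0.99):
          # The target averages Q(s', .) under the behavior policy instead of
          # using the sampled next action (as Sarsa does), reducing variance.
          target = r + gamma * np.dot(policy(s_next), Q[s_next])
          Q[s, a] += alpha * (target - Q[s, a])

      n_states, n_actions, eps = 5, 2, 0.1
      Q = np.zeros((n_states, n_actions))

      def eps_greedy_probs(s):
          # Probability of each action under the epsilon-greedy behavior policy
          p = np.full(n_actions, eps / n_actions)
          p[np.argmax(Q[s])] += 1.0 - eps
          return p

      expected_sarsa_update(Q, s=0, a=1, r=1.0, s_next=2, policy=eps_greedy_probs)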

  10. Determinants of Crime in Virginia: An Empirical Analysis

    Science.gov (United States)

    Ali, Abdiweli M.; Peek, Willam

    2009-01-01

    This paper is an empirical analysis of the determinants of crime in Virginia. Over a dozen explanatory variables that current literature suggests as important determinants of crime are collected. The data is from 1970 to 2000. These include economic, fiscal, demographic, political, and social variables. The regression results indicate that crime…

  11. Empirical Bayes Model Comparisons for Differential Methylation Analysis

    Directory of Open Access Journals (Sweden)

    Mingxiang Teng

    2012-01-01

    Full Text Available A number of empirical Bayes models (each with different statistical distribution assumptions) have now been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for analysis of differentially methylated regions for conservative and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses, using various empirical Bayes models, remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differentially methylated loci and their cell division- (1, 3, and 5) and drug-treatment- (cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma assumption models. Statistical and biological results suggest the log-normal, rather than gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we presented one of the log-normal models for differential methylation analysis and tested its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.

  12. The 12-item World Health Organization Disability Assessment Schedule II (WHO-DAS II: a nonparametric item response analysis

    Directory of Open Access Journals (Sweden)

    Fernandez Ana

    2010-05-01

    Full Text Available Abstract Background Previous studies have analyzed the psychometric properties of the World Health Organization Disability Assessment Schedule II (WHO-DAS II) using classical omnibus measures of scale quality. These analyses are sample dependent and do not model item responses as a function of the underlying trait level. The main objective of this study was to examine the effectiveness of the WHO-DAS II items and their options in discriminating between changes in the underlying disability level by means of item response analyses. We also explored differential item functioning (DIF) in men and women. Methods The participants were 3615 adult general practice patients from 17 regions of Spain, with a first diagnosed major depressive episode. The 12-item WHO-DAS II was administered by the general practitioners during the consultation. We used a non-parametric item response method (Kernel-Smoothing), implemented with the TestGraf software, to examine the effectiveness of each item (item characteristic curves) and its options (option characteristic curves) in discriminating between changes in the underlying disability level. We examined composite DIF to know whether women had a higher probability than men of endorsing each item. Results Item response analyses indicated that the twelve items forming the WHO-DAS II perform very well. All items were determined to provide good discrimination across varying standardized levels of the trait. The items also had option characteristic curves that showed good discrimination, given that each increasing option became more likely than the previous as a function of increasing trait level. No gender-related DIF was found on any of the items. Conclusions All WHO-DAS II items were very good at assessing overall disability. Our results supported the appropriateness of the weights assigned to response option categories and showed an absence of gender differences in item functioning.
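
    TestGraf's kernel-smoothing approach can be illustrated compactly: a Nadaraya-Watson regression of binary item responses on trait level traces out an item characteristic curve. The 2PL-style simulated item below is a made-up stand-in for WHO-DAS II data, and real TestGraf analyses smooth against rank-based trait estimates rather than known trait scores.

      import numpy as np

      def kernel_icc(theta, responses, grid, bandwidth=0.3):
          # Nadaraya-Watson estimate of P(item endorsed | trait level),
          # the kind of item/option characteristic curve TestGraf displays
          w = np.exp(-0.5 * ((grid[:, None] - theta[None, :]) / bandwidth) ** 2)
          return (w * responses).sum(axis=1) / w.sum(axis=1)

      rng = np.random.default_rng(8)
      theta = rng.normal(size=2000)                      # latent trait scores
      prob = 1.0 / (1.0 + np.exp(-(theta - 0.5) * 1.7))  # a hypothetical 2PL item
      resp = rng.binomial(1, prob)
      grid = np.linspace(-2.5, 2.5, 9)
      print(np.round(kernel_icc(theta, resp, grid), 2))  # rises monotonically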

  13. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  14. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved and real applications are illustrated using examples. Theories and exercises are provided. The incorrect use of many tests applying most statistical software is highlighted and discussed.
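
    Tests for censored data build on nonparametric survival estimates such as the Kaplan-Meier curve; as a self-contained illustration (not taken from the book), here is a plain numpy implementation, with the small time/event arrays as made-up inputs.

      import numpy as np

      def kaplan_meier(time, event):
          # Kaplan-Meier survival curve from right-censored data
          # (event = 1 observed failure, 0 censored)
          order = np.argsort(time)
          time, event = np.asarray(time)[order], np.asarray(event)[order]
          n = len(time)
          S, times, surv = 1.0, [], []
          for i, (t, e) in enumerate(zip(time, event)):
              at_risk = n - i
              if e:
                  S *= 1.0 - 1.0 / at_risk
                  times.append(t)
                  surv.append(S)
          return np.array(times), np.array(surv)

      t = [5, 8, 12, 12, 15, 21, 30, 42]
      e = [1, 0, 1, 1, 0, 1, 1, 0]
      print(kaplan_meier(t, e))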

  15. A nonparametric and diversified portfolio model

    Science.gov (United States)

    Shirazi, Yasaman Izadparast; Sabiruzzaman, Md.; Hamzah, Nor Aishah

    2014-07-01

    Traditional portfolio models, like mean-variance (MV), suffer from estimation error and lack of diversity. Alternatives, like mean-entropy (ME) or mean-variance-entropy (MVE) portfolio models, focus independently on the issue of either a proper risk measure or diversity. In this paper, we propose an asset allocation model that compromises between the risk of historical data and future uncertainty. In the new model, entropy is presented as a nonparametric risk measure as well as an index of diversity. Our empirical evaluation with a variety of performance measures shows that this model has better out-of-sample performance and lower portfolio turnover than its competitors.

  16. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    施沛德; 王海燕; 张利华

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in the literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right-censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data driven criterion, we also obtai

  17. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain), June 11–16, 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  18. Nonparametric statistical methods using R

    CERN Document Server

    Kloke, John

    2014-01-01

    A Practical Guide to Implementing Nonparametric and Rank-Based Procedures. Nonparametric Statistical Methods Using R covers traditional nonparametric methods and rank-based analyses, including estimation and inference for models ranging from simple location models to general linear and nonlinear models for uncorrelated and correlated responses. The authors emphasize applications and statistical computation. They illustrate the methods with many real and simulated data examples using R, including the packages Rfit and npsm. The book first gives an overview of the R language and basic statistical c

  19. Victim countries of transnational terrorism: an empirical characteristics analysis.

    Science.gov (United States)

    Elbakidze, Levan; Jin, Yanhong

    2012-12-01

    This study empirically investigates the association between country-level socioeconomic characteristics and risk of being victimized in transnational terrorism events. We find that a country's annual financial contribution to the U.N. general operating budget has a positive association with the frequency of being victimized in transnational terrorism events. In addition, per capita GDP, political freedom, and openness to trade are nonlinearly related to the frequency of being victimized in transnational terrorism events. © 2012 Society for Risk Analysis.

  20. Islamic banks and profitability: an empirical analysis of Indonesian banking

    OpenAIRE

    Jordan, Sarah

    2013-01-01

    This paper provides an empirical analysis of the factors that determine the profitability of Indonesian banks between the years 2006-2012. In particular, it investigates whether there are any significant differences in terms of profitability between Islamic banks and commercial banks. The results, obtained by applying the system-GMM estimator to the panel of 54 banks, indicate that the high bank profitability during these years were determined mainly by the size of the banks, the market share...

  1. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). The internal and external validity of the findings is assessed by estimating Spearman rank correlations between the results obtained under different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries with higher health care expenditure per capita tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
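
    For readers less familiar with the DEA half of this comparison, the sketch below computes the classical input-oriented, constant-returns-to-scale (CCR) efficiency score for each decision-making unit as one linear program per unit. It is a generic textbook DEA, not the authors' two-step specification, and the data shapes and names are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency via one LP per DMU.
    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Returns theta in (0, 1]; theta = 1 marks the efficient frontier."""
    (m, n), (s, _) = X.shape, Y.shape
    scores = np.empty(n)
    for k in range(n):
        c = np.r_[1.0, np.zeros(n)]                # variables: [theta, lambdas]
        A_in = np.hstack([-X[:, [k]], X])          # X @ lam <= theta * x_k
        A_out = np.hstack([np.zeros((s, 1)), -Y])  # Y @ lam >= y_k
        b = np.r_[np.zeros(m), -Y[:, k]]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                      bounds=[(0, None)] * (n + 1))
        scores[k] = res.x[0]
    return scores

# two inputs (e.g. beds, staff) and one output (e.g. discharges), five units
X = np.array([[5.0, 8.0, 6.0, 9.0, 4.0], [14.0, 15.0, 12.0, 20.0, 10.0]])
Y = np.array([[9.0, 10.0, 11.0, 12.0, 8.0]])
print(dea_ccr_input(X, Y).round(3))
```

    Rank correlations between efficiency scores from different specifications, the validity check used in the paper, can then be computed with scipy.stats.spearmanr.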

  2. A Critical Look at the Mass-Metallicity-SFR Relation in the Local Universe: Non-parametric Analysis Framework and Confounding Systematics

    CERN Document Server

    Salim, Samir; Ly, Chun; Brinchmann, Jarle; Davé, Romeel; Dickinson, Mark; Salzer, John J; Charlot, Stéphane

    2014-01-01

    It has been proposed that the mass-metallicity relation of galaxies exhibits a secondary dependence on star formation rate (SFR), and that the resulting M-Z-SFR relation may be redshift-invariant, i.e., "fundamental." However, conflicting results on the character of the SFR dependence, and whether it exists, have been reported. To gain insight into the origins of the conflicting results, we (a) devise a non-parametric, astrophysically-motivated analysis framework based on the offset from the star-forming ("main") sequence at a given stellar mass (relative specific SFR), (b) apply this methodology and perform a comprehensive re-analysis of the local M-Z-SFR relation, based on SDSS, GALEX, and WISE data, and (c) study the impact of sample selection, and of using different metallicity and SFR indicators. We show that metallicity is anti-correlated with specific SFR regardless of the indicators used. We do not find that the relation is spurious due to correlations arising from biased metallicity measurements, or ...

  3. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction, but data rarely satisfy the testable conditions. To overcome this, we provide a way to estimate homothetic efficiency of

  4. Ratio Versus Regression Analysis: Some Empirical Evidence in Brazil

    Directory of Open Access Journals (Sweden)

    Newton Carneiro Affonso da Costa Jr.

    2004-06-01

    This work compares the traditional methodology of ratio analysis, applied to a sample of Brazilian firms, with the alternative of regression analysis, for both cross-industry and intra-industry samples. The structural validity of the traditional methodology was tested through a model that represents its analogous regression format. The data are from 156 Brazilian public companies in nine industrial sectors for the year 1997. The results provide weak empirical support for the traditional ratio methodology, as the validity of this methodology was found to differ between ratios.
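
    The structural test described here can be made concrete: a ratio y/x is an adequate summary of the relation between two accounting items only if the regression of y on x passes through the origin. A hedged sketch with synthetic data (all numbers are hypothetical, not the Brazilian sample):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
x = rng.uniform(10, 100, 156)             # denominator, e.g. total assets
y = 0.15 * x + rng.normal(0, 2.0, 156)    # numerator, e.g. earnings

fit = sm.OLS(y, sm.add_constant(x)).fit()
print(fit.params)               # [intercept, slope]; ratio form needs ~0 intercept
print(fit.t_test("const = 0"))  # formal test of the ratio model's restriction
```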

  5. The Utility of Nonparametric Transformations for Imputation of Survey Data

    Directory of Open Access Journals (Sweden)

    Robbins Michael W.

    2014-12-01

    Missing values present a prevalent problem in the analysis of establishment survey data. Multivariate imputation algorithms (which are used to fill in missing observations) tend to have the common limitation that imputations for continuous variables are sampled from Gaussian distributions. This limitation is addressed here through the use of robust marginal transformations. Specifically, kernel-density and empirical distribution-type transformations are discussed and are shown to have favorable properties when used for imputation of complex survey data. Although such techniques have wide applicability (i.e., they may be easily applied in conjunction with a wide array of imputation techniques), the proposed methodology is applied here with an algorithm for imputation in the USDA’s Agricultural Resource Management Survey. Data analysis and simulation results are used to illustrate the specific advantages of the robust methods when compared to the fully parametric techniques and to other relevant techniques such as predictive mean matching. To summarize, transformations based upon parametric densities are shown to distort several data characteristics in circumstances where the parametric model is ill fit; however, no circumstances are found in which the transformations based upon parametric models outperform the nonparametric transformations. As a result, the transformation based upon the empirical distribution (which is the most computationally efficient) is recommended over the other transformation procedures in practice.
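
    A minimal sketch of the empirical-distribution transformation recommended above: observed values are mapped to normal scores through their ranks, Gaussian-based imputation can then operate in the transformed space, and imputed scores are mapped back through the empirical quantiles. The function names are illustrative, and the kernel-density variant is omitted.

```python
import numpy as np
from scipy.stats import norm, rankdata

def to_normal_scores(x):
    """Empirical-distribution transform: ranks -> uniforms -> standard normal."""
    u = (rankdata(x) - 0.5) / len(x)
    return norm.ppf(u)

def from_normal_scores(z, x_obs):
    """Back-transform normal scores through the observed empirical quantiles."""
    n = len(x_obs)
    probs = (np.arange(1, n + 1) - 0.5) / n
    return np.interp(norm.cdf(z), probs, np.sort(x_obs))

x = np.random.default_rng(0).lognormal(size=200)  # a skewed survey variable
z = to_normal_scores(x)                           # approximately N(0, 1)
x_back = from_normal_scores(z, x)                 # returns to the original scale
print(np.allclose(np.sort(x_back), np.sort(x)))
```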

  6. Empirical Likelihood Analysis of Longitudinal Data Involving Within-subject Correlation

    Institute of Scientific and Technical Information of China (English)

    Shuang HU; Lu LIN

    2012-01-01

    In this paper we use profile empirical likelihood to construct confidence regions for the regression coefficients in a partially linear model with longitudinal data. The main contribution is that the within-subject correlation is taken into account to improve estimation efficiency. We suppose a semi-parametric structure for the covariances of the observation errors within each subject and employ both the first-order and second-order moment conditions of the observation errors to construct the estimating equations. Although nonparametric estimators are involved, the empirical log-likelihood ratio statistic still converges in distribution to a standard chi-squared variable with p degrees of freedom after the nuisance parameters are profiled away. A data simulation is also conducted.

  7. A note on the use of the non-parametric Wilcoxon-Mann-Whitney test in the analysis of medical studies

    Directory of Open Access Journals (Sweden)

    Kühnast, Corinna

    2008-04-01

    Background: Although non-normal data are widespread in biomedical research, parametric tests unnecessarily predominate in statistical analyses. Methods: We surveyed five biomedical journals and, for all studies which contain at least the unpaired t-test or the non-parametric Wilcoxon-Mann-Whitney test, investigated the relationship between the choice of a statistical test and other variables such as type of journal, sample size, randomization, sponsoring etc. Results: The non-parametric Wilcoxon-Mann-Whitney test was used in 30% of the studies. In a multivariable logistic regression, the type of journal, the test object, the scale of measurement and the statistical software were significant. The non-parametric test was more common in the case of non-continuous data, in high-impact journals, in studies in humans, and when the statistical software was specified, in particular when SPSS was used.
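
    For readers weighing the same choice in their own analyses, the test itself is one line with SciPy; the skewed outcome data below are hypothetical.

```python
import numpy as np
from scipy.stats import mannwhitneyu, ttest_ind

rng = np.random.default_rng(1)
control = rng.lognormal(0.0, 1.0, 40)   # clearly non-normal outcomes
treated = rng.lognormal(0.4, 1.0, 40)

u_stat, p_mwu = mannwhitneyu(control, treated, alternative="two-sided")
t_stat, p_t = ttest_ind(control, treated)  # the parametric counterpart
print(f"Wilcoxon-Mann-Whitney p = {p_mwu:.4f}, unpaired t-test p = {p_t:.4f}")
```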

  8. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    This paper aims at analysing which kinds of currently labelled information are of interest to and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers towards innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised as focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and the type of breeding.

  9. An Empirical Analysis of Odd Pricing Using PSM Data

    OpenAIRE

    Okuse, Yoshiyuki

    2016-01-01

    It is evident in our daily lives that most consumer goods are sold not at the just price but rather at the just-below price. To examine the effect of odd pricing, including just-below pricing, numerous empirical studies have been conducted. In spite of these efforts, a consistent conclusion has not been obtained so far. The goals of this research are: (1) to examine the existence of the effect of odd pricing on consumers' price acceptance using PSM analysis, and (2) to examine the mechanisms ...

  10. The Use of Information Transmission as Nonparametric Correlation in the Analysis of Complex Behavior: A Preliminary Report.

    Science.gov (United States)

    1980-05-01

    Engineering Psychology Programs, Office of Naval Research (Code 455), Arlington, VA 22217. Approved for public release; distribution unlimited. Introduction: Experimental psychology has developed a sophisticated set of experimental designs for the analysis of behaviour when

  11. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

    The article is devoted to nonparametric point and interval estimation of the characteristics of a probability distribution (the expectation, median, variance, standard deviation and coefficient of variation) from sample results. Sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function possessing the required number of moments. Nonparametric analysis procedures are compared with parametric procedures based on the assumption that the sample values have a normal distribution. Point estimators are constructed in the obvious way, using sample analogs of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and functions of them. Nonparametric asymptotic confidence intervals are obtained through a special technology for deriving asymptotic relations in applied statistics. In the first step, this technology applies the multidimensional central limit theorem to sums of vectors whose coordinates are powers of the initial random variables. The second step transforms the limiting multivariate normal vector to obtain the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of mathematical-statistical reasoning, which usually requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are the operating times to the limit state of 50 cutting tools. Methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions when the normality hypothesis fails. The practical recommendation is: for the analysis of real data, use nonparametric confidence limits
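
    The flavor of these nonparametric asymptotic intervals can be shown for the variance: by the central limit theorem for sample moments, the sample variance s^2 has asymptotic variance (m4 - s^4)/n, where m4 is the fourth central sample moment, and no normality assumption is needed. The sketch below uses this standard asymptotic result, not the article's exact derivations.

```python
import numpy as np
from scipy.stats import norm

def variance_ci(x, level=0.95):
    """Nonparametric asymptotic confidence interval for the variance,
    based on Var(s^2) ~ (m4 - (s^2)^2) / n from the CLT for moments."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s2 = x.var(ddof=1)
    m4 = np.mean((x - x.mean()) ** 4)
    se = np.sqrt((m4 - s2 ** 2) / n)
    z = norm.ppf(0.5 + level / 2)
    return s2 - z * se, s2 + z * se

# heavy-tailed data, where a normal-theory interval would be too narrow
x = np.random.default_rng(0).standard_t(df=5, size=500)
print(variance_ci(x))
```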

  12. The properties and mechanism of long-term memory in nonparametric volatility

    Science.gov (United States)

    Li, Handong; Cao, Shi-Nan; Wang, Yan

    2010-08-01

    Recent empirical literature documents the presence of long-term memory in return volatility, but the mechanism behind this long-term memory is still unclear. In this paper, we investigate the origin and properties of long-term memory with nonparametric volatility, using high-frequency time series data of the Chinese Shanghai Composite Stock Price Index. We perform Detrended Fluctuation Analysis (DFA) on three different nonparametric volatility estimators with different sampling frequencies. For the same volatility series, the Hurst exponents decrease as the sampling interval increases, but they remain larger than 1/2, which means that no matter how the interval changes, the long memory persists. RRV presents a relatively stable long-term memory property and is less influenced by sampling frequency. RV and RBV show evolutionary trends depending on the time interval, indicating that the jump component has no significant impact on the long-term memory property. This suggests that the presence of long-term memory in nonparametric volatility can be attributed to the integrated variance component. Considering the impact of microstructure noise, RBV and RRV still present long-term memory at various time intervals, so we can infer that the presence of long-term memory in realized volatility is not affected by market microstructure noise. Our findings imply that the long-term memory phenomenon is an inherent characteristic of the data-generating process, not a result of microstructure noise or volatility clustering.
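
    The DFA procedure relied on here is short enough to sketch in full: integrate the demeaned series, detrend it within windows of growing size s, and read the scaling exponent off the log-log slope of the fluctuation function F(s); an exponent above 0.5 indicates long-term memory. This is generic first-order DFA, not the authors' exact pipeline.

```python
import numpy as np

def dfa(x, scales):
    """First-order Detrended Fluctuation Analysis: returns F(s) for each
    window size s; the slope of log F(s) vs log s is the DFA exponent."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        resid = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrending
            resid.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(resid)))
    return np.asarray(F)

x = np.random.default_rng(0).normal(size=10000)   # white noise: exponent ~ 0.5
scales = (10 * 2 ** np.arange(6)).astype(int)
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(round(alpha, 2))
```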

  13. Bayesian Nonparametric Estimation for Dynamic Treatment Regimes with Sequential Transition Times.

    Science.gov (United States)

    Xu, Yanxun; Müller, Peter; Wahed, Abdus S; Thall, Peter F

    2016-01-01

    We analyze a dataset arising from a clinical trial involving multi-stage chemotherapy regimes for acute leukemia. The trial design was a 2 × 2 factorial for frontline therapies only. Motivated by the idea that subsequent salvage treatments affect survival time, we model therapy as a dynamic treatment regime (DTR), that is, an alternating sequence of adaptive treatments or other actions and transition times between disease states. These sequences may vary substantially between patients, depending on how the regime plays out. To evaluate the regimes, mean overall survival time is expressed as a weighted average of the means of all possible sums of successive transition times. We assume a Bayesian nonparametric survival regression model for each transition time, with a dependent Dirichlet process prior and Gaussian process base measure (DDP-GP). Posterior simulation is implemented by Markov chain Monte Carlo (MCMC) sampling. We provide general guidelines for constructing a prior using empirical Bayes methods. The proposed approach is compared with inverse probability of treatment weighting, including a doubly robust augmented version of this approach, for both single-stage and multi-stage regimes with treatment assignment depending on baseline covariates. The simulations show that the proposed nonparametric Bayesian approach can substantially improve inference compared to existing methods. An R program for implementing the DDP-GP-based Bayesian nonparametric analysis is freely available at https://www.ma.utexas.edu/users/yxu/.

  14. A nonparametric item analysis of the modified Bender Gestalt Test

    Directory of Open Access Journals (Sweden)

    César Merino Soto

    2009-05-01

    This research presents a psychometric study of a new scoring system for the Bender Gestalt Test modified for children, the Qualitative Scoring System (Brannigan & Brunner, 2002), in a sample of 244 children entering first grade of primary school in four public schools in Lima. The approach applied is a nonparametric item analysis using the Testgraf program (Ramsay, 1991). The findings indicate appropriate levels of internal consistency, unidimensionality, and good discriminative power of the scoring categories of the Qualitative Scoring System. No demographic differences by gender or age were found. The findings are discussed in the context of the potential use of the Qualitative Scoring System and of nonparametric item analysis in psychometric research.

  15. A nonparametric method for detecting fixations and saccades using cluster analysis: removing the need for arbitrary thresholds.

    Science.gov (United States)

    König, Seth D; Buffalo, Elizabeth A

    2014-04-30

    Eye tracking is an important component of many human and non-human primate behavioral experiments. As behavioral paradigms have become more complex, including unconstrained viewing of natural images, the eye movements measured in these paradigms have become more variable and complex as well. Accordingly, the common practice of using acceleration, dispersion, or velocity thresholds to segment viewing behavior into periods of fixations and saccades may be insufficient. Here we propose a novel algorithm, called Cluster Fix, which uses k-means cluster analysis to take advantage of the qualitative differences between fixations and saccades. The algorithm finds natural divisions in four state-space parameters (distance, velocity, acceleration, and angular velocity) to separate scan paths into periods of fixations and saccades. The number and size of clusters adjust to the variability of individual scan paths. Cluster Fix can detect small saccades that were often indistinguishable from noisy fixations. Local analysis of fixations helped determine the transition times between fixations and saccades. Because Cluster Fix detects natural divisions in the data, predefined thresholds are not needed. A major advantage of Cluster Fix is the ability to precisely identify the beginning and end of saccades, which is essential for studying neural activity that is modulated by or time-locked to saccades. Our data suggest that Cluster Fix is more sensitive than threshold-based algorithms but comes at the cost of an increase in computational time. Copyright © 2014 Elsevier B.V. All rights reserved.
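
    A simplified sketch of the idea, with two clusters only, no local re-clustering around transitions, and hypothetical parameter names (the published algorithm is richer):

```python
import numpy as np
from sklearn.cluster import KMeans

def label_fixations(gaze, dt):
    """Cluster-Fix-style detection: k-means on the normalized state-space
    parameters; the cluster with the lowest mean velocity is taken to be
    fixations. gaze is an (n, 2) array of x, y positions sampled every dt."""
    v = np.gradient(gaze, dt, axis=0)
    speed = np.linalg.norm(v, axis=1)
    accel = np.abs(np.gradient(speed, dt))
    ang_vel = np.abs(np.gradient(np.unwrap(np.arctan2(v[:, 1], v[:, 0])), dt))
    dist = np.r_[0.0, np.linalg.norm(np.diff(gaze, axis=0), axis=1)]
    feats = np.column_stack([dist, speed, accel, ang_vel])
    feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
    fix_cluster = np.argmin([speed[labels == k].mean() for k in (0, 1)])
    return labels == fix_cluster   # True where a sample belongs to a fixation
```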

  16. The Role of Temperature in Economic Exchange - An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Dimitrijević Bojan

    2015-09-01

    As a synthesis of economics and physics and an attempt to apply the methods and models of statistical physics to economics, econophysics is presently a new, growing and very dynamic branch of modern science. The subject of this paper is the relationship and interdependence between thermodynamics and economics, and it aims to show the similarities, analogies and correspondences between the main categories, methods and models of thermodynamics on the one hand and economics on the other. The paper analyses the relation between economics and thermodynamics: the probability distributions of the kinetic theory of gases as they correspond to the distributions of money, income and wealth, the connection between entropy and utility, and the principle of operation of a thermal engine as a model of economic exchange. The final part of the paper empirically analyses temperature differences in the exchange between Serbia and selected EU countries. There are differences in temperature between Serbia and the group of selected countries. The results of the empirical analysis show that exchange between countries follows principles analogous to those of thermodynamics and that developed countries generate more profit and benefit from the exchange.

  17. Nonparametric estimation of ultrasound pulses

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Leeman, Sidney

    1994-01-01

    An algorithm for nonparametric estimation of 1D ultrasound pulses in echo sequences from human tissues is derived. The technique is a variation of the homomorphic filtering technique using the real cepstrum, and the underlying basis of the method is explained. The algorithm exploits a priori...

  18. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of Müller and Stadtmüller [Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337; doi: 10.1214/aos/1018031100]

  19. Empirical Analysis of the Vegetable Industry in Hebei Province

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    We first introduce the status quo of the development of the vegetable industry in Hebei Province, and then conduct an empirical analysis of its development. Further, we analyze the development advantage of the vegetable industry in Hebei Province using Scale Advantage Indices (SAI) and Symmetric Comparative Advantage (SCA), concluding that the vegetable industry in Hebei Province has much room for development. At the same time, we analyze the factors influencing residents' vegetable consumption in Hebei Province through a regression model, concluding that the vegetable consumer price index is the main factor affecting consumption. Finally, we make the following recommendations for the development of the vegetable industry in Hebei Province: increase financial input and strengthen policy support; implement a brand strategy to promote the competitiveness of products; and improve the ecological environment to promote the industrialization of pollution-free vegetables.

  20. Empirical Analysis on Factors Influencing Distribution of Vegetal Production

    Institute of Scientific and Technical Information of China (English)

    WU, Wenjie

    2015-01-01

    Since the reform and opening-up, there has been a great change in the spatial pattern of China's vegetable production. This paper studies vegetable production in the provinces of China from 1978 to 2013. In its temporal characteristics, China's vegetable production area is constantly growing and exhibits distinct stages. In its spatial distribution, China's vegetable production shows a trend of moving south and west. In order to grasp the rules governing changes in vegetable production and their influencing factors, this paper makes a theoretical and empirical analysis of the factors that may influence the distribution of vegetable production. Results show that the major factors influencing the distribution of China's vegetable production include irrigation conditions, non-agricultural employment, market demand, knowledge spillover, comparative effectiveness, rural roads and government policies.

  1. ACCOUNTING POLICIES AND FINANCIAL ANALYSIS INTERDEPENDENCES - EMPIRICAL EVIDENCE

    Directory of Open Access Journals (Sweden)

    Nino Serdarević

    2011-06-01

    This paper presents empirical evidence on the interdependences between applied financial analysis and the accounting policies and estimates created within private commercial entities in Bosnia and Herzegovina (BIH), specifically targeting the practice-oriented relevance of financial indicators, non-financial indicators, enterprise resource planning and the frequency of management accounting insight. Recently, standard setters (the International Accounting Standards Board and the International Federation of Accountants) have published the outcomes of an internationally organized study of the usefulness of financial reports, recommending stronger usage of enterprise-relevant information, non-financial indicators and the implications of risks for asset and liability positions. These imply litigation and possible income smoothing. With regard to financial reporting reliability, many authors suggest accounting conservatism as a measure for risk assessment and the earnings response ratio. The author argues that recently suggested financial management measures involving cash and asset management, liquidity ratios and turnover do not directly imply the accounting information quality previously computed within applied accounting conservatism.

  2. Empirical Analysis: Business Cycles and Inward FDI in China

    Directory of Open Access Journals (Sweden)

    Qiyun Fang

    2007-01-01

    It is well known that the current speeding-up of globalization has, on the one hand, been spreading macroeconomic effects around the world while, on the other, fueling firms' cross-border activities. Are there any links between these two influences? In this paper, we choose China as our subject to try to clarify this. A set of models for Granger causality tests and VAR impulse responses is constructed, and econometric estimations and empirical analysis are carried out using the latest 20 years of official annual statistics. The findings clearly indicate that firms' foreign activities (inward FDI) respond pro-cyclically to business cycle developments in the long term.

  3. Empirical Analysis of Xinjiang's Bilateral Trade: Gravity Model Approach

    Institute of Scientific and Technical Information of China (English)

    CHEN Xuegang; YANG Zhaoping; LIU Xuling

    2008-01-01

    Based on the basic trade gravity model and Xinjiang's practical situation, new explanatory variables (GDP, per capita GDP and SCO) are introduced to build an extended trade gravity model fitting Xinjiang's bilateral trade. The empirical analysis of this model shows that those three variables affect Xinjiang's bilateral trade positively, whereas geographic distance is a significant factor influencing it negatively. Then, using the extended trade gravity model, this article quantitatively analyzes the trade situation between Xinjiang and its main trade partners in 2004. The results indicate that Xinjiang cooperates successfully with most of its trade partners in terms of present economic scale and development level. Xinjiang has successfully established trade partnerships with Central Asia, Central and Eastern Europe, Western Europe, East Asia and South Asia; however, the development of foreign trade with West Asia has been much slower. Finally, some suggestions for developing Xinjiang's foreign trade are put forward.

  4. Empirical Analysis and Automated Classification of Security Bug Reports

    Science.gov (United States)

    Tyo, Jacob P.

    2016-01-01

    With the ever-expanding amount of sensitive data being placed into computer systems, the need for effective cybersecurity is of utmost importance. However, there is a shortage of detailed empirical studies of security vulnerabilities from which cybersecurity metrics and best practices could be determined. This thesis has two main research goals: (1) to explore the distribution and characteristics of security vulnerabilities based on the information provided in bug tracking systems and (2) to develop data analytics approaches for the automatic classification of bug reports as security or non-security related. This work is based on three NASA datasets used as case studies. The empirical analysis showed that the majority of software vulnerabilities belong to only a small number of types; addressing these types of vulnerabilities will consequently lead to cost-efficient improvement of software security. Since this analysis requires labeling each bug report in the bug tracking system, we explored using machine learning to automate the classification of each bug report as security or non-security related (two-class classification), as well as of each security-related bug report as a specific security type (multiclass classification). In addition to supervised machine learning algorithms, a novel unsupervised machine learning approach is proposed. An accuracy of 92%, recall of 96%, precision of 92%, probability of false alarm of 4%, F-score of 81% and G-score of 90% were the best results achieved for two-class classification. Furthermore, an accuracy of 80%, recall of 80%, precision of 94%, and F-score of 85% were the best results achieved for multiclass classification.

  5. Nonparametric test for detecting change in distribution with panel data

    CERN Document Server

    Pommeret, Denys; Ghattas, Badih

    2011-01-01

    This paper considers the problem of comparing two processes with panel data. A nonparametric test is proposed for detecting a monotone change in the link between the two process distributions. The test statistic is of CUSUM type, based on the empirical distribution functions. The asymptotic distribution of the proposed statistic is derived and its finite-sample properties are examined by bootstrap procedures through Monte Carlo simulations.

  6. Measuring the Influence of Networks on Transaction Costs Using a Nonparametric Regression Technique

    DEFF Research Database (Denmark)

    Henningsen, Geraldine; Henningsen, Arne; Henning, Christian H.C.A.

    We empirically analyse the effect of networks on productivity using a cross-validated local linear non-parametric regression technique and a data set of 384 farms in Poland. Our empirical study generally supports our hypothesis that networks affect productivity. Large and dense trading networks...

  7. Empirical Analysis of Customer Behaviors in Chinese E-Commerce

    Directory of Open Access Journals (Sweden)

    Jinlong Wang

    2010-10-01

    With the burgeoning of e-business websites, e-commerce in China has been developing rapidly in recent years. From an analysis of the Chinese e-commerce market, it is possible to discover customer purchasing patterns and behavior characteristics, which are indispensable knowledge for the expansion of the Chinese e-commerce market. This paper presents an empirical analysis of sale transactions from the 360buy website, based on the analysis of time-interval distributions from the perspective of customers. Results reveal that in most situations the time intervals approximately obey a power-law distribution over two orders of magnitude. Additionally, the time interval between a customer's successive purchases can reflect how loyal the customer is to a specific product category. Moreover, we also find an interesting phenomenon in human behavior that could be related to customer psychology. In general, customers' requirements in different product categories are similar. The investigation into individual behaviors may help researchers understand how customers' group behaviors are generated.
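
    Power-law behavior of the kind reported here is commonly quantified with the continuous maximum-likelihood (Hill-type) estimator of the exponent, alpha = 1 + n / sum(ln(x_i / x_min)) over the tail x_i >= x_min. A self-contained sketch on synthetic Pareto data; the 360buy intervals themselves are not reproduced here.

```python
import numpy as np

def powerlaw_alpha(x, xmin):
    """Continuous MLE of the power-law exponent for the tail x >= xmin."""
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

# Pareto sample with true exponent 2.5 and xmin = 1 (inverse-CDF sampling)
rng = np.random.default_rng(0)
sample = (1.0 - rng.random(50000)) ** (-1.0 / 1.5)
print(round(powerlaw_alpha(sample, xmin=1.0), 2))   # close to 2.5
```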

  8. Satellite time series analysis using Empirical Mode Decomposition

    Science.gov (United States)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency; it can be used to smooth signals and to extract trends. It is built in a data-adaptive way and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data on a weekly basis over 10 years, giving 458 successive time steps. We have selected five different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method, and the energy of each mode in each region is normalised by the total energy computed over all modes.
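
    A minimal sketch of the decomposition step, assuming the PyEMD package (distributed on PyPI as EMD-signal); the signal below is synthetic, not the MERIS SPM series.

```python
import numpy as np
from PyEMD import EMD   # pip install EMD-signal

t = np.linspace(0.0, 1.0, 458)   # 458 weekly steps, as in the study
s = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 25 * t) + 2.0 * t

imfs = EMD().emd(s, t)   # modes ordered from high to low mean frequency
print(imfs.shape)        # the last, slowly varying row carries the trend
```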

  9. Non-Parametric Inference in Astrophysics

    CERN Document Server

    Wasserman, Larry; Miller, Christopher J.; Nichol, Robert C.; Genovese, Chris; Jang, Woncheol; Connolly, Andrew J.; Moore, Andrew W.; Schneider, Jeff; the PICA group

    2001-01-01

    We discuss non-parametric density estimation and regression for astrophysics problems. In particular, we show how to compute non-parametric confidence intervals for the location and size of peaks of a function. We illustrate these ideas with recent data on the Cosmic Microwave Background. We also briefly discuss non-parametric Bayesian inference.

  10. Parametric versus non-parametric simulation

    OpenAIRE

    Dupeux, Bérénice; Buysse, Jeroen

    2014-01-01

    Most ex-ante impact assessment policy models have been based on a parametric approach. We develop a novel non-parametric approach, called Inverse DEA. We use non-parametric efficiency analysis to determine the farm's technology and behaviour. Then, we compare the parametric approach and the Inverse DEA models against a known data-generating process. We use a bio-economic model as the data-generating process, reflecting a real-world situation where non-linear relationships often exist. Results s...

  11. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    Science.gov (United States)

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.
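
    One way to read the regression approach described above, simplified to an ordinary RKHS rather than the paper's hyper-RKHS: fit each column of the learned kernel matrix by kernel ridge regression under a base kernel, then evaluate the fitted functions at the new points. The base kernel, names, and regularization below are assumptions for illustration.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    """Gaussian base kernel between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def extend_kernel(X, K_learned, X_new, gamma=1.0, lam=1e-3):
    """Out-of-sample extension: kernel ridge regression of each column of
    the learned (nonparametric) kernel matrix, evaluated at X_new."""
    K0 = rbf(X, X, gamma)
    coef = np.linalg.solve(K0 + lam * np.eye(len(X)), K_learned)
    return rbf(X_new, X, gamma) @ coef   # shape (n_new, n_train)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K_learned = rbf(X, X, gamma=0.5)     # stand-in for a learned kernel matrix
X_new = rng.normal(size=(5, 3))
print(extend_kernel(X, K_learned, X_new).shape)   # (5, 50)
```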

  12. Sources of China’s Economic Growth: An Empirical Analysis Based on the BML Index with Green Growth Accounting

    Directory of Open Access Journals (Sweden)

    Minzhe Du

    2014-09-01

    This study develops a biennial Malmquist–Luenberger productivity index that is used to measure the sources of economic growth, utilizing data envelopment analysis and the directional distance function. Taking restrictions on resources and the environment into account based on the green growth accounting framework, we split economic growth into seven components: technical efficiency change, technological change, labor effect, capital effect, energy effect, output structure effect and environmental regulation effect. Further, we apply the Silverman test and the Li-Fan-Ullah nonparametric test in combination with kernel distributions to test the counterfactual contributions at the provincial level in China from 1998 to 2012. The empirical results show that: (1) technological progress and TFP make positive contributions to economic growth in China, while technical efficiency drags it down; (2) the output structure effect and CO2 emissions under environmental regulation restrain economic growth in some provinces; and (3) overall, physical capital accumulation is the most important driving force for economic take-off, irrespective of whether the government adopts environmental regulations.

  13. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  14. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
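
    The leave-out-one-cycle idea can be sketched directly: for each candidate period p, fold the series into cycles, predict every cycle by the mean of the remaining ones, and keep the p with the smallest cross-validated error. Multiples of the true period average over fewer cycles and so score worse, which is the implicit penalty mentioned above. A minimal version, not the authors' code:

```python
import numpy as np

def cv_period(x, max_period):
    """Leave-out-one-cycle CV estimate of an integer period."""
    scores = {}
    for p in range(2, max_period + 1):
        n_cyc = len(x) // p
        if n_cyc < 2:
            break
        cycles = x[:n_cyc * p].reshape(n_cyc, p)
        err = 0.0
        for i in range(n_cyc):
            pred = np.delete(cycles, i, axis=0).mean(axis=0)  # held-out cycle
            err += np.mean((cycles[i] - pred) ** 2)
        scores[p] = err / n_cyc
    return min(scores, key=scores.get), scores

t = np.arange(300)
x = np.sin(2 * np.pi * t / 12) + np.random.default_rng(0).normal(0, 0.3, 300)
print(cv_period(x, 40)[0])   # expect 12 for this synthetic series
```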

  15. Nonparametric Econometrics: The np Package

    Directory of Open Access Journals (Sweden)

    Tristen Hayfield

    2008-07-01

    We describe the R np package via a series of applications that may be of interest to applied econometricians. The np package implements a variety of nonparametric and semiparametric kernel-based estimators that are popular among econometricians. There are also procedures for nonparametric tests of significance and consistent model specification tests for parametric mean regression models and parametric quantile regression models, among others. The np package focuses on kernel methods appropriate for the mix of continuous, discrete, and categorical data often found in applied settings. Data-driven methods of bandwidth selection are emphasized throughout, though we caution the user that data-driven bandwidth selection methods can be computationally demanding.

  16. Astronomical Methods for Nonparametric Regression

    Science.gov (United States)

    Steinhardt, Charles L.; Jermyn, Adam

    2017-01-01

    I will discuss commonly used techniques for nonparametric regression in astronomy. We find that several of them, particularly running averages and running medians, are generically biased, asymmetric between dependent and independent variables, and perform poorly in recovering the underlying function, even when errors are present only in one variable. We then examine less-commonly used techniques such as Multivariate Adaptive Regressive Splines and Boosted Trees and find them superior in bias, asymmetry, and variance both theoretically and in practice under a wide range of numerical benchmarks. In this context the chief advantage of the common techniques is runtime, which even for large datasets is now measured in microseconds compared with milliseconds for the more statistically robust techniques. This points to a tradeoff between bias, variance, and computational resources which in recent years has shifted heavily in favor of the more advanced methods, primarily driven by Moore's Law. Along these lines, we also propose a new algorithm which has better overall statistical properties than all techniques examined thus far, at the cost of significantly worse runtime, in addition to providing guidance on choosing the nonparametric regression technique most suitable to any specific problem. We then examine the more general problem of errors in both variables and provide a new algorithm which performs well in most cases and lacks the clear asymmetry of existing non-parametric methods, which fail to account for errors in both variables.

  17. Nonparametric inference of network structure and dynamics

    Science.gov (United States)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high-dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion that does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among

  18. An Empirical Analysis of Foreign Direct Investment in Pakistan

    Directory of Open Access Journals (Sweden)

    Akbar Minhas

    2015-04-01

    The aim of this paper is to explore the trends in Foreign Direct Investment (FDI) inflows in Pakistan and to identify the key determinants of FDI for the period 2000-2013. The country experienced a continuous surge in FDI inflows from 2000 to 2008; by contrast, the period 2009-2013 was characterized by a persistent decline in FDI, a slump mainly attributed to political and economic instability as well as the poor law-and-order situation in the country. Keeping these periods with differing results in perspective, multiple regression analysis is employed to empirically analyze the key determinants expected to explain variation in FDI in Pakistan. The selected variables were found to be significant determinants of FDI in Pakistan: Gross Domestic Product (GDP), degree of trade openness and a regime of dictatorship have a significant positive effect on FDI, while terrorist attacks, foreign debt, the exchange rate, political instability, and domestic capital formation are significant negative determinants of FDI inflows. Considering the dynamic changes in broad macro factors in the economy, this study provides a fresh perspective on the factors that determine FDI in Pakistan, and its findings offer important insights to policy makers designing measures to enhance FDI inflows.

  1. Separating environmental efficiency into production and abatement efficiency. A nonparametric model with application to U.S. power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hampf, Benjamin

    2011-08-15

    In this paper we present a new approach to evaluate the environmental efficiency of decision making units. We propose a model that describes a two-stage process consisting of a production and an end-of-pipe abatement stage with the environmental efficiency being determined by the efficiency of both stages. Taking the dependencies between the two stages into account, we show how nonparametric methods can be used to measure environmental efficiency and to decompose it into production and abatement efficiency. For an empirical illustration we apply our model to an analysis of U.S. power plants.

  2. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus).

    Directory of Open Access Journals (Sweden)

    Steven M Carr

    -stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of the maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed.
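
    The two-sample K-S comparison applied to these cumulative difference curves is directly available in SciPy; the samples below are hypothetical stand-ins, not the mitogenome data.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# stand-ins for pairwise sequence differences in two breeding populations
greenland_sea = rng.gamma(2.0, 2.0, 200)   # hypothetically younger structure
white_sea = rng.gamma(3.0, 2.0, 200)

stat, p = ks_2samp(greenland_sea, white_sea)
print(stat, p)   # D is the maximum gap between the two empirical CDFs
```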

  3. Towards MOOC for Technical Courses: A Blended Learning Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Siti Feirusz Ahmad Fesol

    2016-12-01

    Massive Open Online Courses (MOOCs) are one of the most rapidly growing and trending online learning platforms throughout the world. As reported by Class Central, up until December 2015 there were more than 4200 courses, which enrolled more than 35 million students and were adopted by more than 500 universities all over the world. The objective of this study is to identify students' readiness for MOOC technical courses based on a blended learning approach. The study adopted a quantitative approach to analyse the data gathered. Descriptive analysis and factor analysis are used to empirically analyse a total of 39 items on student attitudes towards blended learning. The study developed six dimensions of student attitude towards the implementation of MOOC learning, namely attitudes towards learning flexibility, online learning, study management, technology, online interaction, and classroom learning. The findings show that when students had a positive attitude towards learning flexibility, online learning, study management, technology, and online interaction, they were more likely to adapt to blended learning and were highly ready for MOOC learning. On the other hand, when students had a positive attitude towards classroom learning, they were less ready for MOOC learning, as they would prefer to meet their lecturers and friends in a physical lecture class rather than on the web. Understanding students' readiness for MOOC learning based on a blended learning approach is one of the critical success factors for the implementation of successful MOOCs by higher learning institutions.

  4. An Empirical Analysis of Money Supply Process in Nepal

    OpenAIRE

    Prakash Kumar Shrestha Ph.D.

    2013-01-01

    This paper empirically examines the money supply process in Nepal from both mainstream and Post-Keynesian theoretical perspectives for the pre- and post-liberalization periods, covering the sample period 1965/66-2009/10. The relative contributions of the different components of the money supply are computed, and both the money supply and the money multiplier functions are estimated. Empirical results show that disposable high-powered money is found to be a major contributor to the change ...

  5. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    Science.gov (United States)

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library.
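
    The frequentist half of this comparison is easy to state concretely: the basic Good-Turing estimate of the discovery probability, the chance that the next draw is a previously unseen species, is N1/n, where N1 counts the species observed exactly once. A minimal sketch:

```python
from collections import Counter

def good_turing_discovery(sample):
    """Good-Turing estimate of the probability that the next observation
    is a new species: N1 / n, with N1 the number of singleton species."""
    counts = Counter(sample)
    n = sum(counts.values())
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / n

print(good_turing_discovery("abracadabra"))   # 2 singletons / 11 draws
```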

  7. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.
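
    Tikhonov regularization, the estimation device used here, reduces in the finite-dimensional linear case to a penalized least-squares problem with a closed-form solution. A minimal sketch of that special case (a plain ridge-type solve, not the paper's function-space estimator with shape constraints):

```python
import numpy as np

# Tikhonov-regularized solution of an ill-conditioned linear system A x = b:
# minimize ||A x - b||^2 + alpha * ||x||^2, solved in closed form.
rng = np.random.default_rng(2)
A = rng.normal(size=(100, 50))
A[:, 1] = A[:, 0] + 1e-6 * rng.normal(size=100)   # near-collinear columns
x_true = rng.normal(size=50)
b = A @ x_true + 0.1 * rng.normal(size=100)

alpha = 1.0   # regularization strength; in practice chosen data-dependently
x_hat = np.linalg.solve(A.T @ A + alpha * np.eye(50), A.T @ b)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error of regularized solution: {err:.3f}")
```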

  8. Competition in the German pharmacy market: an empirical analysis.

    Science.gov (United States)

    Heinsohn, Jörg G; Flessa, Steffen

    2013-10-10

    Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, the main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there is a lack of empirical proof of a stronger orientation of German public pharmacies towards competition thus far. This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy market. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German states and East German states) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. This study provides evidence that the German healthcare reforms aimed at increasing competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an offensive and competitive stance are quite ...

  9. Competition in the German pharmacy market: an empirical analysis

    Science.gov (United States)

    2013-01-01

    Background Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, the main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there is a lack of empirical proof of a stronger orientation of German public pharmacies towards competition thus far. Methods This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy market. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German states and East German states) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. Results The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. Conclusions This study provides evidence that the German healthcare reforms aimed at increasing competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an ...

  10. An empirical analysis of cigarette demand in Argentina

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2014-01-01

    Objective To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Method Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. Results The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to −0.31, indicating a 10% increase in the growth of real income led to an increase in cigarette consumption of 4.3% and a 10% increase in the price produced a fall of 3.1% in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was −0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Conclusion Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy. PMID:23760657

  11. An empirical analysis of cigarette demand in Argentina.

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2015-01-01

    To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to -0.31, indicating a 10% increase in the growth of real income led to an increase in cigarette consumption of 4.3% and a 10% increase in the price produced a fall of 3.1% in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was -0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy.
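
    Elasticities like those reported in these two records are commonly read off a log-log regression, in which the coefficient on log price is the price elasticity and the coefficient on log income is the income elasticity. A sketch on synthetic data (the paper's actual analysis uses cointegration and a vector error correction model, which this does not reproduce):

```python
import numpy as np

# Synthetic log-log demand regression: log(consumption) on log(price) and
# log(income). The slope coefficients are the respective elasticities.
rng = np.random.default_rng(3)
T = 200
log_price = rng.normal(0, 0.2, T).cumsum() * 0.1 + 1.0
log_income = rng.normal(0, 0.1, T).cumsum() * 0.1 + 2.0
log_cons = 0.5 - 0.31 * log_price + 0.43 * log_income + rng.normal(0, 0.05, T)

X = np.column_stack([np.ones(T), log_price, log_income])
beta, *_ = np.linalg.lstsq(X, log_cons, rcond=None)
print(f"estimated price elasticity:  {beta[1]:.2f}")   # ~ -0.31
print(f"estimated income elasticity: {beta[2]:.2f}")   # ~  0.43
```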

  12. Nonparametric regression with filtered data

    CERN Document Server

    Linton, Oliver; Nielsen, Jens Perch; Van Keilegom, Ingrid; 10.3150/10-BEJ260

    2011-01-01

    We present a general principle for estimating a regression function nonparametrically, allowing for a wide variety of data filtering, for example, repeated left truncation and right censoring. Both the mean and the median regression cases are considered. The method works by first estimating the conditional hazard function or conditional survivor function and then integrating. We also investigate improved methods that take account of model structure such as independent errors and show that such methods can improve performance when the model structure is true. We establish the pointwise asymptotic normality of our estimators.

  13. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable for application to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators in identifying differences in standing posture between groups.

  14. Equity and efficiency in private and public education: a nonparametric comparison

    NARCIS (Netherlands)

    L. Cherchye; K. de Witte; E. Ooghe; I. Nicaise

    2007-01-01

    We present a nonparametric approach for the equity and efficiency evaluation of (private and public) primary schools in Flanders. First, we use a nonparametric (Data Envelopment Analysis) model that is specially tailored to assess educational efficiency at the pupil level. The model accounts for the ...
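
    Data Envelopment Analysis, the tool named in this record, scores each unit by solving a small linear program. A minimal sketch of the classical input-oriented CCR model with scipy, on toy data rather than the pupil-level model of the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA: each unit's efficiency score theta is the
# smallest radial contraction of its inputs such that a nonnegative
# combination of all units still dominates it. Toy data: 5 units,
# 2 inputs, 1 output.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0], [4.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.5], [2.0], [1.2]])
n, m = X.shape
s = Y.shape[1]

for o in range(n):
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):   # inputs:  sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):   # outputs: sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    print(f"unit {o}: efficiency = {res.x[0]:.3f}")
```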

  15. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans ...

  17. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables. This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp ...

  18. What Is a Reference Book? A Theoretical and Empirical Analysis.

    Science.gov (United States)

    Bates, Marcia J.

    1986-01-01

    Provides a definition of reference books based on their organizational structure and describes an empirical study which was conducted in three libraries to identify types of book organization and determine their frequency in reference departments and stack collections. The data are analyzed and shown to support the definition. (EM)

  19. WHAT FACTORS INFLUENCE QUALITY SERVICE IMPROVEMENT IN MONTENEGRO: EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Djurdjica Perovic

    2013-03-01

    Full Text Available In this paper, using an Ordinary Least Squares (OLS) regression, we investigate whether intangible elements influence tourists' perception of service quality. Our empirical results, based on a tourist survey in Montenegro, indicate that intangible elements of the tourism product have a positive impact on tourists' overall perception of service quality in Montenegro.

  20. An empirical analysis of asset-backed securitization

    NARCIS (Netherlands)

    Vink, D.; Thibeault, A.

    2007-01-01

    In this study we provide empirical evidence demonstrating a relationship between the nature of the assets and the primary market spread. The model also provides predictions on how other pricing characteristics affect spread, since little is known about how and why spreads of asset-backed securities ...

  2. Estimation of Spatial Dynamic Nonparametric Durbin Models with Fixed Effects

    Science.gov (United States)

    Qian, Minghui; Hu, Ridong; Chen, Jianwei

    2016-01-01

    Spatial panel data models have been widely studied and applied in both scientific and social science disciplines, especially in the analysis of spatial influence. In this paper, we consider the spatial dynamic nonparametric Durbin model (SDNDM) with fixed effects, which takes nonlinear factors into account based on the spatial dynamic panel…

  3. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    With reference to a specific data set, we consider how to perform a flexible non-parametric Bayesian analysis of an inhomogeneous point pattern modelled by a Markov point process, with a location dependent first order term and pairwise interaction only. A priori we assume that the first order term...

  4. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction but data rarely satisfy testable conditions. To overcome this we provide a way to estimate the homothetic efficiency of consumption ...

  5. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status; the membership of water quality in the various output fuzzy sets or classes is provided with percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies.
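
    The non-parametric step described here (kernel density estimates of the monitoring variables feeding a Monte Carlo simulation) can be sketched with scipy's Gaussian KDE; the fuzzy inference stage is omitted and the data are synthetic:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Fit a kernel density to a synthetic water-quality variable (e.g.
# dissolved oxygen), then draw Monte Carlo samples from the fitted
# density as inputs for a downstream index model (fuzzy step omitted).
rng = np.random.default_rng(4)
observed_do = np.concatenate([rng.normal(7.5, 0.8, 80),
                              rng.normal(4.0, 0.6, 40)])   # bimodal, mg/L

kde = gaussian_kde(observed_do)          # bandwidth by Scott's rule
mc_samples = kde.resample(10_000, seed=5).ravel()
print(f"MC mean = {mc_samples.mean():.2f} mg/L, "
      f"5th-95th pct = {np.percentile(mc_samples, [5, 95]).round(2)}")
```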

  6. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, E.

    2008-09-15

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, whereas for CO2 the curve is monotonically increasing. For GHG there is an indication of an inverted U-shaped relationship between income and pollution, but most countries are still on the increasing branch of the curve and hence its future development is uncertain. For SOx it was found that emissions follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.
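
    An Environmental Kuznets Curve test of this kind typically regresses emissions on income and squared income; an inverted U requires a positive linear and a negative quadratic coefficient, with the turning point at -b1/(2*b2). A small illustrative sketch on simulated data:

```python
import numpy as np

# Fit pollution = b0 + b1*income + b2*income^2; an inverted-U EKC needs
# b1 > 0 and b2 < 0, with the turning point at income* = -b1 / (2 * b2).
rng = np.random.default_rng(6)
income = rng.uniform(5, 50, 300)                       # GDP per capita, kUSD
pollution = 2 + 0.8 * income - 0.012 * income**2 + rng.normal(0, 1, 300)

X = np.column_stack([np.ones_like(income), income, income**2])
b0, b1, b2 = np.linalg.lstsq(X, pollution, rcond=None)[0]
if b1 > 0 and b2 < 0:
    print(f"inverted U; turning point at income ~ {-b1 / (2 * b2):.1f}")
else:
    print("no inverted-U relationship detected")
```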

  7. Lagrangian analysis of fluid transport in empirical vortex ring flows

    OpenAIRE

    Shadden, Shawn C.; Dabiri, John O.; Marsden, Jerrold E.

    2006-01-01

    In this paper we apply dynamical systems analyses and computational tools to fluid transport in empirically measured vortex ring flows. Measurements of quasisteadily propagating vortex rings generated by a mechanical piston-cylinder apparatus reveal lobe dynamics during entrainment and detrainment that are consistent with previous theoretical and numerical studies. In addition, the vortex ring wake of a free-swimming Aurelia aurita jellyfish is measured and analyzed in the framework of dynami...

  8. Islamic Banks and Financial Stability; An Empirical Analysis

    OpenAIRE

    Martin Cihak; Heiko Hesse

    2008-01-01

    The relative financial strength of Islamic banks is assessed empirically based on evidence covering individual Islamic and commercial banks in 18 banking systems with a substantial presence of Islamic banking. We find that (i) small Islamic banks tend to be financially stronger than small commercial banks; (ii) large commercial banks tend to be financially stronger than large Islamic banks; and (iii) small Islamic banks tend to be financially stronger than large Islamic banks, which may refle...

  9. Tax morale : theory and empirical analysis of tax compliance

    OpenAIRE

    Torgler, Benno

    2003-01-01

    Tax morale is puzzling in our society. Observations show that tax compliance cannot be satisfactorily explained by the level of enforcement. Other factors may well be relevant. This paper contains a short survey of important theoretical and empirical findings in the tax morale literature, focussing on personal income tax morale. The following three key topics are discussed: moral sentiments, fairness and the relationship between taxpayer and government. The survey stresses the ...

  10. An empirical analysis of after sales service and customer satisfaction

    OpenAIRE

    Hussain, Nazim; Waheed Akbar BHATTI; Azhar JILANI

    2011-01-01

    In today’s ever-changing competitive environment, businesses cannot survive unless they satisfy their customers. The delivery of after-sales service by a company is critical in satisfying customer needs and perceptions. In order to deliver quality after-sales service, a proper delivery system has to be in place. This is an empirical study on the after-sales service quality of a Pakistani automotive battery manufacturer. The research measured the quality of service at Atlas Battery, selling product with the bran...

  12. Credit risk determinants analysis: Empirical evidence from Chinese commercial banks

    OpenAIRE

    LU, ZONGQI

    2013-01-01

    Abstract In order to investigate the potential determinants of credit risk in Chinese commercial banks, a panel dataset including 342 bank-year observations from 2003 to 2012 on Chinese commercial banks is used to quantify the relationship between the selected variables and Chinese banks' credit risk. Based on several robustness tests, the empirical results suggest that the inflation rate and the loan loss provision are significantly positively related to Chinese commercial banks' credit risk; on the other hand, m ...

  13. The Rules of Standard Setting Organizations: an Empirical Analysis

    OpenAIRE

    Chiao, Benjamin; Lerner, Josh; Tirole, Jean

    2006-01-01

    This paper empirically explores the procedures employed by standard-setting organizations. Consistent with Lerner-Tirole (2004), we find (a) a negative relationship between the extent to which an SSO is oriented to technology sponsors and the concession level required of sponsors and (b) a positive correlation between the sponsor-friendliness of the selected SSO and the quality of the standard. We also develop and test two extensions of the earlier model: the presence of provisions mandating ...

  14. Nonparametric Regression with Common Shocks

    Directory of Open Access Journals (Sweden)

    Eduardo A. Souza-Rodrigues

    2016-09-01

    Full Text Available This paper considers a nonparametric regression model for cross-sectional data in the presence of common shocks. Common shocks are allowed to be very general in nature; they do not need to be finite dimensional with a known (small) number of factors. I investigate the properties of the Nadaraya-Watson kernel estimator and determine how general the common shocks can be while still obtaining meaningful kernel estimates. Restrictions on the common shocks are necessary because kernel estimators typically manipulate conditional densities, and conditional densities do not necessarily exist in the present case. By appealing to disintegration theory, I provide sufficient conditions for the existence of such conditional densities and show that the estimator converges in probability to the Kolmogorov conditional expectation given the sigma-field generated by the common shocks. I also establish the rate of convergence and the asymptotic distribution of the kernel estimator.
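
    The Nadaraya-Watson estimator under study is a kernel-weighted local average of the responses. A minimal i.i.d. sketch, without the common-shock structure that is the article's actual focus:

```python
import numpy as np

# Nadaraya-Watson kernel regression: m_hat(x) is a weighted average of the
# Y_i, with Gaussian-kernel weights K((x - X_i)/h) that decay with distance.
rng = np.random.default_rng(7)
X = rng.uniform(-2, 2, 400)
Y = np.sin(X) + rng.normal(0, 0.3, 400)

def nw_estimate(x0, X, Y, h=0.2):
    w = np.exp(-0.5 * ((x0 - X) / h) ** 2)   # Gaussian kernel weights
    return np.sum(w * Y) / np.sum(w)

grid = np.linspace(-2, 2, 9)
print([round(nw_estimate(g, X, Y), 2) for g in grid])   # ~ sin(grid)
```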

  15. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks, showing how models for complex networks can be derived and pointing out relevant literature.

  16. An asymptotically optimal nonparametric adaptive controller

    Institute of Scientific and Technical Information of China (English)

    郭雷; 谢亮亮

    2000-01-01

    For discrete-time nonlinear stochastic systems with unknown nonparametric structure, a kernel-estimation-based nonparametric adaptive controller is constructed based on the truncated certainty equivalence principle. Global stability and asymptotic optimality of the closed-loop systems are established without resorting to any external excitations.

  17. Testing for a constant coefficient of variation in nonparametric regression

    OpenAIRE

    Dette, Holger; Marchlewski, Mareen; Wagener, Jens

    2010-01-01

    In the common nonparametric regression model Y_i = m(X_i) + sigma(X_i) epsilon_i we consider the problem of testing the hypothesis that the coefficient of variation, i.e. the ratio of the scale and location functions, is constant. The test is based on a comparison of the observations Y_i/\hat{sigma}(X_i) with their mean by a smoothed empirical process, where \hat{sigma} denotes the local linear estimate of the scale function. We show weak convergence of a centered version of this process to a Gaussian process under the null ...

  18. Nonparametric survival analysis of traffic congestion duration time

    Institute of Scientific and Technical Information of China (English)

    杨小宝; 周映雪

    2013-01-01

    The hazard-based traffic congestion duration model was established through the survival analysis method. Based on empirical traffic flow data from the Third Ring Road in Beijing, the traffic congestion duration on the Third Ring Road was estimated. The results show that 56% of the congestion durations of road segments on the Third Ring Road are not longer than four minutes, and 90% are not longer than 12 minutes. The hazard rate is less than 10% once the duration exceeds 12 minutes. Congestion occurs more frequently on the outer ring than on the inner ring, while congestion on the inner ring lasts longer than on the outer ring.
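
    The non-parametric survival step can be sketched with the Kaplan-Meier estimator, here via the third-party lifelines package and synthetic durations rather than the Beijing data:

```python
import numpy as np
from lifelines import KaplanMeierFitter   # third-party: pip install lifelines

# Synthetic congestion durations (minutes); observed == False means the
# episode was still ongoing when observation stopped (right-censored).
rng = np.random.default_rng(8)
durations = rng.exponential(5.0, 200).round(1) + 1
observed = rng.uniform(size=200) < 0.9

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
# Probability that a congestion episode lasts longer than 4 and 12 minutes:
print(kmf.survival_function_at_times([4, 12]))
```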

  19. OTC Derivatives and Global Economic Activity: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Gordon Bodnar

    2017-06-01

    Full Text Available That the global market for derivatives has expanded beyond recognition is well known. What is not known is how this market interacts with economic activity. We provide the first empirical characterization of the interdependencies between OECD economic activity and the global OTC derivatives market. To this end, we apply a vector error-correction model to OTC derivatives disaggregated across instruments and counterparties. The results indicate that, with one exception, the heterogeneity of OTC contracts is too pronounced to be reliably summarized by our measures of economic activity. The one exception is interest-rate derivatives held by Other Financial Institutions.

  20. Impact Factors of Energy Productivity in China: An Empirical Analysis

    Institute of Scientific and Technical Information of China (English)

    Wei Chu; Shen Manhong

    2007-01-01

    This article developed a decomposition model of energy productivity on the basis of the economic growth model. Four factors that may influence China's energy productivity were considered in this model: technology improvement, resource allocation structure, industrial structure and institutional arrangement. An econometric model was then employed to test the four factors empirically on the basis of China's statistical data from 1978 to 2004. Results indicated that capital deepening contributes the most (207%) to energy efficiency improvement, and that the impact of labor forces (13%) is the weakest among the resource factors; industrial structure (7%) and institutional innovation (9.5%) positively improve energy productivity.

  1. Empirical analysis on temporal statistics of human correspondence patterns

    Science.gov (United States)

    Li, Nan-Nan; Zhang, Ning; Zhou, Tao

    2008-11-01

    Recently, extensive empirical evidence has shown that the timing of human behaviors obeys non-Poisson statistics with heavy-tailed interevent time distributions. In this paper, we empirically study the correspondence pattern of a great Chinese scientist, Hsue-Shen Tsien. Both the interevent time distribution and the response time distribution deviate from Poisson statistics, showing an approximate power-law decay. The two power-law exponents are more or less the same (about 2.1), which strongly supports the hypothesis in [A. Vázquez, J.G. Oliveira, Z. Dezsö, K.-I. Goh, I. Kondor, A.-L. Barabási, Phys. Rev. E 73 (2006) 036127] that the response time distribution of the tasks could in fact drive the interevent time distribution, and that both distributions should decay with the same exponent. Our result is against the claim in the same reference, which suggests the human correspondence pattern belongs to a universality class with exponent 1.5.

  2. Nonparametric estimation of the stationary M/G/1 workload distribution function

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted

    2005-01-01

    In this paper it is demonstrated how a nonparametric estimator of the stationary workload distribution function of the M/G/1 queue can be obtained by systematically sampling the workload process. Weak convergence results and bootstrap methods for empirical distribution functions for stationary associ ...

  3. The Impact of ICT on Educational Performance and its Efficiency in Selected EU and OECD Countries: A Non-Parametric Analysis

    Science.gov (United States)

    Aristovnik, Aleksander

    2012-01-01

    The purpose of the paper is to review previous research examining ICT efficiency and the impact of ICT on educational output/outcome, as well as different conceptual and methodological issues related to performance measurement. Moreover, a definition, measurements and the empirical application of a model measuring the efficiency of ICT use…

  4. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  5. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than in adults with Parkinson's when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep-wake cycle fragmentation and synchronization.

  6. Nonparametric methods in actigraphy: An update

    Science.gov (United States)

    Gonçalves, Bruno S.B.; Cavalcanti, Paula R.A.; Tavares, Gracilene R.; Campos, Tania F.; Araujo, John F.

    2014-01-01

    Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than in adults with Parkinson's when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep-wake cycle fragmentation and synchronization. PMID:26483921
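
    The two variables discussed in these records have compact standard definitions: interdaily stability (IS) compares the variance of the average 24-hour profile to the total variance, while intradaily variability (IV) compares the mean squared successive difference to the total variance. A sketch on synthetic hourly data, ignoring the normalization constants that vary slightly across the literature:

```python
import numpy as np

def interdaily_stability(x, period=24):
    # IS: variance of the mean 24-h profile divided by total variance.
    profile = x.reshape(-1, period).mean(axis=0)
    return profile.var() / x.var()

def intradaily_variability(x):
    # IV: mean squared successive difference divided by total variance.
    return np.mean(np.diff(x) ** 2) / x.var()

# Synthetic hourly activity: 7 days with a clear 24-h rhythm plus noise.
rng = np.random.default_rng(9)
hours = np.arange(7 * 24)
activity = 50 + 40 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 10, hours.size)

print(f"IS = {interdaily_stability(activity):.2f}")    # near 1: stable rhythm
print(f"IV = {intradaily_variability(activity):.2f}")  # low: little fragmentation
```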

  7. Detection of Decreasing Vegetation Cover Based on Empirical Orthogonal Function and Temporal Unmixing Analysis

    OpenAIRE

    Di Xu; Ruishan Chen; Xiaoshi Xing; Wenpeng Lin

    2017-01-01

    Vegetation plays an important role in the energy exchange of the land surface, biogeochemical cycles, and hydrological cycles. MODIS (MODerate-resolution Imaging Spectroradiometer) EVI (Enhanced Vegetation Index) is considered as a quantitative indicator for examining dynamic vegetation changes. This paper applied a new method of integrated empirical orthogonal function (EOF) and temporal unmixing analysis (TUA) to detect the vegetation decreasing cover in Jiangsu Province of China. The empir...

  8. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-10-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the method proposed in this study. A large number of studies have used DEA as a benchmarking tool to measure service quality. These models do not propose a coherent performance evaluation construct and consequently fail to deliver improvement guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  9. Security Governance – An Empirical Analysis of the Norwegian Context

    Directory of Open Access Journals (Sweden)

    Martin Nøkleberg

    2016-05-01

    Full Text Available This article explores local security governance in the city of Bergen, and thus highlights what characterizes security governance within a Norwegian context. The burgeoning policing literature suggests that we live in a pluralized and networked society; ideas of cooperation have thus been perceived as important for the effectiveness of security governance. Cooperative relations between public and private actors are the main focus of this article, and such arrangements are empirically explored in the city of Bergen. These relations are explored on the basis of the theoretical frameworks of state-anchored pluralism and nodal governance. The key finding is that there seems to be an unfulfilled potential in the security governance of Bergen. The public police have difficulties cooperating with, and exploiting the potential possessed by, the private security industry. It is suggested that these difficulties are related to a mentality problem within the police institution, derived from nodal governance: the police are influenced by a punishment mentality and view themselves as the only actor which can and should maintain security.

  10. An Empirical Analysis on Credit Risk Models and its Application

    Directory of Open Access Journals (Sweden)

    Joocheol Kim

    2014-08-01

    Full Text Available This study focuses on introducing credit default risk with widely used credit risk models, in an effort to empirically test whether the models hold their validity, to apply them to financial institutions, which usually are highly levered with various types of debt, and finally to reinterpret the results in computing an adequate collateral level in the over-the-counter derivatives market. By calculating distance-to-default values using historical market data for South Korean banks and brokerage firms, as suggested in the Merton model and KMV's EDF model, we find that the performance of the introduced models reflects well the credit quality of the sampled financial institutions. Moreover, we suggest that in addition to the given credit ratings of different financial institutions, their distance-to-default values can be utilized in determining a sufficient level of credit support. Our suggested "smoothened" collateral level allows both contractual parties to minimize the costs caused by provision of collateral without undertaking additional credit risk, and to achieve efficient collateral management.
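
    The distance-to-default in the Merton model has a closed form once the asset value V, asset volatility sigma and debt face value D are fixed: DD = (ln(V/D) + (mu - sigma^2/2)T) / (sigma * sqrt(T)). A sketch with illustrative numbers, not figures from the study:

```python
import math
from statistics import NormalDist

def distance_to_default(V, D, mu, sigma, T=1.0):
    # Merton-model distance to default over horizon T (in years).
    return (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))

# Illustrative balance-sheet figures, not estimates from the study.
V, D = 120.0, 100.0      # market value of assets vs face value of debt
mu, sigma = 0.05, 0.25   # asset drift and asset volatility

dd = distance_to_default(V, D, mu, sigma)
pd_merton = NormalDist().cdf(-dd)   # implied physical default probability
print(f"DD = {dd:.2f}, implied default probability = {pd_merton:.2%}")
```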

  11. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for the univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, HMM is used to classify data in crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to rub out the delay between regime switching and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power laws and GARCH models. The empirical results show that the regime switching model increases predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of power laws model while remaining practical to implement the VaR measurement.
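
    The EVT stage of such a model can be illustrated with a peaks-over-threshold fit: losses above a high threshold are modeled by a generalized Pareto distribution, which is then inverted for a tail quantile (VaR). A sketch with scipy on simulated heavy-tailed returns; the HMM regime-classification stage is omitted:

```python
import numpy as np
from scipy.stats import genpareto

# Peaks-over-threshold: fit a generalized Pareto distribution (GPD) to
# losses exceeding a high threshold u, then invert for the q-quantile VaR.
rng = np.random.default_rng(10)
losses = rng.standard_t(df=4, size=5000) * 0.01          # heavy-tailed returns
u = np.quantile(losses, 0.95)                            # 95% threshold
excess = losses[losses > u] - u
xi, _, beta = genpareto.fit(excess, floc=0)              # shape xi, scale beta

q = 0.99
n, n_u = losses.size, excess.size
var_q = u + (beta / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
print(f"GPD-based VaR(99%) = {var_q:.4f}")
print(f"empirical 99% quantile = {np.quantile(losses, q):.4f}")
```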

  12. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how....... For this purpose non-parametric methods together with additive models are suggested. Also, a new approach specifically designed to detect non-linearities is introduced. Confidence intervals are constructed by use of bootstrapping. As a link between non-parametric and parametric methods a paper dealing with neural...... the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...

  13. Gold price analysis based on ensemble empirical mode decomposition and independent component analysis

    Science.gov (United States)

    Xian, Lu; He, Kaijian; Lai, Kin Keung

    2016-07-01

    In recent years, the increasing volatility of the gold price has received growing attention from academia and industry alike. Due to the complexity and significant fluctuations observed in the gold market, however, most current approaches have failed to produce robust and consistent modeling and forecasting results. Ensemble Empirical Mode Decomposition (EEMD) and Independent Component Analysis (ICA) are novel data analysis methods that can deal with nonlinear and non-stationary time series. This study introduces a new methodology which combines the two methods and applies it to gold price analysis. This involves three steps: firstly, the original gold price series is decomposed into several Intrinsic Mode Functions (IMFs) by EEMD. Secondly, the IMFs are further processed, with unimportant ones re-grouped, and a new set of data called Virtual Intrinsic Mode Functions (VIMFs) is reconstructed. Finally, ICA is used to decompose the VIMFs into statistically Independent Components (ICs). The decomposition results reveal that the gold price series can be represented by a linear combination of the ICs. Furthermore, the economic meanings of the ICs are analyzed and discussed in detail, according to their trends and transformation coefficients. The analyses not only explain the inner driving factors and their impacts but also provide an in-depth analysis of how these factors affect the gold price. At the same time, regression analysis has been conducted to verify our analysis. Results from the empirical studies of the gold markets show that EEMD-ICA serves as an effective technique for gold price analysis from a new perspective.
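
    A sketch of the two-stage pipeline, assuming the third-party PyEMD package (distributed on PyPI as EMD-signal) for the EEMD step and scikit-learn's FastICA for the independent components; the series is synthetic, and the regrouping into virtual IMFs is skipped:

```python
import numpy as np
from PyEMD import EEMD                   # third-party: pip install EMD-signal
from sklearn.decomposition import FastICA

# Synthetic "price" series: trend + two oscillations + noise.
rng = np.random.default_rng(11)
t = np.linspace(0, 1, 512)
price = (0.5 * t + 0.2 * np.sin(2 * np.pi * 25 * t)
         + 0.3 * np.sin(2 * np.pi * 5 * t) + 0.05 * rng.normal(size=t.size))

# Step 1: decompose into intrinsic mode functions with EEMD.
imfs = EEMD(trials=50, noise_width=0.05).eemd(price, t)

# Step 2: run ICA on the IMFs (regrouping into "virtual" IMFs skipped here)
# to obtain statistically independent components.
ica = FastICA(n_components=min(3, imfs.shape[0]), random_state=0)
ics = ica.fit_transform(imfs.T)          # shape: (time, components)
print(f"{imfs.shape[0]} IMFs -> {ics.shape[1]} independent components")
```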

  14. Bayesian nonparametric duration model with censorship

    Directory of Open Access Journals (Sweden)

    Joseph Hakizamungu

    2007-10-01

    Full Text Available This paper is concerned with nonparametric i.i.d. duration models with censored observations, and we establish, by a simple and unified approach, the general structure of a Bayesian nonparametric estimator for a survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.

  15. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlation analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both ... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also, the analysis indicates a phase lag of approximately one month between the SST and SSH fields.
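
    Canonical correlation analysis of two gridded fields can be sketched by flattening each field to a (time, grid points) matrix and asking for the pair of linear combinations with maximal temporal correlation. A sketch with scikit-learn on synthetic stand-ins for the SST and SSH fields:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Two synthetic "fields" sharing one temporal mode (a stand-in for coupled
# SST/SSH variability): rows are months, columns are grid points.
rng = np.random.default_rng(12)
months, p_sst, p_ssh = 120, 15, 10
shared = np.sin(np.linspace(0, 8 * np.pi, months))        # common signal
sst = np.outer(shared, rng.normal(size=p_sst)) + 0.5 * rng.normal(size=(months, p_sst))
ssh = np.outer(shared, rng.normal(size=p_ssh)) + 0.5 * rng.normal(size=(months, p_ssh))

# CCA finds the pair of linear combinations (one per field) with maximal
# temporal correlation -- here it should recover the shared mode.
cca = CCA(n_components=1)
u, v = cca.fit_transform(sst, ssh)
r = np.corrcoef(u.ravel(), v.ravel())[0, 1]
print(f"leading canonical correlation: {r:.2f}")
```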

  16. Empirical analysis between industrial structure and economic growth of china

    Institute of Scientific and Technical Information of China (English)

    姚西龙; 尤津; 郝鹏飞

    2008-01-01

    In recent years, the relationship between industrial structure and economic growth has received increasing attention from scholars. Drawing on the theory of industrial structure and economic development, this article uses regression analysis to estimate the contribution of the three major industries to Chinese economic growth, applies cluster analysis methods, and then discusses how to optimize the industrial structure.

  17. Cost of Illness and Cost Containment Analysis Using Empirical Antibiotic Therapy in Sepsis Patients in Bandung

    Directory of Open Access Journals (Sweden)

    Rano K. Sinuraya

    2012-12-01

    Full Text Available The aims of this study were to analyze the cost of illness (COI) and to perform a cost containment analysis of empirical antibiotic therapy in sepsis patients with respiratory infection in a hospital in Bandung. A cross-sectional study was conducted retrospectively. Data were collected from the medical records of inpatient sepsis patients with respiratory infections receiving empirical antibiotic therapy with ceftazidime-levofloxacin or cefotaxime-erythromycin. Direct and indirect costs were calculated and analyzed in this study. The results showed that the average COI for patients on the ceftazidime-levofloxacin combination was 13,369,055 IDR, whereas for the cefotaxime-erythromycin combination it was 22,250,495 IDR. In summary, the COI of empirical antibiotic therapy with ceftazidime-levofloxacin was lower than with cefotaxime-erythromycin. The cost containment achieved by using empirical ceftazidime-levofloxacin therapy, without reducing service quality, was 8,881,440 IDR.

  18. Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan

    Science.gov (United States)

    Papadimitriou, Antigoni; Blanco Ramírez, Gerardo

    2015-01-01

    This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…

  20. Nonparametric estimation for hazard rate monotonously decreasing system

    Institute of Scientific and Technical Information of China (English)

    Han Fengyan; Li Weisong

    2005-01-01

    Estimation of density and hazard rate is very important in the reliability analysis of a system. In order to estimate the density and hazard rate of a system whose hazard rate is monotonically decreasing, a new nonparametric estimator is put forward. The estimator is based on the kernel method and an optimization algorithm. Numerical experiments show that the method is accurate enough and can be used in many cases.

  1. Non-parametric versus parametric methods in environmental sciences

    Directory of Open Access Journals (Sweden)

    Muhammad Riaz

    2016-01-01

    Full Text Available This report intends to highlight the importance of considering the background assumptions required for the analysis of real datasets in different disciplines. We provide a comparative discussion of parametric methods (which depend on distributional assumptions, such as normality) relative to non-parametric methods (which are free from many distributional assumptions). We have chosen a real dataset from environmental sciences (one of the application areas). The findings may be extended to other disciplines in the same spirit.
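
    The report's contrast can be made concrete by running a parametric and a non-parametric test on the same two samples; on skewed data the two can disagree. A minimal sketch with scipy:

```python
import numpy as np
from scipy import stats

# Compare two groups with a parametric t-test (assumes normality) and a
# non-parametric Mann-Whitney U test (distribution-free), on skewed data.
rng = np.random.default_rng(13)
site_a = rng.lognormal(mean=0.0, sigma=1.0, size=40)   # e.g. pollutant levels
site_b = rng.lognormal(mean=0.5, sigma=1.0, size=40)

t_stat, t_p = stats.ttest_ind(site_a, site_b, equal_var=False)
u_stat, u_p = stats.mannwhitneyu(site_a, site_b, alternative="two-sided")
print(f"Welch t-test:    p = {t_p:.4f}")
print(f"Mann-Whitney U:  p = {u_p:.4f}")
```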

  2. An empirical EEG analysis in brain death diagnosis for adults.

    Science.gov (United States)

    Chen, Zhe; Cao, Jianting; Cao, Yang; Zhang, Yue; Gu, Fanji; Zhu, Guoxian; Hong, Zhen; Wang, Bin; Cichocki, Andrzej

    2008-09-01

    Electroencephalogram (EEG) is often used in the confirmatory test for brain death diagnosis in clinical practice. Because EEG recording and monitoring is relatively safe for the patients in deep coma, it is believed to be valuable for either reducing the risk of brain death diagnosis (while comparing other tests such as the apnea) or preventing mistaken diagnosis. The objective of this paper is to study several statistical methods for quantitative EEG analysis in order to help bedside or ambulatory monitoring or diagnosis. We apply signal processing and quantitative statistical analysis for the EEG recordings of 32 adult patients. For EEG signal processing, independent component analysis (ICA) was applied to separate the independent source components, followed by Fourier and time-frequency analysis. For quantitative EEG analysis, we apply several statistical complexity measures to the EEG signals and evaluate the differences between two groups of patients: the subjects in deep coma, and the subjects who were categorized as brain death. We report statistically significant differences of quantitative statistics with real-life EEG recordings in such a clinical study, and we also present interpretation and discussions on the preliminary experimental results.

  3. Using Confirmatory Factor Analysis for Construct Validation: An Empirical Review

    Science.gov (United States)

    DiStefano, Christine; Hess, Brian

    2005-01-01

    This study investigated the psychological assessment literature to determine what applied researchers are using and reporting from confirmatory factor analysis (CFA) studies for evidence of construct validation. One hundred and one articles published in four major psychological assessment journals between 1990 and 2002 were systematically…

  4. INVESTIGATING THE "COMPLEMENTARITY HYPOTHESIS" IN GREEK AGRICULTURE: AN EMPIRICAL ANALYSIS

    OpenAIRE

    Katrakilidis, Constantinos P.; Tabakis, Nikolaos M.

    2001-01-01

    This study investigates determinants of private capital formation in Greek agriculture and tests the "complementarity" against the "crowding out" hypothesis using multivariate cointegration techniques and ECVAR modeling in conjunction with variance decomposition and impulse response analysis. The results provide evidence of a significant positive causal effect of government spending on private capital formation, thus supporting the "complementarity" hypothesis for Greek agriculture.

  5. 188 An Empirical Investigation of Value-Chain Analysis and ...

    African Journals Online (AJOL)


    This research work was designed to examine the impact of Value-Chain Analysis on Competitive ... markets create economic value and when few competing firms are engaging in ... Trace costs of activities: the company needs an accounting ... margins, return on assets, benchmarking, and capital budgeting. When a ...

  6. Actuarial and actual analysis of surgical results: empirical validation.

    Science.gov (United States)

    Grunkemeier, G L; Anderson, R P; Starr, A

    2001-06-01

    This report validates the use of the Kaplan-Meier (actuarial) method of computing survival curves by comparing 12-year estimates published in 1978 with current assessments. It also contrasts cumulative incidence curves, referred to as "actual" analysis in the cardiac-related literature, with Kaplan-Meier curves for thromboembolism, and demonstrates that the former estimates the percentage of events that will actually occur.

  7. Dissecting Situational Strength: Theoretical Analysis and Empirical Tests

    Science.gov (United States)

    2012-09-01

    ... approaches such as difference scores and profile similarity indices (see Edwards, 2007; Shanock, Baran, Gentry, Pattison, & Heggestad, 2010). In addition ... and (2) via analysis of indirect, actual measures of fit through polynomial regression and response surfaces (Edwards, 2007; Shanock et al., 2010). ... (Thoresen, Bono, & Patton, 2001; see also Herman, 1973; Smith, 1977) suggests that job attitudes are related to job performance more strongly in situations ...

  8. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    Directory of Open Access Journals (Sweden)

    Jamal J Almenayes

    2015-10-01

    Full Text Available This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). Factor analysis of items generated by a sample of 1326 participants revealed three addiction factors: "Social Consequences", "Time Displacement" and "Compulsive Feelings". These factors were later regressed on a scale of religiosity, which contained a single factor based on factor analysis. The relationship between religiosity and social media addiction was then examined using linear regression. The results indicated that only two of the addiction factors were significantly related to religiosity. Future research should address the operationalization of the concept of religiosity to account for multiple dimensions.

  9. IFRS and Stock Returns: An Empirical Analysis in Brazil

    Directory of Open Access Journals (Sweden)

    Rodrigo F. Malaquias

    2016-09-01

    In recent years, the convergence of accounting standards has been an issue that has motivated new studies in the accounting field. It is expected that convergence provides users, especially external users of accounting information, with reports that are comparable across different economies. Considering this scenario, this article compares the effect of accounting numbers on the stock market before and after the accounting convergence in Brazil. The sample of the study involved Brazilian companies listed on the BM&FBOVESPA that had American Depositary Receipts (levels II and III) on the New York Stock Exchange (NYSE). For data analysis, descriptive statistics and graphical analysis were employed in order to analyze the behavior of stock returns around the publication dates. The main results indicate that the stock market reacts to the accounting reports. Therefore, the accounting numbers contain relevant information for the decision making of investors in the stock market. Moreover, it is observed that after the accounting convergence, the stock returns of the companies seem to present lower volatility.

  10. Tourism Competitiveness Index – An Empirical Analysis Romania vs. Bulgaria

    Directory of Open Access Journals (Sweden)

    Mihai CROITORU

    2011-09-01

    In the current economic downturn, many specialists consider tourism one of the sectors with the greatest potential to provide worldwide economic growth and development. A growing tourism sector can contribute effectively to employment, increase national income, and make a decisive mark on the balance of payments. Thus, tourism can be an important driving force for growth and prosperity, especially in emerging economies, being a key element in reducing poverty and regional disparities. Despite its contribution to economic growth, tourism sector development can be undermined by a series of economic and legislative barriers that affect the competitiveness of this sector. In this context, the World Economic Forum proposes, via the Tourism Competitiveness Index (TCI), both a methodology to identify the key factors that contribute to increasing tourism competitiveness and tools for the analysis and evaluation of these factors. This paper therefore analyzes the underlying determinants of the TCI from the perspective of two directly competing states, Romania and Bulgaria, in order to highlight the effects of communication on the competitiveness of the tourism sector. The purpose of this analysis is to provide some answers, especially in terms of communication strategies, which may explain the completely different performances of the two national economies in the tourism sector.

  11. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

    Based on a connection mechanism in which nodes automatically select upstream and downstream agents, a simulation model of the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating agent-based modeling (ABM) and discrete modeling on a GIS-based map. First, the model's soundness is verified by checking the consistency of sales and of changes in the various agent parameters between the simulation model and a real automobile supply chain. Second, using complex network theory, the hierarchical structure of the model and the relationships between networks at different levels are analyzed to calculate characteristic parameters such as mean distance, mean clustering coefficient, and degree distribution; this verifies that the model is a typical scale-free, small-world network. Finally, the motion law of the model is analyzed from the perspective of complex adaptive systems. The chaotic state of the simulation system is verified, which suggests that the system has typical nonlinear characteristics. The model not only macroscopically illustrates the dynamic evolution of the complex network of an automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and practical experience for the supply chain analysis of auto companies.
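
    The network-level characteristic parameters named here (mean distance, mean clustering coefficient, degree distribution) are straightforward to compute with networkx; the sketch below uses a generic scale-free graph as a stand-in for the paper's agent-based supply chain network.

        import networkx as nx

        # Hypothetical supply-chain graph; a scale-free construction stands in
        # for the agent-based network grown in the paper.
        G = nx.barabasi_albert_graph(n=500, m=2, seed=42)

        print("mean distance:", nx.average_shortest_path_length(G))
        print("mean clustering coefficient:", nx.average_clustering(G))

        # Degree distribution: a heavy tail is the signature of a scale-free network
        for degree, count in enumerate(nx.degree_histogram(G)):
            if count:
                print(degree, count)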

  12. Why preferring parametric forecasting to nonparametric methods?

    Science.gov (United States)

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators have shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed with simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts of unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.
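
    A minimal sketch of the parametric route the paper defends, using the same theta-logistic form: simulate noisy dynamics, then recover the parameters by minimising the one-step log-residual sum of squares. This is a stand-in for the paper's Bayesian checking machinery, and all parameter values are hypothetical.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)

        def simulate(r, K, theta, sigma, n0=10.0, T=100):
            """Theta-logistic dynamics with lognormal process noise."""
            n = np.empty(T)
            n[0] = n0
            for t in range(T - 1):
                n[t + 1] = n[t] * np.exp(r * (1 - (n[t] / K) ** theta)
                                         + rng.normal(0, sigma))
            return n

        def sse(params, n):
            """One-step-ahead squared log residuals of the deterministic skeleton."""
            r, K, theta = params
            pred = n[:-1] * np.exp(r * (1 - (n[:-1] / K) ** theta))
            return np.sum((np.log(n[1:]) - np.log(pred)) ** 2)

        obs = simulate(r=1.8, K=100.0, theta=1.0, sigma=0.2)
        # bounds for Nelder-Mead require scipy >= 1.7
        fit = minimize(sse, x0=[1.0, 80.0, 1.0], args=(obs,), method="Nelder-Mead",
                       bounds=[(0.1, 5.0), (10.0, 500.0), (0.1, 5.0)])
        print("estimated r, K, theta:", np.round(fit.x, 2))
        # Forecast uncertainty can then be gauged by re-simulating from the fitted
        # model, which is the diagnostic advantage the paper claims for parametrics.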

  13. Empirical Analysis of Urban Residents’ Perceived Climatic Change Risks

    Institute of Scientific and Technical Information of China (English)

    Peihui; DAI; Lingling; HUANG

    2014-01-01

    The impact of climate change on human survival and security, and on urban development, is profound and receives more and more attention. To explore urban residents' perception of climate change risks and to put forward corresponding countermeasures and suggestions, we take Wuhan as an example and, from the microscopic viewpoint of urban residents, use factor analysis to classify the perceived risks and the recognized risk reduction measures, cluster analysis to divide the urban residents into five groups, and variance analysis to explore differences in the choice of measures between the cluster groups. We draw the following conclusions. The main perceived risks of climate change for urban residents are deterioration of the ecological environment, economic damage, damage to mental health, damage to physical health, and damage to political harmony. The recognized risk reduction measures are: individuals and families developing good habits; businesses and governments strengthening energy conservation; schools and other agencies carrying out propaganda and education; multi-agent environmental improvement; and learning from the West. Depending on the perceived risks, the urban residents cluster into five groups: those concerned about the body and politics, those concerned about mental health, those concerned about economic development, those concerned about ecological safety, and those who ignore climate change. On the roles of the individual and family, business and government in environmental protection, the groups hold unanimous views, while for other measures different groups have different understandings. It is concluded that individuals and families should develop environmentally friendly habits, government should strengthen regulation, businesses should take environmental responsibility, schools should strengthen publicity and education, and …

  14. Financial development and economic growth. An empirical analysis for Ireland

    Directory of Open Access Journals (Sweden)

    Antonios Adamopoulos

    2010-07-01

    This study investigated the relationship between financial development and economic growth for Ireland for the period 1965-2007 using a vector error correction model (VECM). Questions were raised whether financial development causes economic growth or the reverse, taking into account the positive effect of the industrial production index. Financial market development is estimated by the effect of credit market development and stock market development on economic growth. The objective of this study was to examine the long-run relationship between these variables applying the Johansen cointegration analysis, taking into account the maximum eigenvalue and trace statistics tests. Granger causality tests indicated that economic growth causes credit market development, while there is a bilateral causal relationship between stock market development and economic growth. Therefore, it can be inferred that economic growth has a positive effect on stock market development and credit market development, taking into account the positive effect of industrial production growth on economic growth for Ireland.
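
    Both steps of this kind of analysis, the Johansen test and VECM estimation, are available in statsmodels; the sketch below applies them to synthetic cointegrated series, with variable names standing in for the Irish data.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

        rng = np.random.default_rng(3)
        n = 200
        trend = np.cumsum(rng.normal(size=n))            # shared stochastic trend
        data = pd.DataFrame({
            "gdp": trend + rng.normal(size=n),           # hypothetical log GDP
            "credit": 0.8 * trend + rng.normal(size=n),  # hypothetical credit market index
            "stocks": 1.2 * trend + rng.normal(size=n),  # hypothetical stock market index
        })

        # Johansen test: compare trace statistics with the 5% critical values
        jres = coint_johansen(data, det_order=0, k_ar_diff=2)
        print("trace stats:", jres.lr1)
        print("5% critical:", jres.cvt[:, 1])

        # VECM with one cointegrating relation
        vecm = VECM(data, k_ar_diff=2, coint_rank=1, deterministic="ci").fit()
        print(vecm.alpha)   # error-correction (loading) coefficients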

  15. Empirical analysis of the effects of cyber security incidents.

    Science.gov (United States)

    Davis, Ginger; Garcia, Alfredo; Zhang, Weide

    2009-09-01

    We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.

  16. Modeling for Determinants of Human Trafficking: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Seo-Young Cho

    2015-02-01

    Full Text Available This study aims to identify robust push and pull factors of human trafficking. I test for the robustness of 70 push and 63 pull factors suggested in the literature. In doing so, I employ an extreme bound analysis, running more than two million regressions with all possible combinations of variables for up to 153 countries during the period of 1995–2010. My results show that crime prevalence robustly explains human trafficking both in destination and origin countries. Income level also has a robust impact, suggesting that the cause of human trafficking shares that of economic migration. Law enforcement matters more in origin countries than destination countries. Interestingly, a very low level of gender equality may have constraining effects on human trafficking outflow, possibly because gender discrimination limits female mobility that is necessary for the occurrence of human trafficking.
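
    The extreme-bounds idea can be illustrated compactly: regress the outcome on the focus variable plus every combination of a few doubtful controls and record the range of the focus coefficient. The sketch below does this on synthetic data with a handful of hypothetical controls rather than the paper's 70 push and 63 pull factors.

        import itertools
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n, k = 153, 6
        controls = rng.normal(size=(n, k))                  # hypothetical push/pull factors
        crime = rng.normal(size=n)                          # focus variable: crime prevalence
        y = 0.5 * crime + controls[:, 0] + rng.normal(size=n)  # hypothetical trafficking measure

        betas = []
        for combo in itertools.combinations(range(k), 3):   # all 3-variable control sets
            X = sm.add_constant(np.column_stack([crime, controls[:, combo]]))
            res = sm.OLS(y, X).fit()
            betas.append((res.params[1], res.bse[1]))

        lo = min(b - 2 * s for b, s in betas)
        hi = max(b + 2 * s for b, s in betas)
        print(f"extreme bounds for crime coefficient: [{lo:.2f}, {hi:.2f}]")
        # The focus variable is "robust" if the bounds do not straddle zero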

  17. Empirical Analysis of Agricultural Production Efficiency in Shaanxi Province

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    This article analyses the agricultural production efficiency of all cities and areas in Shaanxi Province over the period 2006-2009 using the data envelopment analysis (DEA) method, and compares agricultural production efficiency across cities and areas. The results show that the agricultural production efficiency and the scale efficiency of agriculture in Shaanxi Province are high on the whole, but the efficiency of agricultural technology is very low; agricultural development still relies on factor inputs, and the driving role of technological progress is not conspicuous. Finally, the following countermeasures are put forward to promote agricultural productivity in Shaanxi Province: improve the construction of agricultural infrastructure and increase agricultural input; accelerate the project of extending agricultural technology into households, and promote the conversion and use rate of agricultural scientific and technological achievements; establish and improve the industrial system of agriculture, and speed up the building of various agricultural cooperative economic organizations.
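
    Data envelopment analysis reduces to one small linear program per decision-making unit. Below is a minimal input-oriented, constant-returns (CCR) sketch using scipy's linprog; the four units and their input/output figures are invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # Inputs X (4 DMUs x 2) and outputs Y (4 DMUs x 1), invented figures
        X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])
        Y = np.array([[1.0], [1.0], [1.5], [2.0]])
        n, m = X.shape
        s = Y.shape[1]

        for o in range(n):
            # variables: [theta, lambda_1 .. lambda_n]; minimise theta
            c = np.r_[1.0, np.zeros(n)]
            # sum_j lambda_j * x_ij - theta * x_io <= 0   for each input i
            A_in = np.c_[-X[o], X.T]
            # -sum_j lambda_j * y_rj <= -y_ro             for each output r
            A_out = np.c_[np.zeros(s), -Y.T]
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[o]],
                          bounds=[(None, None)] + [(0, None)] * n)
            print(f"DMU {o}: efficiency = {res.x[0]:.3f}")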

  18. Empirical Analysis of Bagged SVM Classifier for Data Mining Applications

    Directory of Open Access Journals (Sweden)

    M.Govindarajan

    2013-11-01

    Data mining is the use of algorithms to extract information and patterns derived by the knowledge discovery in databases process. Classification maps data into predefined groups or classes; it is often referred to as supervised learning because the classes are determined before examining the data. The feasibility and the benefits of the proposed approach are demonstrated by means of data mining applications such as intrusion detection, direct marketing, and signature verification. A variety of techniques have been employed for analysis, ranging from traditional statistical methods to data mining approaches. Bagging and boosting are two relatively new but popular methods for producing ensembles. In this work, bagging is evaluated on real and benchmark data sets for intrusion detection, direct marketing, and signature verification, in conjunction with SVM as the base learner. The proposed bagged SVM is superior to the individual approach for these data mining applications in terms of classification accuracy.
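
    With scikit-learn, bagging an SVM base learner takes a few lines; the sketch below compares a single SVM with its bagged ensemble on a synthetic stand-in for the intrusion detection and direct marketing data sets.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import BaggingClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        # Synthetic stand-in for the benchmark data sets
        X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

        single = SVC(kernel="rbf", gamma="scale")
        bagged = BaggingClassifier(SVC(kernel="rbf", gamma="scale"),
                                   n_estimators=25, random_state=0)

        print("single SVM:", cross_val_score(single, X, y, cv=5).mean())
        print("bagged SVM:", cross_val_score(bagged, X, y, cv=5).mean())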

  19. Empirical Analysis II: Business Cycles and Inward FDI in China

    Directory of Open Access Journals (Sweden)

    Yahya Sharahili

    2008-01-01

    It is well known that the current speeding-up of globalization has been, on the one hand, spreading macroeconomic effects around the world while, on the other, fueling firms' activities of crossing national borders. Are there any links between these two influences? Having concluded in previous research that inward FDI and business cycle development are pro-cyclically related in the Granger sense, this paper further discusses how they react to each other. Again, we chose China as the subject and employed official annual statistics for 1983-2004. By constructing an endogenous growth model and, after conducting correlation analysis and testing the significance of each variable's coefficient, we identified the original momentum of Chinese economic growth and explored whether a long-term relationship exists by means of the Johansen co-integration test.

  20. STOCK MARKET DEVELOPMENT AND ECONOMIC GROWTH AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vazakidis Athanasios

    2012-01-01

    This study investigated the causal relationship between stock market development and economic growth for Greece for the period 1978-2007 using a vector error correction model (VECM). Questions were raised whether stock market development causes economic growth, taking into account the negative effect of the interest rate on stock market development. The purpose of this study was to investigate the short-run and long-run relationships between the examined variables applying the Johansen co-integration analysis. To achieve this objective, unit root tests were carried out for all time series data in their levels and their first differences. Johansen co-integration analysis was applied to examine whether the variables are co-integrated of the same order, taking into account the maximum eigenvalue and trace statistics tests. Finally, a vector error correction model was selected to investigate the long-run relationship between stock market development and economic growth. A 1% short-run increase in economic growth induced a 0.41% increase in the stock market index in Greece, while a 1% increase in the interest rate induced a 1.42% decrease in the stock market index. The estimated coefficient of the error correction term was statistically significant and had a negative sign, which confirmed a stable long-run equilibrium between the examined variables. The results of Granger causality tests indicated a unidirectional causality between stock market development and economic growth, running from economic growth to stock market development, and a unidirectional causal relationship between economic growth and the interest rate, running from economic growth to the interest rate. Therefore, it can be inferred that economic growth has a direct positive effect on stock market development, while the interest rate has a negative effect on stock market development and economic growth respectively.
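
    The direction-of-causality findings reported here rest on Granger tests; a minimal statsmodels sketch on synthetic data, constructed so that growth leads the stock index, looks as follows.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(5)
        n = 300
        growth = rng.normal(size=n)
        index = np.zeros(n)
        for t in range(1, n):   # hypothetical: growth leads the stock index
            index[t] = 0.6 * growth[t - 1] + rng.normal()

        # Test "growth Granger-causes index": the second column is tested as
        # a predictor of the first
        data = np.column_stack([index, growth])
        res = grangercausalitytests(data, maxlag=2)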

  1. The Effect of Shocks: An Empirical Analysis of Ethiopia

    Directory of Open Access Journals (Sweden)

    Yilebes Addisu Damtie

    2015-07-01

    Besides striving to increase production and development, it is also necessary to reduce the losses created by shocks. The people of Ethiopia are exposed to the impact of both natural and man-made shocks. Accordingly, policy makers, governmental and non-governmental organizations need to identify the important shocks and their effects and use them as an input. This study was conducted to identify the food insecurity shocks and to estimate their effects, based on the conceptual framework developed in Ethiopia, Amhara National Regional State, Libo Kemkem District. Descriptive statistical analysis, multiple regression, binary logistic regression, chi-squared and independent-sample t-tests were used as data analysis techniques. The results showed eight shocks affecting households: weather variability, weed infestation, plant insect and pest infestation, soil fertility problems, animal disease and epidemics, human disease and epidemics, price fluctuation and conflict. Weather variability, plant insect and pest infestation, weed infestation, and animal disease and epidemics created mean losses of 3,821.38, 886.06, 508.04 and 1,418.32 Birr, respectively. In addition, human disease and epidemics, price fluctuation and conflict affected 68.11%, 88.11% and 14.59% of households, respectively. Among the sample households, 28.1% were not able to meet their food needs throughout the year while 71.9% could. The multiple regression models revealed that weed existence (β = –0.142, p < 0.05), plant insect and pest infestation (β = –0.279, p < 0.01) and soil fertility problems (β = –0.321, p < 0.01) had significant effects on income. Assets were significantly affected by plant insect and pest infestation (β = –0.229, p < 0.01), human disease and epidemics (β = 0.145, p < 0.05) and soil fertility problems (β = –0.317, p < 0.01), while food production was affected by soil fertility problems (β = –0.314, p < 0.01). Binary logistic …

  2. Yield Stability of Maize Hybrids Evaluated in Maize Regional Trials in Southwestern China Using Nonparametric Methods

    Institute of Scientific and Technical Information of China (English)

    LIU Yong-jian; DUAN Chuan; TIAN Meng-liang; HU Er-liang; HUANG Yu-bi

    2010-01-01

    Analysis of multi-environment trials (METs) of crops for the evaluation and recommendation of varieties is an important issue in plant breeding research. Evaluating both stability of performance and high yield is essential in MET analyses. The objective of the present investigation was to compare 11 nonparametric stability statistics and to apply nonparametric tests for genotype-by-environment interaction (GEI) to 14 maize (Zea mays L.) genotypes grown at 25 locations in southwestern China during 2005. Results of nonparametric tests of GEI and a combined ANOVA across locations showed both crossover and non-crossover GEI, and that genotypes varied highly significantly for yield. Principal component analysis, correlation analysis of the nonparametric statistics, and yield indicated that the nonparametric statistics grouped into four distinct classes that corresponded to different agronomic and biological concepts of stability. Furthermore, high values of TOP and low values of rank-sum were associated with high mean yield, but the other nonparametric statistics were not positively correlated with mean yield. Therefore, only the rank-sum and TOP methods would be useful for simultaneous selection for high yield and stability. These two statistics recommended JY686 and HX 168 as desirable and ND 108, CM 12, CN36, and NK6661 as undesirable genotypes.
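
    The two recommended statistics are simple functions of within-location ranks and are easy to compute directly; the sketch below uses invented yields of the same shape (14 genotypes by 25 locations), a rank-sum, and a TOP-style top-third percentage.

        import numpy as np
        from scipy.stats import rankdata

        rng = np.random.default_rng(6)
        # Hypothetical yields: 14 genotypes (rows) x 25 locations (columns)
        yields = rng.normal(loc=8.0, scale=1.0, size=(14, 25))

        # Rank genotypes within each location, rank 1 = highest yield
        ranks = np.apply_along_axis(lambda col: rankdata(-col), 0, yields)

        rank_sum = ranks.sum(axis=1)                         # low rank-sum: high, stable yield
        top = (ranks <= np.ceil(14 / 3)).mean(axis=1) * 100  # TOP: % of sites in the top third

        print("best genotypes by rank-sum:", np.argsort(rank_sum)[:2])
        print("TOP percentages:", np.round(top, 1))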

  3. An empirical study of tourist preferences using conjoint analysis

    Directory of Open Access Journals (Sweden)

    Tripathi, S.N.

    2010-01-01

    Tourism and hospitality have become key global economic activities as expectations with regard to our use of leisure time have evolved, attributing greater meaning to our free time. While the growth in tourism has been impressive, India's share in total global tourism arrivals and earnings is quite insignificant. It is an accepted fact that India has tremendous potential for the development of tourism, and this anomaly and the various underlying factors responsible for it are the focus of our study. The objective is the determination of customer preferences for multi-attribute hybrid services like tourism, so as to enable the state tourism board to deliver a desired combination of intrinsic attributes, helping it to create a sustainable competitive advantage, leading to greater customer satisfaction and positive word of mouth. Conjoint analysis has been used for this purpose; it estimates the structure of a consumer's preferences, given his/her overall evaluations of a set of alternatives that are pre-specified in terms of levels of different attributes.
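
    In its metric form, conjoint analysis amounts to regressing profile ratings on dummy-coded attribute levels to recover part-worth utilities; the attributes, levels and ratings in the sketch below are invented for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical full-profile design: respondents rate tourism packages
        profiles = pd.DataFrame({
            "price":   ["low", "high"] * 4,
            "lodging": ["hotel", "hotel", "homestay", "homestay"] * 2,
            "guide":   ["yes"] * 4 + ["no"] * 4,
        })
        rng = np.random.default_rng(7)
        profiles["rating"] = (6 - 2 * (profiles.price == "high")
                              + 1 * (profiles.guide == "yes")
                              + rng.normal(0, 0.5, 8))

        # Part-worth utilities = OLS coefficients on dummy-coded attribute levels
        fit = smf.ols("rating ~ C(price) + C(lodging) + C(guide)", data=profiles).fit()
        print(fit.params)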

  4. EMPIRICAL ANALYSIS OF EMPLOYEES WITH TERTIARY EDUCATION OCCUPATIONAL IMBALANCES

    Directory of Open Access Journals (Sweden)

    Andrei B. Ankudinov

    2013-01-01

    A high percentage of graduates with university degrees (among the highest in the world), combined with an unacceptably low level of utilization of acquired qualifications and the generally low quality of education in the majority of Russian universities, results in huge structural imbalances. The article presents quantitative estimates of the disproportions between the educational levels of employees with higher education and their professional occupations for different branches of the economy. A logit-model-based analysis is performed of how well the professional functions performed by employees match their professional qualifications and levels of education, and of the extent to which their previously acquired expertise and skills are put to use. The sample represents the working population of Russia with tertiary education. The results obtained support the conclusion that the worst disproportions between education levels attained and job requirements are observed in the trade and services sector, transport and communications, housing and utilities, and the consumer goods and food industries. Matters are compounded by an inert and informationally inefficient labor market, incapable of sending proper signals to the national system of tertiary education.

  5. Empirical Analysis on CSR Communication in Romania: Transparency and Participation

    Directory of Open Access Journals (Sweden)

    Irina-Eugenia Iamandi

    2012-12-01

    In the specific field of corporate social responsibility (CSR), the participation of companies in supporting social and environmental issues is mainly analysed and/or measured based on their CSR communication policy; accordingly, the transparency of CSR reporting procedures is one of the most pressing challenges for researchers and practitioners in the field. The main research objective of the present paper is to distinguish between different types of CSR participation by identifying the reasons behind CSR communication for a series of companies acting on the Romanian market. The descriptive analysis, conducted both at the integrated and the corporate level for the Romanian companies, took into account five main CSR communication related issues: CSR site, CSR report, CSR listing, CSR budget and CSR survey. The results highlight both the declarative/prescriptive and the practical/descriptive perspectives of CSR communication in Romania, showing that the Romanian CSR market is reaching its full maturity. In more specific terms, the majority of the investigated companies are already using different types of CSR participation, marking the transition from CSR just for commercial purposes to CSR for long-term strategic use. The achieved results are broadly analysed in the paper and specific conclusions are emphasized.

  6. An empirical analysis of the Ebola outbreak in West Africa

    Science.gov (United States)

    Khaleque, Abdul; Sen, Parongama

    2017-02-01

    The data for the Ebola outbreak that occurred in 2014-2016 in three countries of West Africa are analysed within a common framework. The analysis is made using the results of an agent based Susceptible-Infected-Removed (SIR) model on a Euclidean network, where nodes at a distance l are connected with probability P(l) ∝ l^(-δ), δ determining the range of the interaction, in addition to nearest neighbors. The cumulative (total) density of infected population here has the form R(t) = a exp(t/T)/[1 + c exp(t/T)], where the parameters depend on δ and the infection probability q. This form is seen to fit well with the data. Using the best fitting parameters, the time at which the peak is reached is estimated and is shown to be consistent with the data. We also show that in the Euclidean model, one can choose δ and q values which reproduce the data for the three countries qualitatively. These choices are correlated with population density, control schemes and other factors. Comparing the real data and the results from the model one can also estimate the size of the actual population susceptible to the disease. Rescaling the real data a reasonably good quantitative agreement with the simulation results is obtained.
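
    Fitting this saturating form to a cumulative incidence series is a routine nonlinear least-squares task; the sketch below uses scipy's curve_fit on synthetic data (all parameter values invented) and reads off the saturation level a/c and the peak time implied by the fit.

        import numpy as np
        from scipy.optimize import curve_fit

        def R(t, a, c, T):
            """Cumulative infected density: R(t) = a*exp(t/T) / (1 + c*exp(t/T))."""
            e = np.exp(t / T)
            return a * e / (1 + c * e)

        t = np.arange(60.0)                 # weeks since outbreak onset (hypothetical)
        rng = np.random.default_rng(8)
        obs = R(t, 0.02, 0.025, 8.0) * (1 + 0.05 * rng.normal(size=t.size))

        (a, c, T), _ = curve_fit(R, t, obs, p0=[0.01, 0.01, 10.0])
        print("fitted a, c, T:", a, c, T)
        print("saturation level a/c:", a / c)        # long-time limit of R(t)
        print("peak time -T*ln(c):", -T * np.log(c)) # inflection = peak of new infections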

  7. An empirical analysis of the Ebola outbreak in West Africa

    CERN Document Server

    Khaleque, Abdul

    2016-01-01

    The data for the Ebola outbreak that occurred in 2014-2016 in three countries of West Africa are analysed within a common framework. The analysis is made using the results of an agent based Susceptible-Infected-Removed (SIR) model on a Euclidean network, where nodes at a distance $l$ are connected with probability $P(l) \\propto l^{-\\delta }$ in addition to nearest neighbors. The cumulative density of infected population here has the form $R(t) = \\frac{a\\exp(t/T)}{1+c\\exp(t/T)}$, where the parameters depend on $\\delta$ and the infection probability $q$. This form is seen to fit well with the data. Using the best fitting parameters, the time at which the peak is reached is estimated and is shown to be consistent with the data. We also show that in the Euclidean model, one can choose $\\delta$ and $q$ values which reproduce the data for the three countries qualitatively. These choices are correlated with population density, control schemes and other factors.

  8. Non-parametric Morphologies of Mergers in the Illustris Simulation

    CERN Document Server

    Bignone, Lucas A; Sillero, Emanuel; Pedrosa, Susana E; Pellizza, Leonardo J; Lambas, Diego G

    2016-01-01

    We study non-parametric morphologies of merger events in a cosmological context using the Illustris project. We produce mock g-band images comparable to observational surveys from the publicly available Illustris idealized mock images at $z=0$. We then measure non-parametric indicators (asymmetry, Gini, $M_{20}$, clumpiness and concentration) for a set of galaxies with $M_* >10^{10}$ M$_\odot$. We correlate these automatic statistics with the recent merger history of galaxies and with the presence of close companions. Our main contribution is to assess, in a cosmological framework, the empirically derived non-parametric demarcation line and average time-scales used to determine the merger rate observationally. We found that 98 per cent of galaxies above the demarcation line have a close companion or have experienced a recent merger event. On average, merger signatures obtained from the $G-M_{20}$ criteria anticorrelate clearly with the time elapsed since the last merger event. We also find that the a…

  9. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    Science.gov (United States)

    Lee, Kyungbook; Song, Seok Goo

    2016-10-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  10. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

    We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and the patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  11. Does risk management contribute to IT project success? A meta-analysis of empirical evidence

    NARCIS (Netherlands)

    de Bakker, K.F.C.; Boonstra, A.; Wortmann, J.C.

    2010-01-01

    The question whether risk management contributes to IT project success has been considered relevant by people from both the academic and practitioners' communities for a long time. This paper presents a meta-analysis of the empirical evidence that either supports or opposes the claim that risk management contributes to IT project success.

  12. Some Empirical Issues in Research on Academic Departments: Homogeneity, Aggregation, and Levels of Analysis.

    Science.gov (United States)

    Ramsey, V. Jean; Dodge, L. Delf

    1983-01-01

    The appropriateness of using academic departments as a level of analysis of organizational administration is examined. Factors analyzed include homogeneity of faculty responses to measures of organizational structure, environmental uncertainty, and task routineness. Results were mixed, demonstrating the importance of empirically testing rather…

  13. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole;

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations (ACE) algorithm…

  14. Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004

    Science.gov (United States)

    Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio

    2007-01-01

    The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…

  15. An empirical analysis of the relationship between the consumption of alcohol and liver cirrhosis mortality

    DEFF Research Database (Denmark)

    Bentzen, Jan Børsen; Smith, Valdemar

    The question whether intake of alcohol is associated with liver cirrhosis mortality is analyzed using aggregate data for alcohol consumption, alcohol related diseases and alcohol policies of 16 European countries. The empirical analysis gives support to a close association between cirrhosis mortality and alcohol consumption…

  16. Image Retrieval: Theoretical Analysis and Empirical User Studies on Accessing Information in Images.

    Science.gov (United States)

    Ornager, Susanne

    1997-01-01

    Discusses indexing and retrieval for effective searches of digitized images. Reports on an empirical study about criteria for analysis and indexing digitized images, and the different types of user queries done in newspaper image archives in Denmark. Concludes that it is necessary that the indexing represent both a factual and an expressional…

  17. Steering the Ship through Uncertain Waters: Empirical Analysis and the Future of Evangelical Higher Education

    Science.gov (United States)

    Rine, P. Jesse; Guthrie, David S.

    2016-01-01

    Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…

  18. An Empirical Analysis of Life Jacket Effectiveness in Recreational Boating.

    Science.gov (United States)

    Viauroux, Christelle; Gungor, Ali

    2016-02-01

    This article gives a measure of life jacket (LJ) effectiveness in U.S. recreational boating. Using the U.S. Coast Guard's Boating Accident Report Database from 2008 to 2011, we find that LJ wear is one of the most important determinants influencing the number of recreational boating fatalities, together with the number of vessels involved and the type and engine of the vessel(s). We estimate a decrease in the number of deceased per vessel of about 80% when operators wear their LJs compared to when they do not. The odds of dying are 86% higher than average if the accident involves a canoe or kayak, but 80% lower than average when more than one vessel is involved in the accident and 34% lower than average when the operator involved in the accident has more than 100 hours of boating experience. Interestingly, we find that LJ effectiveness decreases significantly as the length of the boat increases and decreases slightly as water temperature increases. However, it increases slightly as the operator's age increases. We find that between 2008 and 2011, an LJ regulation requiring all operators to wear their LJs (representing a 20% increase in wear rate) would have saved 1,721 (out of 3,047) boaters, or 1,234 out of 2,185 drowning victims. The same policy restricted to boats 16-30 feet in length would have saved approximately 778 victims. Finally, we find that such a policy would reduce the percentage of drowning victims compared to other causes of death. © 2015 Society for Risk Analysis.

  19. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches such as Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344

  20. An Empirical Analysis of Rough Set Categorical Clustering Techniques.

    Science.gov (United States)

    Uddin, Jamal; Ghazali, Rozaida; Deris, Mustafa Mat

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches such as Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy.
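
    The indiscernibility relation at the core of these rough set techniques is simple to compute: objects are grouped by their values on a chosen attribute subset. The toy sketch below (invented objects and attributes, not the full MIA algorithm) shows the partitions that candidate clustering attributes induce.

        from collections import defaultdict

        # Toy categorical information system: objects x attribute values
        objects = {
            "o1": ("red",  "round",  "small"),
            "o2": ("red",  "round",  "large"),
            "o3": ("blue", "square", "small"),
            "o4": ("blue", "round",  "small"),
            "o5": ("red",  "round",  "small"),
        }
        attributes = ["colour", "shape", "size"]

        def indiscernibility_classes(attr_indices):
            """Partition objects into classes indiscernible on the chosen attributes."""
            classes = defaultdict(set)
            for name, values in objects.items():
                key = tuple(values[i] for i in attr_indices)
                classes[key].add(name)
            return list(classes.values())

        # Candidate clustering attributes are compared by the partitions they induce;
        # MIA-style selection would combine this with the desired number of clusters.
        for i, attr in enumerate(attributes):
            print(attr, "->", indiscernibility_classes([i]))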

  1. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. The factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.

  2. Social vulnerability as criteria used in student assistance policy: an analysis of conceptual and empirical

    Directory of Open Access Journals (Sweden)

    Junia Zacour del Giúdice

    2014-12-01

    The criterion of social vulnerability is embedded in different programs for the analysis of poverty and social exclusion. In the present study, we sought to examine, conceptually and empirically, the criterion of social vulnerability adopted in student assistance at the Federal University of Viçosa/MG for the selection of students to be benefited, by means of a literature review and a questionnaire. We related social vulnerability to risks of social exclusion and addressed the students' perception of the subject. We conclude that the empirical approach reflects the conceptual one, with social vulnerability concentrated on economic and social indicators, represented by income, ownership of assets, work, health and family structure.

  3. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis: our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both…

  4. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

    The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection of…

  5. Correlated Non-Parametric Latent Feature Models

    CERN Document Server

    Doshi-Velez, Finale

    2012-01-01

    We are often interested in explaining data through a set of hidden factors or features. When the number of hidden features is unknown, the Indian Buffet Process (IBP) is a nonparametric latent feature model that does not bound the number of active features in a dataset. However, the IBP assumes that all latent features are uncorrelated, making it inadequate for many real-world problems. We introduce a framework for correlated nonparametric feature models, generalising the IBP. We use this framework to generate several specific models and demonstrate applications on real-world datasets.

  6. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major…

  7. Cost Minimization Analysis of Empiric Antibiotic Used by Sepsis Patient Respiratory Infection Source

    Directory of Open Access Journals (Sweden)

    Okky S. Purwanti

    2014-03-01

    Empirical antibiotics play an important role in the therapy of sepsis. The aim of this study was to estimate and compare the cost of treating inpatient sepsis with a respiratory infection source using cefotaxime-metronidazole or cefotaxime-erythromycin antibiotics. An observational cost minimization analysis was conducted on retrospective data from 2010 until 2012. Data were collected from the medical records of inpatients with sepsis of respiratory infection source who received empirical therapy with cefotaxime-metronidazole or cefotaxime-erythromycin, and from the treatment price list of the department of accounting. Direct medical costs were calculated from empirical antibiotic costs, costs of medical treatment, medical expenses, hospitalization costs, and administrative costs. The study considered costs from admission because of sepsis until the patient had fully recovered from sepsis. Cefotaxime-metronidazole and cefotaxime-erythromycin are assumed to have equivalent efficacy. Patients on empirical cefotaxime-metronidazole were found to have a longer length of stay (25 versus 11) but a lower average total cost of treatment (16,641,112.04 IDR versus 21,641,678.02 IDR). The findings demonstrate that the empirical antibiotic combination cefotaxime-metronidazole is more efficient than cefotaxime-erythromycin.

  8. Nonparametric Bayesian inference of the microcanonical stochastic block model

    CERN Document Server

    Peixoto, Tiago P

    2016-01-01

    A principled approach to characterize the hidden modular structure of networks is to formulate generative models, and then infer their parameters from data. When the desired structure is composed of modules or "communities", a suitable choice for this task is the stochastic block model (SBM), where nodes are divided into groups, and the placement of edges is conditioned on the group memberships. Here, we present a nonparametric Bayesian method to infer the modular structure of empirical networks, including the number of modules and their hierarchical organization. We focus on a microcanonical variant of the SBM, where the structure is imposed via hard constraints. We show how this simple model variation allows simultaneously for two important improvements over more traditional inference approaches: 1. Deeper Bayesian hierarchies, with noninformative priors replaced by sequences of priors and hyperpriors, that not only remove limitations that seriously degrade the inference on large networks, but also reveal s...

  9. Determinants of Human Development Index: A Cross-Country Empirical Analysis

    OpenAIRE

    Shah, Smit

    2016-01-01

    The Human Development Index is a statistical tool used to measure a country's overall achievements in its social and economic dimensions. This paper tries to identify the major factors affecting the Human Development Index, such as the health index, the education index and the income index. The objectives of this study are to present the empirical findings and trends of human development across countries, a regression analysis of the determinant factors, and a region-wise analysis of the Human Development Index.

  10. Pop-culture as a factor of Socialization: the Opportunities for Empirical Analysis

    Directory of Open Access Journals (Sweden)

    I V Trotsuk

    2010-03-01

    The article offers a systematic analysis of the issues related to the sociological research of popular culture at the theoretical and empirical levels. The former level of analysis deals with the definitions of pop-culture in comparison with the closely related concept of mass culture, as well as interdisciplinary endeavours to conceptualize the range of topics related to popular culture. Furthermore, the functions of popular culture and its socialization opportunities (both positive and negative) are outlined. As far as the empirical analysis is concerned, the above-mentioned issues have as yet received little attention. The sociological analysis tools usually comprise nothing but theme-based modules on youth leisure preferences and value orientations, which are not infrequently supposed to confirm the negative influence of television on socialization. The authors put forward another approach to the empirical study of the impact of popular culture, with a focus on the analysis of identification models represented in the «texts» of popular culture. An example illustrating the application of the given approach (a content analysis of the youth magazine «Molotok») is provided in the article.

  11. Pharmacoeconomic analysis of voriconazole vs. caspofungin in the empirical antifungal therapy of febrile neutropenia in Australia.

    Science.gov (United States)

    Al-Badriyeh, Daoud; Liew, Danny; Stewart, Kay; Kong, David C M

    2012-05-01

    In two major clinical trials, voriconazole and caspofungin were recommended as alternatives to liposomal amphotericin B for empirical use in febrile neutropenia. This study investigated the health economic impact of using voriconazole vs. caspofungin in patients with febrile neutropenia. A decision analytic model was developed to measure the downstream consequences of empirical antifungal therapy. The clinical outcomes measured were success, breakthrough infection, persistent baseline infection, persistent fever, premature discontinuation and death. Treatment transition probabilities and patterns were derived directly from data in two relevant randomised controlled trials. Resource use was estimated using an expert clinical panel. Cost inputs were obtained from the latest Australian sources. The analysis adopted the perspective of the Australian hospital system. The use of caspofungin led to a lower expected mean cost per patient than voriconazole (AU$40,558 vs. AU$41,356), with a net cost saving of AU$798 (1.9%) per patient. Results were most sensitive to the duration of therapy and the alternative therapy used post-discontinuation. In uncertainty analysis, the cost associated with caspofungin was less than that with voriconazole in 65.5% of cases. This is the first economic evaluation of voriconazole vs. caspofungin for empirical therapy. Caspofungin appears to have a higher probability of being cost-saving than voriconazole for empirical therapy; the difference between the two medications does not, however, appear to be statistically significant.

  12. Thirty years of nonparametric item response theory

    NARCIS (Netherlands)

    Molenaar, W.

    2001-01-01

    Relationships between a mathematical measurement model and its real-world applications are discussed. A distinction is made between the large data matrices commonly found in educational measurement and the smaller matrices found in attitude and personality measurement. Nonparametric methods are evaluated for…

  13. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  14. Nonparametric confidence intervals for monotone functions

    NARCIS (Netherlands)

    Groeneboom, P.; Jongbloed, G.

    2015-01-01

    We study nonparametric isotonic confidence intervals for monotone functions. In [Ann. Statist. 29 (2001) 1699–1731], pointwise confidence intervals, based on likelihood ratio tests using the restricted and unrestricted MLE in the current status model, are introduced. We extend the method to the trea

  15. Nonparametric confidence intervals for monotone functions

    NARCIS (Netherlands)

    Groeneboom, P.; Jongbloed, G.

    2015-01-01

    We study nonparametric isotonic confidence intervals for monotone functions. In [Ann. Statist. 29 (2001) 1699–1731], pointwise confidence intervals, based on likelihood ratio tests using the restricted and unrestricted MLE in the current status model, are introduced. We extend the method to the

  16. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    … parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models, but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we …

  17. Asymmetry Effects in Chinese Stock Markets Volatility: A Generalized Additive Nonparametric Approach

    OpenAIRE

    Hou, Ai Jun

    2007-01-01

    The unique characteristics of the Chinese stock markets make it difficult to assume a particular distribution for innovations in returns and the specification form of the volatility process when modeling return volatility with the parametric GARCH family models. This paper therefore applies a generalized additive nonparametric smoothing technique to examine the volatility of the Chinese stock markets. The empirical results indicate that an asymmetric effect of negative news exists in the Chin...

  18. Detection of Decreasing Vegetation Cover Based on Empirical Orthogonal Function and Temporal Unmixing Analysis

    Directory of Open Access Journals (Sweden)

    Di Xu

    2017-01-01

    Vegetation plays an important role in the energy exchange of the land surface, biogeochemical cycles, and hydrological cycles. The MODIS (MODerate-resolution Imaging Spectroradiometer) EVI (Enhanced Vegetation Index) is considered a quantitative indicator for examining dynamic vegetation changes. This paper applies a new method integrating empirical orthogonal function (EOF) analysis and temporal unmixing analysis (TUA) to detect decreasing vegetation cover in Jiangsu Province, China. The EOF statistical results provide the decreasing/increasing vegetation trend as prior information for the temporal unmixing analysis. The TUA results then reveal the dominant spatial distribution of decreasing vegetation. The results show that decreasing vegetation areas in Jiangsu are distributed in the suburbs and newly constructed areas. For validation, the decreasing vegetation cover is revealed by linear spectral mixture analysis of Landsat data in three selected cities. Pixels of decreasing vegetation areas are also calculated from land use maps for 2000 and 2010. The accuracy of the integrated EOF and temporal unmixing analysis method is about 83.14%. This method can be applied to detect vegetation change in large, rapidly urbanizing areas.
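
    EOF analysis itself is just a singular value decomposition of the space-time anomaly matrix; the sketch below shows the decomposition and the leading principal component trend that would serve as prior information for temporal unmixing, on a random stand-in for the EVI stack.

        import numpy as np

        rng = np.random.default_rng(9)
        time, space = 120, 400                  # e.g. monthly EVI over 400 pixels
        field = rng.normal(size=(time, space))  # hypothetical stand-in for the EVI stack

        # EOF analysis = SVD of the (time x space) anomaly matrix
        anom = field - field.mean(axis=0)
        U, s, Vt = np.linalg.svd(anom, full_matrices=False)

        eofs = Vt                    # spatial patterns (EOFs)
        pcs = U * s                  # principal-component time series
        var_frac = s**2 / np.sum(s**2)
        print("variance explained by first 3 EOFs:", np.round(var_frac[:3], 3))

        # The sign of the leading PC's trend indicates overall greening/browning,
        # the kind of prior information the paper feeds to temporal unmixing
        trend = np.polyfit(np.arange(time), pcs[:, 0], 1)[0]
        print("linear trend of PC1:", trend)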

  19. New method for nonlinear and nonstationary time series analysis: empirical mode decomposition and Hilbert spectral analysis

    Science.gov (United States)

    Huang, Norden E.

    2000-04-01

    A new method for analyzing nonlinear and nonstationary data has been developed. The key part of the method is the Empirical Mode Decomposition method, with which any complicated data set can be decomposed into a finite and often small number of Intrinsic Mode Functions (IMF). An IMF is defined as any function having the same numbers of zero-crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima, respectively. The IMF also admits a well-behaved Hilbert transform. This decomposition method is adaptive and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and nonstationary processes. With the Hilbert transform, the IMFs yield instantaneous frequencies as functions of time that give sharp identifications of embedded structures. The final presentation of the result is an energy-frequency-time distribution, designated as the Hilbert Spectrum. Comparisons with Wavelet and windowed Fourier analysis show the new method offers much better temporal and frequency resolutions.
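
    The core of EMD is the sifting loop: interpolate envelopes through the local maxima and minima, subtract their mean, and repeat until an IMF emerges. Below is a minimal sketch, with simplified boundary handling and stopping rules relative to Huang's full algorithm:

```python
import numpy as np
from scipy.signal import find_peaks, hilbert
from scipy.interpolate import CubicSpline

def sift(x, n_iter=10):
    """Extract one intrinsic mode function (IMF) from the signal x."""
    t = np.arange(len(x))
    h = x.astype(float).copy()
    for _ in range(n_iter):
        peaks, _ = find_peaks(h)
        troughs, _ = find_peaks(-h)
        if len(peaks) < 3 or len(troughs) < 3:
            break                                    # too few extrema to continue
        upper = CubicSpline(peaks, h[peaks])(t)      # upper envelope
        lower = CubicSpline(troughs, h[troughs])(t)  # lower envelope
        h = h - (upper + lower) / 2.0                # remove the local mean
    return h

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
imf1 = sift(x)               # approximates the fast 40 Hz component
residue = x - imf1           # sifting this again would yield further IMFs

# Hilbert spectral step: instantaneous frequency of the extracted IMF.
phase = np.unwrap(np.angle(hilbert(imf1)))
inst_freq = np.diff(phase) / (2 * np.pi * (t[1] - t[0]))
```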

  20. Combining forecasts in short term load forecasting: Empirical analysis and identification of robust forecaster

    Indian Academy of Sciences (India)

    YOGESH K BICHPURIYA; S A SOMAN; A SUBRAMANYAM

    2016-10-01

    We present an empirical analysis to show that combination of short term load forecasts leads to better accuracy. We also discuss other aspects of combination, i.e., distribution of weights, effect of variation in the historical window and distribution of forecast errors. The distribution of forecast errors is analyzed in order to get a robust forecast. We define a robust forecaster as one which has consistency in forecast accuracy, fewer shocks (outliers) and lower standard deviation in the distribution of forecast errors. We propose a composite ranking (CRank) scheme based on a composite score which considers three performance measures—standard deviation, kurtosis of the distribution of forecast errors and accuracy of forecasts. The CRank helps in the identification of a robust forecaster given a choice of individual and combined forecasters. The empirical analysis has been done with real-life data sets of two distribution companies in India.
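
    A composite score of this kind is straightforward to compute. The sketch below ranks forecasters on error standard deviation, error kurtosis, and MAPE and sums the ranks; the equal weighting and the toy numbers are assumptions, since the paper's exact scoring rule is not reproduced here.

```python
import numpy as np
from scipy.stats import kurtosis, rankdata

def crank(error_series, mape):
    """Rank forecasters by a composite of three performance measures."""
    stds = np.array([np.std(e) for e in error_series])
    kurts = np.array([kurtosis(e) for e in error_series])
    # Lower std, lower kurtosis (fewer outlier shocks), lower MAPE = better.
    score = rankdata(stds) + rankdata(kurts) + rankdata(mape)
    return np.argsort(score)            # most robust forecaster first

rng = np.random.default_rng(1)
errors = [rng.normal(0, s, 200) for s in (1.0, 1.5, 0.8)]  # three forecasters
mapes = [3.2, 2.9, 3.0]                                    # toy accuracy, in %
print(crank(errors, mapes))             # indices ordered from most robust
```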

  1. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    OpenAIRE

    Yunqiu Liang

    2013-01-01

    This study empirically examines the relationship between college teachers' sports participation and their job satisfaction, drawing on survey data about participation in group sports activities that were analyzed in SPSS using analysis of variance. The results show that teachers who participate in sports groups report higher job satisfaction than those who do not, and that job satisfaction differs across forms of sports participation. Recommendations for college teachers to address...

  2. Establishment of Grain Farmers' Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    OpenAIRE

    Zhang, Shuang

    2012-01-01

    Based on farmers' supply behavior theory and price expectations theory, this paper establishes a supply response model for grain farmers covering two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy can have a price-oriented effect on grain production and supply in the major producing areas. Empirical analysis shows that the minimum purchase price published annually by the government has a significant positive imp...

  3. EMPIRICAL VERIFICATION OF ANISOTROPIC HYDRODYNAMIC TRAFFIC MODEL IN TRAFFIC ANALYSIS AT INTERSECTIONS IN URBAN AREA

    Institute of Scientific and Technical Information of China (English)

    WEI Yan-fang; GUO Si-ling; XUE Yu

    2007-01-01

    In this article, a traffic hydrodynamic model considering the driver's reaction time was applied to traffic analysis at intersections on real roads. In the numerical simulation with the model, the pinch effect of the right-turning vehicle flow was found, which mainly leads to traffic jams in the straight lane. All of the results, being in accordance with the empirical data, confirm the applicability of this model.

  4. SEARCH COST REDUCTION INCREASES VARIATION IN HOTELS OCCUPANCY RATE: A THEORETICAL AND EMPIRICAL ANALYSIS

    OpenAIRE

    Marianna Succurro; Federico Boffa

    2010-01-01

    This study explores how direct online booking affects the variation in hotel bed-places occupancy rate between peak and off-peak periods, thereby contributing to three strands of literature, respectively the determinants of seasonality, the tourist information acquisition process and the impact of the internet on tourism. The empirical analysis, covering 18 countries over the 1997-2007 period, investigates the impact of an increase in the use of the internet by consumers on the seasonal varia...

  5. Are Public-Private Partnerships a Source of Greater Efficiency in Water Supply? Results of a Non-Parametric Performance Analysis Relating to the Italian Industry

    Directory of Open Access Journals (Sweden)

    Corrado lo Storto

    2013-12-01

    Full Text Available This article reports the outcome of a performance study of the water service provision industry in Italy. The study evaluates the efficiency of 21 “private or public-private” equity and 32 “public” equity water service operators and investigates controlling factors. In particular, the influence that the operator typology and the nature of service management - private vs. public - have on efficiency is assessed. The study employed a two-stage Data Envelopment Analysis methodology. In the first stage, the operational efficiency of water supply operators is calculated by implementing a conventional BCC DEA model that uses both physical infrastructure and financial input and output variables to explore economies of scale. In the second stage, bootstrapped DEA and Tobit regression are performed to estimate the influence that a number of environmental factors have on water supplier efficiency. The results show that the integrated water provision industry in Italy is characterized by operational inefficiencies of service operators, and that scale and agglomeration economies may have a non-negligible effect on efficiency. In addition, the operator typology and its geographical location affect efficiency.
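
    The first-stage efficiency scores in such a study come from solving one small linear program per operator. Below is a hedged sketch of an input-oriented BCC (variable returns to scale) DEA model with SciPy, on toy data rather than the paper's Italian sample, omitting the bootstrap and Tobit second stage:

```python
import numpy as np
from scipy.optimize import linprog

def bcc_efficiency(X, Y, j0):
    """Input-oriented BCC DEA efficiency of unit j0. X: inputs x units, Y: outputs x units."""
    m, n = X.shape                   # m inputs, n decision-making units
    s = Y.shape[0]                   # s outputs
    c = np.r_[1.0, np.zeros(n)]      # minimize theta; variables are [theta, lambda]
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, j0]          # sum_j lambda_j * x_ij <= theta * x_i,j0
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                # sum_j lambda_j * y_rj >= y_r,j0
    b_ub[m:] = -Y[:, j0]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS convexity: sum lambda = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]                  # efficiency score in (0, 1]

rng = np.random.default_rng(2)
X = rng.uniform(1, 10, (2, 8))       # e.g., network length, operating costs (toy)
Y = rng.uniform(1, 10, (1, 8))       # e.g., water delivered (toy)
scores = [bcc_efficiency(X, Y, j) for j in range(8)]
```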

  6. [Nonparametric method of estimating survival functions containing right-censored and interval-censored data].

    Science.gov (United States)

    Xu, Yonghong; Gao, Xiaohuan; Wang, Zhengxi

    2014-04-01

    Missing data represent a general problem in many scientific fields, especially in medical survival analysis. In dealing with censored data, interpolation is one of the important methods. However, most interpolation methods replace the censored data with exact data, which distorts the real distribution of the censored data and reduces the probability of the real data falling into the interpolation data. In order to solve this problem, we propose a nonparametric method of estimating the survival function of right-censored and interval-censored data and compare its performance to the SC (self-consistent) algorithm. Compared to the average interpolation and the nearest neighbor interpolation methods, the proposed method replaces the right-censored data with interval-censored data, and greatly improves the probability of the real data falling into the imputation interval. It then draws on empirical distribution theory to estimate the survival function of right-censored and interval-censored data. The results of numerical examples and a real breast cancer data set demonstrated that the proposed method had higher accuracy and better robustness for different proportions of censored data. This paper provides a good method to compare the performance of clinical treatments through estimation of the survival data of the patients. This provides some help to medical survival data analysis.

  7. A Non-parametric Approach to the Overall Estimate of Cognitive Load Using NIRS Time Series.

    Science.gov (United States)

    Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi

    2017-01-01

    We present a non-parametric approach to prediction of the n-back n ∈ {1, 2} task as a proxy measure of mental workload using Near Infrared Spectroscopy (NIRS) data. In particular, we focus on measuring the mental workload through hemodynamic responses in the brain induced by these tasks, thereby realizing the potential that they can offer for their detection in real world scenarios (e.g., difficulty of a conversation). Our approach takes advantage of the intrinsic linearity of the components of the NIRS time series to adopt a one-step regression strategy. We demonstrate the correctness of our approach through its mathematical analysis. Furthermore, we study the performance of our model in an inter-subject setting in contrast with state-of-the-art techniques in the literature to show a significant improvement on prediction of these tasks (82.50 and 86.40% for female and male participants, respectively). Moreover, our empirical analysis suggests a gender difference effect on the performance of the classifiers (with male data exhibiting a higher non-linearity) along with left-lateralized activation in both genders, with higher specificity in females.

  8. Nonparametric tests for pathwise properties of semimartingales

    CERN Document Server

    Cont, Rama; 10.3150/10-BEJ293

    2011-01-01

    We propose two nonparametric tests for investigating the pathwise properties of a signal modeled as the sum of a Lévy process and a Brownian semimartingale. Using a nonparametric threshold estimator for the continuous component of the quadratic variation, we design a test for the presence of a continuous martingale component in the process and a test for establishing whether the jumps have finite or infinite variation, based on observations on a discrete-time grid. We evaluate the performance of our tests using simulations of various stochastic models and use the tests to investigate the fine structure of the DM/USD exchange rate fluctuations and SPX futures prices. In both cases, our tests reveal the presence of a non-zero Brownian component and a finite variation jump component.

  9. Nonparametric Transient Classification using Adaptive Wavelets

    CERN Document Server

    Varughese, Melvin M; Stephanou, Michael; Bassett, Bruce A

    2015-01-01

    Classifying transients based on multi-band light curves is a challenging but crucial problem in the era of GAIA and LSST, since the sheer volume of transients will make spectroscopic classification unfeasible. Here we present a nonparametric classifier that uses the transient's light curve measurements to predict its class given training data. It implements two novel components: the first is the use of the BAGIDIS wavelet methodology - a characterization of functional data using hierarchical wavelet coefficients. The second novelty is the introduction of a ranked probability classifier on the wavelet coefficients that handles both the heteroscedasticity of the data and the potential non-representativity of the training set. The ranked classifier is simple and quick to implement, while a major advantage of the BAGIDIS wavelets is that they are translation invariant; hence they do not need the light curves to be aligned to extract features. Further, BAGIDIS is nonparametric so it can be used for blind ...

  10. Application analysis of empirical mode decomposition and phase space reconstruction in dam time-varying characteristic

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In view of some issues in the processing of time-varying characteristics in the analysis of dam deformation, the paper proposes a new method to analyze the dam time-varying characteristic based on empirical mode decomposition and phase space reconstruction theory. First of all, to reduce the influence of human factors on the traditional statistical model and to ensure the analysis accuracy, response variables of the time-varying characteristic are obtained by means of empirical mode decomposition; then, a phase plane of those variables is reconstructed to investigate their processing rules. These methods have already been applied to an actual project, and the results showed that data interpretation with the assistance of empirical mode decomposition and phase space reconstruction is effective in analyzing the perturbations of response variables, explicit in reflecting the entire development process, and valid for obtaining the evolution rules of the time-varying characteristic. This methodology is a powerful technical support for people to further master the rules of dam operation.

  11. Development of phonological awareness in Down syndrome: A meta-analysis and empirical study.

    Science.gov (United States)

    Næss, Kari-Anne B

    2016-02-01

    Phonological awareness (PA) is the knowledge and understanding of the sound structure of language and is believed to be an important skill for the development of reading. This study explored PA skills in children with Down syndrome and matched typically developing (TD) controls using a dual approach: a meta-analysis of the existing international literature and a longitudinal empirical study. The results from both the meta-analysis and the empirical study showed that the children with Down syndrome initially had weaker PA skills compared to the controls; in particular, the awareness of rhyme was delayed. The longitudinal empirical data indicated that, as a result of formal education, the children with Down syndrome exhibited greater improvement on all PA measures compared with the controls who had not yet entered school. The results reached significance for rhyme awareness. With respect to dimensionality, the performance of the children with Down syndrome loaded on one factor, whereas the performance of the younger TD controls was multidimensional. In sum, these findings underline the need for studies that compare interventions designed especially to stimulate development of PA in this group of children and to provide insight into the underlying causes of the developmental profile of children with Down syndrome. PsycINFO Database Record (c) 2016 APA, all rights reserved.

  12. Bayesian nonparametric estimation for Quantum Homodyne Tomography

    OpenAIRE

    Naulet, Zacharie; Barat, Eric

    2016-01-01

    We estimate the quantum state of a light beam from results of quantum homodyne tomography noisy measurements performed on identically prepared quantum systems. We propose two Bayesian nonparametric approaches. The first approach is based on mixture models and is illustrated through simulation examples. The second approach is based on random basis expansions. We study the theoretical performance of the second approach by quantifying the rate of contraction of the posterior distribution around ...

  13. Canonical Least-Squares Monte Carlo Valuation of American Options: Convergence and Empirical Pricing Analysis

    Directory of Open Access Journals (Sweden)

    Xisheng Yu

    2014-01-01

    Full Text Available The paper by Liu (2010) introduces a method termed the canonical least-squares Monte Carlo (CLM), which combines a martingale-constrained entropy model and a least-squares Monte Carlo algorithm to price American options. In this paper, we first provide the convergence results of CLM and numerically examine the convergence properties. Then, a comparative analysis is empirically conducted using a large sample of S&P 100 Index (OEX) puts and IBM puts. The results on the convergence show that choosing the shifted Legendre polynomials with four regressors is more appropriate considering the pricing accuracy and the computational cost. With this choice, the CLM method is empirically demonstrated to be superior to the benchmark methods of binomial tree and finite difference with historical volatilities.

  14. Theoretical and empirical applications of petroleum production function framework for analysis of the Phenomenon of Plenty

    Directory of Open Access Journals (Sweden)

    Sergey Uzhegov

    2011-09-01

    Full Text Available The current study examines how analysis of the Phenomenon of Plenty, the paradox of economic underperformance of resource-rich nations, can benefit from theoretical and empirical application of the suggested petroleum production function framework, based on a sample of oil-abundant countries of the Commonwealth of Independent States, in particular Russia, Azerbaijan, and Kazakhstan. The proposed approach displays the capacity of an oil-economy production function to shed light on a larger scope of theoretical issues. Empirical testing of the suggested theoretical framework exhibited the ability of proxied components of the devised production function, capturing main metrics of the Phenomenon of Plenty and additionally factoring in corruption, to exert a strong impact on the majority of twelve principal macroeconomic indicators monitored by CIS supra-national institutions: with the most pronounced influence on gross domestic product, industrial production, capital investments, and export to CIS countries.

  15. Relative prices and economic development: an analysis of the empirical evidence

    Directory of Open Access Journals (Sweden)

    P. ERCOLANI

    2013-12-01

    Full Text Available The paper examines empirical evidence on the evolution of relative prices and the long-run differences between countries with different levels of per-capita income. The price indicators employed are derived from national accounts aggregates, with the aim of drawing useful information on the relationship between changes in the structure of production and economic development. After a brief review of the literature, long-term data for eight developed countries are examined, as well as cross-sectional data from 1975 for a large group of countries with different levels of development. Some limitations of previous analyses are then presented, and direct indications are advanced to explain the empirical evidence. Finally, the author highlights the consequences that international differences in relative prices entail in the cross-section analysis of the sectoral distribution of production.

  16. Kernel empirical orthogonal function analysis of 1992-2008 global sea surface height anomaly data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Andersen, Ole Baltazar; Knudsen, Per

    2009-01-01

    This paper describes a kernel version of empirical orthogonal function (EOF) analysis and its application to detect patterns of interest in global monthly mean sea surface height (SSH) anomalies from satellite altimetry acquired during the last 17 years. EOF analysis like principal component...... to large scale ocean currents and particularly to the pulsing of the El Niño/Southern Oscillation. Large scale ocean events associated with the El Niño/Southern Oscillation related signals are conveniently concentrated in the first SSH EOF modes. A major difference between the classical linear EOF...

  17. A Price Index Model for Road Freight Transportation and Its Empirical analysis in China

    Directory of Open Access Journals (Sweden)

    Liu Zhishuo

    2017-01-01

    Full Text Available The aim of a price index for road freight transportation (RFT) is to reflect changes of price in the road transport market. Firstly, a price index model for RFT based on sample data from the Alibaba logistics platform is built. This model is a three-level index system comprising a total index, classification indices and individual indices, and the Laspeyres method is applied to calculate these indices. Finally, an empirical analysis of the price index for the RFT market in Zhejiang Province is performed. In order to demonstrate the correctness and validity of the index model, a comparative analysis with port throughput and the PMI index is carried out.

  18. portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    mahsa ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is decision making about selecting an appropriate stock exchange for investing and choosing an optimal portfolio. This process is done through risk and expected return assessment. In the portfolio selection problem, if the assets' expected returns are normally distributed, variance and standard deviation are used as risk measures. But the expected returns on assets are not necessarily normal and sometimes differ dramatically from the normal distribution. This paper, introducing conditional value at risk (CVaR) as a measure of risk in a nonparametric framework, offers the optimal portfolio for a given expected return, and this method is compared with the linear programming method. The data used in this study consist of monthly returns of 15 companies selected from the top 50 companies on the Tehran Stock Exchange during the winter of 1392, with the period considered running from April of 1388 to June of 1393 (Iranian calendar). The results of this study show the superiority of the nonparametric method over the linear programming method; the nonparametric method is also much faster than the linear programming method.
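
    With historical (scenario) returns, the minimum-CVaR portfolio can be found with the Rockafellar-Uryasev linear program, which needs no distributional assumption. Below is a sketch with simulated data standing in for the Tehran Stock Exchange sample; the confidence level, target return, and toy returns are assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_weights(R, beta=0.95, r_target=0.01):
    """Minimize CVaR_beta over long-only weights with a mean-return floor."""
    T, n = R.shape                        # T return scenarios, n assets
    k = 1.0 / ((1 - beta) * T)
    c = np.r_[np.zeros(n), 1.0, k * np.ones(T)]      # variables: [w, alpha, u]
    A_ub = np.zeros((T + 1, n + 1 + T))
    A_ub[:T, :n] = -R                     # u_t >= -r_t . w - alpha
    A_ub[:T, n] = -1.0
    A_ub[:T, n + 1:] = -np.eye(T)
    A_ub[T, :n] = -R.mean(axis=0)         # mean return >= r_target
    b_ub = np.r_[np.zeros(T), -r_target]
    A_eq = np.r_[np.ones(n), 0.0, np.zeros(T)].reshape(1, -1)  # fully invested
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * T
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    return res.x[:n], res.fun             # optimal weights, minimized CVaR

rng = np.random.default_rng(3)
R = rng.normal(0.01, 0.06, (60, 15))      # 60 monthly returns for 15 stocks (toy)
w, cvar = min_cvar_weights(R)
```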

  19. Analysis of acquisition patterns : A theoretical and empirical evaluation of alternative methods

    NARCIS (Netherlands)

    Paas, LJ; Molenaar, IW

    2005-01-01

    The order in which consumers acquire nonconsumable products, such as durable and financial products, provides key information for marketing activities, for example, cross-sell lead generation. This paper advocates the desirable features of nonparametric scaling for analyzing acquisition patterns. We

  1. A regional comparative analysis of empirical and theoretical flood peak-volume relationships

    Directory of Open Access Journals (Sweden)

    Szolgay Ján

    2016-12-01

    Full Text Available This paper analyses the bivariate relationship between flood peaks and corresponding flood event volumes modelled by empirical and theoretical copulas in a regional context, with a focus on flood generation processes in general, their regional differentiation and the effect of the sample size on reliable discrimination among models. A total of 72 catchments in the north-west of Austria are analysed for the period 1976–2007. From the hourly runoff data set, 25 697 flood events were isolated and assigned to one of three flood process types: synoptic floods (including long- and short-rain floods), flash floods, or snowmelt floods (both rain-on-snow and snowmelt floods). The first step of the analysis examines whether the empirical peak-volume copulas of different flood process types are regionally statistically distinguishable, separately for each catchment, and the role of the sample size in the strength of the statements. The results indicate that the empirical copulas of flash floods tend to be different from those of the synoptic and snowmelt floods. The second step examines how similar the empirical flood peak-volume copulas are between catchments for a given flood type across the region. Empirical copulas of synoptic floods are the least similar between the catchments; however, with the decrease of the sample size the difference between the performances of the process types becomes small. The third step examines the goodness-of-fit of different commonly used copula types to the data samples that represent the annual maxima of flood peaks and the respective volumes, both regardless of flood generating processes (the traditional engineering approach) and also considering the three process-based classes. Extreme value copulas (Galambos, Gumbel and Hüsler-Reiss) show the best performance both for synoptic and flash floods, while the Frank copula shows the best performance for snowmelt floods. It is concluded that there is merit in treating flood
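
    For a one-parameter extreme value copula such as the Gumbel, a simple fitting route is to invert Kendall's tau (theta = 1/(1 - tau)) computed from the peak-volume pairs. Below is a sketch on simulated events; the toy data and the single grid-point check are illustrative, and a full goodness-of-fit study as in the paper would go further:

```python
import numpy as np
from scipy.stats import kendalltau, rankdata

rng = np.random.default_rng(4)
Q = rng.gumbel(100, 30, 300)               # flood event peaks (toy data)
V = 0.8 * Q + rng.gamma(5, 10, 300)        # correlated event volumes (toy data)

# Pseudo-observations: map the margins to (0,1) nonparametrically via ranks.
u = rankdata(Q) / (len(Q) + 1)
v = rankdata(V) / (len(V) + 1)

tau, _ = kendalltau(Q, V)
theta = 1.0 / (1.0 - tau)                  # Gumbel copula parameter

def gumbel_copula_cdf(u, v, theta):
    """C(u, v) for the Gumbel extreme value copula."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta)))

# Compare the fitted copula with the empirical copula at one grid point:
emp = np.mean((u <= 0.9) & (v <= 0.9))
print(gumbel_copula_cdf(0.9, 0.9, theta), emp)
```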

  2. A LONGITUDINAL ANALYSIS REGARDING THE EVOLUTION OF PROFIT TAX REGULATIONS IN ROMANIA AN EMPIRICAL VIEW

    Directory of Open Access Journals (Sweden)

    Albu Nadia

    2011-07-01

    Full Text Available The study conducts a longitudinal analysis of Romanian profit tax regulations. Beginning with the first profit tax regulation implemented in 1991 and continuing until now, we analyzed, based on an empirical approach, all changes that have occurred over time in the Romanian accounting environment. The motivation of the study was the strong relationship between accounting and taxation in the Romanian accounting environment over time, the profit tax being one of the main items of this relation. The study is divided into five sections. After a short introduction and a presentation of the motivation of the study (section 1), in section 2 we conducted the literature review based on international and national studies regarding profit tax regulations through the relationship between accounting and taxation. Section 3 presents a brief review of the main Romanian regulations concerning the profit tax and the most important changes that have occurred over time. In section 4 we conducted the empirical analysis. In this section a series of analyses is carried out, aiming at the following: (1) the total number of regulations that have amended the main regulations presented in the previous section; (2) the type of amendments implemented over regulations (abolishment, text amendment, adding new articles or alignments); (3) the total number of amendments approved by law without modifications, respectively the total number of amendments approved in the Official Journal through Government Ordinance or Emergency Ordinance and unapproved by law. The empirical analysis documented that the main shortcoming associated with the profit tax regulation is caused by the multiple changes to which the 5 main profit tax regulations have been subject. The last section (section 5) presents the conclusions of the study. As the main conclusion, the profit tax regulation is stable only in terms of the small number of main regulations, the large number

  3. Biofuels policy and the US market for motor fuels: Empirical analysis of ethanol splashing

    Energy Technology Data Exchange (ETDEWEB)

    Walls, W.D., E-mail: wdwalls@ucalgary.ca [Department of Economics, University of Calgary, 2500 University Drive NW, Calgary, Alberta, T2N 1N4 (Canada); Rusco, Frank; Kendix, Michael [US GAO (United States)

    2011-07-15

    Low ethanol prices relative to the price of gasoline blendstock, and tax credits, have resulted in discretionary blending at wholesale terminals of ethanol into fuel supplies above required levels-a practice known as ethanol splashing in industry parlance. No one knows precisely where or in what volume ethanol is being blended with gasoline and this has important implications for motor fuels markets: Because refiners cannot perfectly predict where ethanol will be blended with finished gasoline by wholesalers, they cannot know when to produce and where to ship a blendstock that when mixed with ethanol at 10% would create the most economically efficient finished motor gasoline that meets engine standards and has comparable evaporative emissions as conventional gasoline without ethanol blending. In contrast to previous empirical analyses of biofuels that have relied on highly aggregated data, our analysis is disaggregated to the level of individual wholesale fuel terminals or racks (of which there are about 350 in the US). We incorporate the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal. The empirical analysis illustrates how ethanol and gasoline prices affect ethanol usage, controlling for fuel specifications, blend attributes, and city-terminal-specific effects that, among other things, control for differential costs of delivering ethanol from bio-refinery to wholesale rack. - Research Highlights: > Low ethanol prices and tax credits have resulted in discretionary blending of ethanol into fuel supplies above required levels. > This has important implications for motor fuels markets and vehicular emissions. > Our analysis incorporates the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city

  4. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    Science.gov (United States)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.

  5. Performances and Spending Efficiency in Higher Education: A European Comparison through Non-Parametric Approaches

    Science.gov (United States)

    Agasisti, Tommaso

    2011-01-01

    The objective of this paper is an efficiency analysis concerning higher education systems in European countries. Data have been extracted from OECD data-sets (Education at a Glance, several years), using a non-parametric technique--data envelopment analysis--to calculate efficiency scores. This paper represents the first attempt to conduct such an…

  6. Empirical Analysis of the Integration Activity of Business Structures in the Regions of Russia

    Directory of Open Access Journals (Sweden)

    Maria Gennadyevna Karelina

    2015-12-01

    Full Text Available The article investigates the integration activity of business structures in the regions of Russia. A wide variety of approaches to the study of the problems and prospects of economic integration, and the current dispute on the role of integration processes in regional economic development, have determined the need to define the concepts “integration” and “integration activity” in order to develop objective conditions for analysing the integration activity of business structures in the Russian regions. The monitoring of the current legal system of the Russian Federation carried out in the area of statistics and the compilation of statistical databases on mergers and acquisitions showed the absence of a formal executive authority dealing with the compilation and collection of information on integration activity at the regional level. The information and analytical base was therefore compiled from the data of Russian information and analytical agencies. As research tools, methods of analysis of structural changes, methods of analysis of economic differentiation and concentration, and methods of non-parametric statistics are used. The article shows the close relationship between the social and economic development of the subjects of Russia and the integrated business structures functioning on their territory. An investigation of the structure and dynamics of integration activity in the subjects of the Russian Federation based on statistical data for the period from 2003 to 2012 revealed the increasing heterogeneity of the integration activity of business structures in the regions of Russia. The hypothesis of a substantial divergence of mergers and acquisitions of corporate structures in the Russian regions was confirmed by the high values of the Gini coefficient, the Herfindahl index, and the decile coefficient of differentiation. The research results are of practical importance since they can be used to improve the existing

  7. An Empirical Analysis of the Changing Role of the German Bundesbank after 1983

    DEFF Research Database (Denmark)

    Juselius, Katarina

    A cointegrated VAR model describing a small macroeconomic system consisting of money, income, prices, and interest rates is estimated on split sample data before and after 1983. The monetary mechanisms were found to be significantly different. Before 1983, the money supply seemed controllable...... and expansion or contraction of money supply had the expected effect on prices, income, and interest rates. After 1983, the conventional mechanisms no longer seemed to work. The empirical analysis pointed to the crucial role of the bond rate in the system, particularly for the more recent period...

  8. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases. Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual......

  9. Empirical Requirements Analysis for Mars Surface Operations Using the Flashline Mars Arctic Research Station

    Science.gov (United States)

    Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)

    2001-01-01

    Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy, without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in analog (Mars-like) settings to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the High Canadian Arctic.

  10. An Empirical Analysis of Farmers’ Rabbit Breeds Purchase and Its Influencing Factors

    Institute of Scientific and Technical Information of China (English)

    Yuhe; SONG; Laping; WU

    2014-01-01

    In this paper, based on survey data on farmers in 14 provinces and cities nationwide provided by the China Rabbit Research System, we analyze farmers' rabbit breed selection, purchase channels and the demand for new varieties of rabbits, as well as the problems encountered in the course of using the breeds. We make an empirical analysis of the factors influencing farmers' demand for rabbit breeds, and put forth recommendations on farmers' use of rabbit breeds and on improving the promotion of new varieties of rabbits.

  11. Coding and data analysis during qualitative empirical research in Practical Theology

    Directory of Open Access Journals (Sweden)

    Petria M. Theron

    2015-02-01

    Full Text Available I dedicate this article to Prof. George Lotter, who has been instrumental in the formation of more than 90 postgraduate students in practical theological studies at the North-West University (NWU). Under his guidance, a significant amount of empirical research has been conducted. This is in line with a movement among scholars, both national and international, towards a more empirical approach in Practical Theology. It is therefore indispensable that both lecturers and students in Practical Theology should further develop their empirical research capacities. In this article, a more systematic approach during the coding and data analysis phase of qualitative research is argued for, and the article concludes with a proposed model for coding and data analysis in practical theological studies.

  12. Comparison of Electric Utility Restructuring in Major Foreign Countries: Empirical Analysis on the Pool Structure

    Energy Technology Data Exchange (ETDEWEB)

    Yun, W.C. [Korea Energy Economics Institute, Euiwang (Korea)

    2001-12-01

    The object of this study is to suggest policy directions for the market structure and trading system of a TWBP (two-way bidding pool) under the ongoing electric power restructuring scheme in Korea. For this purpose, major foreign countries' restructuring experiences, especially their market forms and trading systems, were reviewed. Based on the review, I suggested the introduction of electricity contract markets and short- and long-term policy alternatives for adopting them. In order to test the feasibility of the policy alternatives, an empirical analysis was performed on the comparative advantage between a gross pool and a net pool. The data used in the empirical analysis include simulation results for the year 2000 from KPX (Korea Power Exchange), six generators' fuel costs, etc. A supply function model was modified into an operational model to calculate the generators' bidding prices and quantities, and their variable profits. In addition, the effects of introducing contracts for differences (CfDs) and bilateral contracts were analyzed in terms of market efficiency. In this study, the criteria of market efficiency are the means and the standard deviations of generators' bidding prices and their variable profits. Based on the empirical results, the following policy implications were proposed. As a short-term alternative, it would be necessary to incorporate the concept of hedging contracts into the current gross pool mechanism. In this connection, the promotion of contract markets should be used to induce lower general levels of bidding prices and to manage the volatility of market prices. As a long-term alternative, the possibility of transforming the gross pool into a net pool should be considered. (author). 53 refs., 32 figs., 38 tabs.

  13. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are pointed out, and methods to prevent bias are presented. The techniques are evaluated by comparing their speed and accuracy on the simple case of estimating auto-correlation functions for the response of a single degree-of-freedom system loaded with white noise.

  14. Nonparametric inferences for kurtosis and conditional kurtosis

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-heng; HE You-hua

    2009-01-01

    Under the assumption of a strictly stationary process, this paper proposes a nonparametric model to test the kurtosis and conditional kurtosis of risk time series. We apply this method to the daily returns of the S&P500 index and the Shanghai Composite Index, and simulate GARCH data to verify the efficiency of the presented model. Our results indicate that the risk series distribution is heavy-tailed, but the historical information can make its future distribution light-tailed. However, the tails of the far-future distribution are little affected by the historical data.

  15. Preliminary results on nonparametric facial occlusion detection

    Directory of Open Access Journals (Sweden)

    Daniel LÓPEZ SÁNCHEZ

    2016-10-01

    Full Text Available The problem of face recognition has been extensively studied in the available literature; however, some aspects of this field require further research. The design and implementation of face recognition systems that can efficiently handle unconstrained conditions (e.g. pose variations, illumination, partial occlusion...) is still an area under active research. This work focuses on the design of a new nonparametric occlusion detection technique. In addition, we present some preliminary results that indicate that the proposed technique might be useful to face recognition systems, allowing them to dynamically discard occluded face parts.

  16. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbines. The new information, coming from external and condition monitoring, can be used for direct updating of the stochastic variables through a non-parametric Bayesian updating approach and can be integrated in the reliability analysis by a third-order polynomial chaos expansion approximation. Although classical Bayesian updating approaches are often used because of their parametric formulation, non-parametric approaches are better alternatives for multi-parametric updating with a non-conjugate formulation. The results in this paper show the influence on the time-dependent updated reliability when non-parametric and classical Bayesian approaches are used. Further, the influence on the reliability of the number of updated parameters is illustrated.

  17. NONPARAMETRIC FIXED EFFECT PANEL DATA MODELS: RELATIONSHIP BETWEEN AIR POLLUTION AND INCOME FOR TURKEY

    Directory of Open Access Journals (Sweden)

    Rabia Ece OMAY

    2013-06-01

    Full Text Available In this study, the relationship between gross domestic product (GDP) per capita and sulfur dioxide (SO2) and particulate matter (PM10) per capita is modeled for Turkey. Nonparametric fixed effect panel data analysis is used for the modeling. The panel data cover 12 territories, at the first level of the Nomenclature of Territorial Units for Statistics (NUTS), for the period 1990-2001. In modeling the relationship between GDP and SO2 and PM10 for Turkey, the non-parametric models have given good results.
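
    The smoothing step underlying such nonparametric models can be illustrated with a simple Nadaraya-Watson kernel regression (cross-sectional here, without the fixed effects); the Gaussian kernel, bandwidth, and simulated inverted-U pollution-income data are assumptions:

```python
import numpy as np

def nw_regress(x0, x, y, h):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # kernel weights around x0
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(7)
gdp = rng.uniform(1, 20, 150)                               # per-capita GDP (toy)
so2 = 5 + 2 * gdp - 0.08 * gdp**2 + rng.normal(0, 1, 150)   # inverted-U shape (toy)

grid = np.linspace(1, 20, 50)
fit = [nw_regress(g, gdp, so2, h=1.5) for g in grid]         # smoothed curve
```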

  18. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    Science.gov (United States)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues have an increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) of implementing GSCM practices have been identified from a literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. 123 completed questionnaires were collected from Indian automobile organizations and used for an empirical analysis of GSCM practices in the Indian automobile industry. Descriptive statistics have been used to establish the current implementation status of GSCM practices in the Indian automobile industry, and multiple regression analysis has been carried out to assess the impact of currently adopted GSCM practices on expected organizational performance outcomes. The results of the study suggest that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role in understanding various GSCM implementation issues and help practicing managers to improve their performances in the supply chain.

  19. Chinese Stock Index Futures Price Fluctuation Analysis and Prediction Based on Complementary Ensemble Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Ruoyang Chen

    2016-01-01

    Full Text Available Since the CSI 300 index futures officially began trading on April 15, 2010, analysis and prediction of the price fluctuations of Chinese stock index futures have become a popular area of active research. In this paper, the Complementary Ensemble Empirical Mode Decomposition (CEEMD) method is used to decompose the sequences of Chinese stock index futures prices into residue terms, low-frequency terms, and high-frequency terms to reveal the fluctuation characteristics of the sequences over different time scales. Then, the CEEMD method is combined with a Particle Swarm Optimization (PSO) algorithm-based Support Vector Machine (SVM) model to forecast Chinese stock index futures prices. The empirical results show that the residue term determines the long-term trend of stock index futures prices. The low-frequency term, which represents medium-term price fluctuations, is mainly affected by policy regulations under the analysis of the Iterated Cumulative Sums of Squares (ICSS) algorithm, whereas short-term market disequilibrium, which is represented by the high-frequency term, plays an important local role in stock index futures price fluctuations. In addition, in forecasting the daily or even intraday price data of Chinese stock index futures, the combination prediction model is superior to the single SVM model, which implies that the accuracy of predicting Chinese stock index futures prices will be improved by considering fluctuation characteristics over different time scales.
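
    The forecasting stage of such a hybrid can be sketched as: predict each decomposed component from its own lags with a support vector machine, then sum the component forecasts. The sketch below uses scikit-learn's SVR on fabricated components (the CEEMD output is assumed already available) and replaces the PSO hyperparameter search with a small grid search:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

def lagged(x, p=5):
    """Build a lag-p design matrix X and targets y from a 1-D series."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

t = np.arange(600)
components = [np.sin(t / 50.0),        # stand-in low-frequency term
              0.3 * np.sin(t / 5.0),   # stand-in high-frequency term
              0.001 * t]               # stand-in residue (trend)

forecast = 0.0
for comp in components:
    X, y = lagged(comp)
    model = GridSearchCV(SVR(kernel="rbf"),
                         {"C": [1, 10], "gamma": ["scale", 0.1]}, cv=3)
    model.fit(X[:-1], y[:-1])                 # hold out the final point
    forecast += model.predict(X[-1:])[0]      # one-step prediction, summed

print(f"combined one-step forecast: {forecast:.4f}")
```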

  20. Bayesian Nonparametric Clustering for Positive Definite Matrices.

    Science.gov (United States)

    Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2016-05-01

    Symmetric Positive Definite (SPD) matrices emerge as data descriptors in several applications of computer vision such as object tracking, texture recognition, and diffusion tensor imaging. Clustering these data matrices forms an integral part of these applications, for which soft-clustering algorithms (K-Means, expectation maximization, etc.) are generally used. As is well known, these algorithms need the number of clusters to be specified, which is difficult when the dataset scales. To address this issue, we resort to the classical nonparametric Bayesian framework by modeling the data as a mixture model using the Dirichlet process (DP) prior. Since these matrices do not conform to the Euclidean geometry, but rather belong to a curved Riemannian manifold, existing DP models cannot be directly applied. Thus, in this paper, we propose a novel DP mixture model framework for SPD matrices. Using the log-determinant divergence as the underlying dissimilarity measure to compare these matrices, and further using the connection between this measure and the Wishart distribution, we derive a novel DPM model based on the Wishart-Inverse-Wishart conjugate pair. We apply this model to several applications in computer vision. Our experiments demonstrate that our model is scalable to the dataset size and at the same time achieves superior accuracy compared to several state-of-the-art parametric and nonparametric clustering algorithms.
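
    The dissimilarity at the heart of the model can be computed in a few lines: the log-determinant (Burg) matrix divergence is D(X, Y) = tr(XY⁻¹) − log det(XY⁻¹) − n. A full DP-Wishart mixture sampler is well beyond a snippet, so the sketch below only illustrates the distance on toy covariance descriptors:

```python
import numpy as np

def logdet_divergence(X, Y):
    """Log-determinant (Burg) divergence between SPD matrices X and Y."""
    n = X.shape[0]
    M = X @ np.linalg.inv(Y)
    sign, logdet = np.linalg.slogdet(M)   # numerically stable log-determinant
    return np.trace(M) - logdet - n

rng = np.random.default_rng(5)
A = rng.normal(size=(4, 4)); X = A @ A.T + 4 * np.eye(4)   # SPD descriptor (toy)
B = rng.normal(size=(4, 4)); Y = B @ B.T + 4 * np.eye(4)

print(logdet_divergence(X, X))   # 0 for identical matrices
print(logdet_divergence(X, Y))   # > 0, and asymmetric in general
```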

  1. Partnership effectiveness in primary community care networks: A national empirical analysis of partners' coordination infrastructure designs.

    Science.gov (United States)

    Lin, Blossom Yen-Ju; Lin, Yung-Kai; Lin, Cheng-Chieh

    2010-01-01

    Previous empirical and managerial studies have ignored the effectiveness of integrated health networks. It has been argued that the varying definitions and strategic imperatives of integrated organizations may have complicated the assessment of the outcomes/performance of varying models, particularly when their market structures and contexts differed. This study aimed to empirically verify a theoretical perspective on the coordination infrastructure designs and the effectiveness of the primary community care networks (PCCNs) formed and funded by the Bureau of National Health Insurance since March 2003. The PCCNs present a model to replace the traditionally fragmented providers in Taiwan's health care. The study used a cross-sectional mailed survey designed to ascertain partnership coordination infrastructure and integration of governance, clinical care, bonding, finances, and information. The outcome indicators were PCCNs' perceived performance and willingness to remain within the network. Structural equation modeling examined the causal relationships, controlling for organizational and environmental factors. Primary data collection occurred from February through December 2005, via structured questionnaires sent to 172 PCCNs. Using the individual PCCN as the unit of analysis, the results found that a network's efforts regarding coordination infrastructures were positively related to the PCCN's perceived performance and willingness to remain within the network. In addition, PCCNs practicing in rural areas and in areas with a higher density of medical resources had better perceived effectiveness and willingness to cooperate in the network. Practical Implication: The lack of both an operational definition and information about system-wide integration may have obstructed understanding of integrated health networks' organizational dynamics. This study empirically examined individual PCCNs and offers new insights on how to improve networks' organizational design and

  2. Analyzing single-molecule time series via nonparametric Bayesian inference.

    Science.gov (United States)

    Hines, Keegan E; Bankston, John R; Aldrich, Richard W

    2015-02-03

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  3. An empirical study of Iranian regional airports using robust data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Amin Foroughi

    2010-07-01

    Full Text Available Data Envelopment Analysis (DEA) has been one of the most important tools for measuring the relative efficiency of similar units such as transportation systems using terminals, airports, etc. In this study, we perform an empirical analysis of Iranian airports based on DEA methods to measure the efficiencies of various airports. One of the primary issues with many traditional DEA methods is that the data are almost always contaminated with noise. We use a DEA method that can handle the uncertainty associated with input and output data. The results of this comprehensive study show that most of the active airlines are practically inefficient and that the government could significantly increase the efficiencies of the airports by setting new regulations and rules.

  4. Empirical analysis of Brazilian banks' capital buffers during the period 2001-2011

    Directory of Open Access Journals (Sweden)

    Vinícius Cintra Belém

    2016-04-01

    Full Text Available International literature indicates that the capital buffers held by banks result notably from the trade-off that exists between the cost of holding capital, adjustment costs, and bankruptcy costs, which all have a direct impact on banks' capital structures. The aim of this paper is to study the degree of sensitivity of Brazilian banks' capital buffers to the determining factors established in the literature, by using a sample of 121 banks, covering the period from 2001 to 2011. The empirical analysis that was carried out found that there was a significant cost of adjusting capital buffers for the Brazilian banks. At the same time, bankruptcy cost indicated a positive relationship between risk profile and capital buffers, while the cost of holding capital did not exhibit statistical significance in the analysis.

  5. An Empirical Study on the Impact of Automation on the Requirements Analysis Process

    Institute of Scientific and Technical Information of China (English)

    Giuseppe Lami; Robert W. Ferguson

    2007-01-01

    Requirements analysis is an important phase in a software project. The analysis is often performed in an informal way by specialists who review documents looking for ambiguities, technical inconsistencies and incomplete parts. Automation is still far from being applied in requirements analyses, above all since natural languages are informal and thus difficult to treat automatically. There are only a few tools that can analyse texts. One of them, called QuARS, was developed by the Istituto di Scienza e Tecnologie dell'Informazione and can analyse texts in terms of ambiguity. This paper describes how QuARS was used in a formal empirical experiment to assess the impact, in terms of effectiveness and efficacy, of automation in the requirements review process of a software company.

  6. Updating an empirical analysis on the proton's central opacity and asymptotia

    CERN Document Server

    Fagundes, D A; Silva, P V R G

    2015-01-01

    We present an updated empirical analysis of the ratio of the elastic (integrated) to the total cross section in the c.m. energy interval from 5 GeV to 8 TeV. As in a previous work, we use a suitable analytical parametrization for that ratio (depending on only four free fit parameters) and investigate three asymptotic scenarios: either the black disk limit or scenarios above or below that limit. The dataset now includes the datum at 7 TeV recently reported by the ATLAS Collaboration, which plays an important role in the data reductions. Our analysis favors, once more, a scenario below the black disk, now providing an asymptotic ratio consistent with the rational value 1/3, namely a gray disk limit. Upper bounds for the ratio of the diffractive (dissociative) to the inelastic cross section are also presented.

  7. An Empirical Analysis of Credit Risk Factors of the Slovenian Banking System

    Directory of Open Access Journals (Sweden)

    Boštjan Aver

    Full Text Available The study presents the results of an analysis of credit risk factors of the Slovenian banking system. The objective of the empirical analysis is to establish which macroeconomic factors influence the systematic credit risk of the Slovenian banking loan portfolio. The research results have confirmed the main hypothesis that certain macroeconomic factors have a major influence on the examined credit risk. We could conclude that the credit risk of the loan portfolio depends on the employment or unemployment rate in Slovenia, on short and long-term interest rates of Slovenian banks and the Bank of Slovenia, and on the value of the Slovenian stock exchange index. We cannot claim that the examined credit risk depends on the inflation rate in Slovenia, the growth of GDP (industrial production), EUR and USD exchange rates or the growth of Slovenian import and export.

  8. Acceptance and Commitment Therapy versus Traditional Cognitive Behavioral Therapy: A Systematic Review and Meta-analysis of Current Empirical Evidence

    OpenAIRE

    2012-01-01

    Controversy remains about the empirical status of acceptance and commitment therapy (ACT) and its presumably different characteristics relative to traditional cognitive behavioral therapy (CBT). The current study aims to shed some light in this respect by conducting a systematic review and meta-analysis of the studies that have empirically compared ACT versus CBT. Sixteen studies comparing differential outcomes (N= 954) of ACT versus CBT in diverse problems were identified following several s...

  9. Nonparametric estimation of population density for line transect sampling using FOURIER series

    Science.gov (United States)

    Crain, B.R.; Burnham, K.P.; Anderson, D.R.; Lake, J.L.

    1979-01-01

    A nonparametric, robust density estimation method is explored for the analysis of right-angle distances from a transect line to the objects sighted. The method is based on the FOURIER series expansion of a probability density function over an interval. With only mild assumptions, a general population density estimator of wide applicability is obtained.
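
    A minimal sketch of the Fourier-series estimator in one common form: with sighting distances x_i truncated at w, the cosine coefficients are a_k = (2/(n*w)) * sum_i cos(k*pi*x_i/w), the density of distances at zero is f(0) ≈ 1/w + sum_{k<=m} a_k, and population density follows as D = n*f(0)/(2L). The distances and transect length below are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.abs(rng.normal(0.0, 20.0, size=200))  # fake right-angle distances (m)
    w = x.max()                                  # truncation distance
    n, L, m = len(x), 5000.0, 3                  # sample size, transect length, terms

    a = [2.0 / (n * w) * np.cos(k * np.pi * x / w).sum() for k in range(1, m + 1)]
    f0 = 1.0 / w + sum(a)          # estimated pdf of distances at x = 0
    D = n * f0 / (2.0 * L)         # objects per unit area
    print(f0, D)
    ```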

  10. A non-parametric peak finder algorithm and its application in searches for new physics

    CERN Document Server

    Chekanov, S

    2011-01-01

    We have developed an algorithm for non-parametric fitting and extraction of statistically significant peaks in the presence of statistical and systematic uncertainties. Applications of this algorithm for analysis of high-energy collision data are discussed. In particular, we illustrate how to use this algorithm in general searches for new physics in invariant-mass spectra using pp Monte Carlo simulations.

  11. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  12. Biological parametric mapping with robust and non-parametric statistics.

    Science.gov (United States)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M; Landman, Bennett A

    2011-07-15

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, regions of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric mapping approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and non-parametric regression in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provide a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities. Copyright © 2011 Elsevier Inc. All rights reserved.
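
    A minimal sketch of the kind of outlier-resistant voxelwise regression the abstract describes, using Huber M-estimation from statsmodels in place of OLS within a general linear model; the data here are simulated, not neuroimaging measurements.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.normal(size=100)              # e.g., a structural covariate at one voxel
    y = 0.5 * x + rng.normal(scale=0.3, size=100)
    y[:5] += 8.0                          # a few gross outliers (mis-registration)

    X = sm.add_constant(x)
    ols = sm.OLS(y, X).fit()              # ordinary least squares, pulled by outliers
    rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # robust M-estimation
    print(ols.params, rlm.params)         # the robust slope stays near 0.5
    ```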

  13. Pivotal Estimation of Nonparametric Functions via Square-root Lasso

    CERN Document Server

    Belloni, Alexandre; Wang, Lie

    2011-01-01

    In a nonparametric linear regression model we study a variant of LASSO, called square-root LASSO, which does not require knowledge of the scaling parameter $\sigma$ of the noise or bounds for it. This work derives new finite sample upper bounds for the prediction norm rate of convergence, $\ell_1$-rate of convergence, $\ell_\infty$-rate of convergence, and sparsity of the square-root LASSO estimator. A lower bound for the prediction norm rate of convergence is also established. In many non-Gaussian noise cases, we rely on moderate deviation theory for self-normalized sums and on new data-dependent empirical process inequalities to achieve Gaussian-like results provided log p = o(n^{1/3}), improving upon results derived in the parametric case that required log p = O(log n). In addition, we derive finite sample bounds on the performance of ordinary least squares (OLS) applied to the model selected by square-root LASSO, accounting for possible misspecification of the selected model. In particular, we provide mild con...
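
    A minimal sketch of the square-root LASSO objective with cvxpy: minimise ||y - Xb||_2 / sqrt(n) + lam * ||b||_1. Unlike the ordinary LASSO, the penalty level can be chosen without knowing the noise scale; the pivotal choice of lam below is only indicative, and the data are simulated.

    ```python
    import numpy as np
    import cvxpy as cp
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    n, p = 100, 50
    X = rng.normal(size=(n, p))
    beta_true = np.zeros(p)
    beta_true[:3] = [2.0, -1.5, 1.0]
    y = X @ beta_true + rng.normal(size=n)

    alpha = 0.05
    lam = 1.1 * norm.ppf(1 - alpha / (2 * p)) / np.sqrt(n)  # scale-free penalty

    b = cp.Variable(p)
    obj = cp.norm(y - X @ b, 2) / np.sqrt(n) + lam * cp.norm1(b)
    cp.Problem(cp.Minimize(obj)).solve()   # convex (second-order cone) problem
    print(np.round(b.value, 2)[:5])        # nonzero entries track the true support
    ```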

  14. Empirical decomposition method for modeless component and its application to VIV analysis

    Directory of Open Access Journals (Sweden)

    Chen Zheng-Shou

    2015-06-01

    Full Text Available Aiming at accurately distinguishing the modeless component and the natural vibration mode terms in data series from nonlinear and non-stationary processes, such as Vortex-Induced Vibration (VIV), a new empirical mode decomposition method has been developed in this paper. The key innovation of this technique concerns the method of decomposing the modeless component from a non-stationary process, characterized by a predetermined ‘maximum intrinsic time window’ and a cubic spline. The introduction of the conceptual modeless component eliminates the requirement of using spurious harmonics to represent nonlinear and non-stationary signals, and thus makes subsequent modal identification more accurate and meaningful. It neither weakens the vibration power of the natural modes nor inflates the spurious energy of the modeless component. The scale of the maximum intrinsic time window has been well designed, avoiding energy aliasing in data processing. Finally, the method has been applied to analyze data series from vortex-induced vibration processes. Taking advantage of this newly introduced empirical decomposition method and mode identification technique, the vibration analysis of vortex-induced vibration becomes more meaningful.

  15. Empirical Risk Analysis of Severe Reactor Accidents in Nuclear Power Plants after Fukushima

    Directory of Open Access Journals (Sweden)

    Jan Christian Kaiser

    2012-01-01

    Full Text Available Many countries are reexamining the risks connected with nuclear power generation after the Fukushima accidents. To provide updated information for the corresponding discussion, a simple empirical approach is applied for risk quantification of severe reactor accidents with International Nuclear and Radiological Event Scale (INES) level ≥5. The analysis is based on worldwide data of commercial nuclear facilities. An empirical hazard of 21 (95% confidence interval (CI) 4; 62) severe accidents among the world's reactors in 100,000 years of operation has been estimated. This result is compatible with the frequency estimate of a probabilistic safety assessment for a typical pressurised power reactor in Germany. It is used in scenario calculations concerning the development in numbers of reactors in the next twenty years. For the base scenario with constant reactor numbers, the time to the next accident among the world's 441 reactors, which were connected to the grid in 2010, is estimated at 11 (95% CI 3.7; 52) years. In two other scenarios a moderate increase or decrease in reactor numbers has negligible influence on the results. The time to the next accident can be extended well above the lifetime of reactors by retiring a sizeable number of less secure ones and by safety improvements for the rest.
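
    The waiting-time arithmetic behind the base scenario follows from a Poisson model of accident occurrence: with the quoted hazard of 21 severe accidents per 100,000 reactor-years and 441 reactors running, the expected time to the next INES ≥5 event is the reciprocal of the fleet-wide rate.

    ```python
    hazard = 21 / 100_000          # accidents per reactor-year (point estimate)
    reactors = 441                 # reactors connected to the grid in 2010
    expected_wait = 1 / (hazard * reactors)
    print(expected_wait)           # about 10.8 years, matching the ~11 reported
    ```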

  16. Time dependent intrinsic correlation analysis of temperature and dissolved oxygen time series using empirical mode decomposition

    CERN Document Server

    Huang, Y X

    2014-01-01

    In the marine environment, many fields have fluctuations over a large range of different spatial and temporal scales. These quantities can be nonlinear and non-stationary, and often interact with each other. A good method to study the multiple scale dynamics of such time series, and their correlations, is needed. In this paper an application of an empirical mode decomposition based time dependent intrinsic correlation, of two coastal oceanic time series, temperature and dissolved oxygen (saturation percentage), is presented. The two time series are recorded every 20 minutes for 7 years, from 2004 to 2011. The application of the Empirical Mode Decomposition on such time series is illustrated, and the power spectra of the time series are estimated using the Hilbert transform (Hilbert spectral analysis). Power-law regimes are found with slopes of 1.33 for dissolved oxygen and 1.68 for temperature at high frequencies (between 1.2 and 12 hours), with both close to 1.9 for lower frequencies (t...

  17. Empirical analysis of university-industry R&D collaboration: Evidence from Pakistan

    Directory of Open Access Journals (Sweden)

    Saqib Mehmood Afzal

    2014-08-01

    Full Text Available University-industry collaboration plays a vital role in a nation's innovation system. This study presents an empirical analysis of R&D collaboration between university and industry. The literature focuses on the factors, including firm size, the firm's innovation activity and openness of the firm, that affect university-industry collaboration. Primary data are used, and the sample contains 15 industrial sectors of Pakistan ranked by market capitalization at the time of data collection. The empirical results of the study suggest that firm size, number of employees and openness of the firm have a positive impact on university-industry collaboration for R&D projects, whilst the annual budget of the firms is found to have a negative relationship with R&D collaborations. Larger firms are found to be less efficient in taking advantage of R&D collaboration with universities because they spend more on their fixed costs, having participated in many R&D activities.

  18. Geostatistical analysis of disease data: estimation of cancer mortality risk from empirical frequencies using Poisson kriging

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2005-12-01

    Full Text Available Abstract Background Cancer mortality maps are used by public health officials to identify areas of excess and to guide surveillance and control activities. Quality of decision-making thus relies on an accurate quantification of risks from observed rates, which can be very unreliable when computed from sparsely populated geographical units or recorded for minority populations. This paper presents a geostatistical methodology that accounts for spatially varying population sizes and spatial patterns in the processing of cancer mortality data. Simulation studies are conducted to compare the performances of Poisson kriging to a few simple smoothers (i.e., population-weighted estimators and empirical Bayes smoothers) under different scenarios for the disease frequency, the population size, and the spatial pattern of risk. A public-domain executable with example datasets is provided. Results The analysis of age-adjusted mortality rates for breast and cervix cancers illustrated some key features of commonly used smoothing techniques. Because of the small weight assigned to the rate observed over the entity being smoothed (kernel weight), the population-weighted average leads to risk maps that show little variability. Other techniques assign larger and similar kernel weights but they use a different piece of auxiliary information in the prediction: global or local means for global or local empirical Bayes smoothers, and spatial combination of surrounding rates for the geostatistical estimator. Simulation studies indicated that Poisson kriging outperforms other approaches for most scenarios, with a clear benefit when the risk values are spatially correlated. Global empirical Bayes smoothers provide more accurate predictions under the least frequent scenario of spatially random risk. Conclusion The approach presented in this paper enables researchers to incorporate the pattern of spatial dependence of mortality rates into the mapping of risk values and the
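
    A minimal sketch of the global empirical Bayes smoother used as a comparator: a method-of-moments shrinkage of area rates toward the global mean, with weights that grow with population size (Poisson kriging itself additionally pools spatially correlated neighbours, which is not shown). The counts and populations below are made up.

    ```python
    import numpy as np

    d = np.array([2, 0, 15, 7, 1])               # deaths per area
    n = np.array([500, 300, 20000, 9000, 800])   # person-years at risk
    r = d / n                                    # raw area rates

    m = d.sum() / n.sum()                        # global mean rate
    s2 = np.sum(n * (r - m) ** 2) / n.sum()      # weighted between-area variance
    A = max(s2 - m / n.mean(), 0.0)              # moment estimate of signal variance
    w = A / (A + m / n)                          # kernel weight grows with population
    smoothed = w * r + (1 - w) * m               # shrink unreliable rates toward m
    print(np.round(smoothed, 5))
    ```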

  19. Analyzing multiple spike trains with nonparametric Granger causality.

    Science.gov (United States)

    Nedungadi, Aatira G; Rangarajan, Govindan; Jain, Neeraj; Ding, Mingzhou

    2009-08-01

    Simultaneous recordings of spike trains from multiple single neurons are becoming commonplace. Understanding the interaction patterns among these spike trains remains a key research area. A question of interest is the evaluation of information flow between neurons through the analysis of whether one spike train exerts causal influence on another. For continuous-valued time series data, Granger causality has proven an effective method for this purpose. However, the basis for Granger causality estimation is autoregressive data modeling, which is not directly applicable to spike trains. Various filtering options distort the properties of spike trains as point processes. Here we propose a new nonparametric approach to estimate Granger causality directly from the Fourier transforms of spike train data. We validate the method on synthetic spike trains generated by model networks of neurons with known connectivity patterns and then apply it to neurons simultaneously recorded from the thalamus and the primary somatosensory cortex of a squirrel monkey undergoing tactile stimulation.

  20. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  1. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes

    Science.gov (United States)

    García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working directly in a function-space view and thus the inference takes place directly in this space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economy and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.
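
    Not the paper's sparse Gaussian-process method, but a simple binned (Kramers-Moyal) estimate of the same targets: the drift a(x) and squared diffusion b²(x) of dX = a(X)dt + b(X)dW, recovered from conditional moments of the increments of a densely sampled path. The path below is a simulated Ornstein-Uhlenbeck process.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    dt, N = 1e-3, 200_000
    x = np.empty(N)
    x[0] = 0.0
    for t in range(N - 1):                        # OU: a(x) = -x, b(x) = 0.5
        x[t + 1] = x[t] - 1.0 * x[t] * dt + 0.5 * np.sqrt(dt) * rng.normal()

    dx = np.diff(x)
    edges = np.linspace(x.min(), x.max(), 30)
    idx = np.digitize(x[:-1], edges)
    centers, drift, diff2 = [], [], []
    for b in range(1, len(edges)):
        sel = idx == b
        if sel.sum() > 200:                       # skip sparsely visited bins
            centers.append(0.5 * (edges[b - 1] + edges[b]))
            drift.append(dx[sel].mean() / dt)     # should track a(x) = -x
            diff2.append((dx[sel] ** 2).mean() / dt)  # should be near 0.5**2
    print(centers[0], drift[0], np.mean(diff2))
    ```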

  2. Establishment of Grain Farmers’ Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Based on farmers’ supply behavior theory and price expectations theory, this paper establishes a grain farmers’ supply response model for two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy has a price-oriented effect on grain production and supply in the major producing areas. Empirical analysis shows that the minimum purchase price published annually by the government has a significant positive impact on farmers’ grain supply in the major grain producing areas. In recent years, China has steadily raised the level of the minimum grain purchase price, which has played an important role in effectively protecting grain farmers’ interests, mobilizing the enthusiasm of farmers’ grain production, and ensuring the market supply of key grain varieties.

  3. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    Directory of Open Access Journals (Sweden)

    Yunqiu Liang

    2013-04-01

    Full Text Available This study empirically examines the relationship between college teachers' sports participation and their job satisfaction, comparing groups by their participation in sports activities and analyzing the survey data statistically in SPSS. Results show that teachers who participate in sports report higher job satisfaction than those who do not, and that job satisfaction differs with the form of participation. It is recommended that college teachers choose forms of participation suited to their lives and personal circumstances, to improve psychological and physiological health, adjust their mood state in a timely way, approach work in a positive psychological state, and achieve high job satisfaction. Organizations, according to their occupational characteristics and available resources, should actively guide their members to form more scientific and reasonable habits and ways of life, fully mobilize enthusiasm for participating in fitness activities, and create a stronger fitness culture, in order to improve internal cohesion and job satisfaction.

  4. Empirical Analysis of the Role of Urbanization in Driving the Growth of Rural Residents’ Consumption

    Institute of Scientific and Technical Information of China (English)

    Xiaofang; ZOU

    2015-01-01

    Urbanization is a powerful engine for the growth of rural residents’ consumption in China. This paper selects panel data covering 31 provinces (municipalities) in China during 2005-2012, and builds a panel data model of the influence of urbanization on rural residents’ consumption in China for empirical analysis. The results show that there is a significant positive correlation between urbanization and rural residents’ consumption level. In terms of mechanism, urbanization drives the growth of rural residents’ consumption by improving rural residents’ income level and changing their consumption concepts. However, the uncertainty of rural residents’ income inhibits the growth of their consumption. Therefore, it is necessary to accelerate the development of urbanization, broaden farmers’ income channels, improve the consumer environment and accelerate the reform of the household registration system to further activate the rural consumer market.

  5. Empirical analysis of web-based user-object bipartite networks

    CERN Document Server

    Shang, Mingsheng; Zhang, Yi-Cheng; Zhou, Tao

    2009-01-01

    Understanding the structure and evolution of web-based user-object networks is a significant task since they play a crucial role in e-commerce nowadays. This Letter reports an empirical analysis of two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named the collaborative clustering coefficient, to quantify clustering behavior based on collaborative selection. Accordingly, the clustering properties and clustering-degree correlations are investigated. We report some novel phenomena well characterizing the selection mechanism of web users and outline the relevance of these phenomena to the information recommendation problem.

  6. Local topological charge analysis of electromagnetic vortex beam based on empirical mode decomposition

    CERN Document Server

    Hui, Xiaonan; Zhang, Weite; Jin, Xiaofeng; Chi, Hao; Zhang, Xianmin

    2015-01-01

    The topological charge of an electromagnetic vortex beam depends on its wavefront helicity. For mixed vortex beams composed of several different coaxial vortices, the topological charge spectrum can be obtained by Fourier transform. However, vortex beams are generally divergent and imperfect, which makes it important to investigate the local topological charges, especially in the radio frequency regime. Fourier transform based methods are constrained by the uncertainty principle and cannot achieve high angular resolution and mode resolution simultaneously. In this letter, an analysis method for the local topological charges of vortex beams is presented, based on the empirical mode decomposition (EMD). From EMD, the intrinsic mode functions (IMFs) can be obtained to construct the bases of the electromagnetic wave, and each local topological charge can be defined respectively. With this method the local value achieves high resolution in both azimuth angle and topological charge, meanwhile the amplitudes of each OAM mode...

  7. Windfall profit in portfolio diversification? An empirical analysis of the potential benefits of renewable energy investments

    Energy Technology Data Exchange (ETDEWEB)

    Bruns, Frederik

    2013-05-01

    Modern Portfolio Theory is a theory which was introduced by Markowitz, and which suggests the building of a portfolio with assets that have low or, in the best case, negative correlation. In times of financial crises, however, the positive diversification effect of a portfolio can fail when Traditional Assets are highly correlated. Therefore, many investors search for Alternative Asset classes, such as Renewable Energies, that tend to perform independently from capital market performance. 'Windfall Profit in Portfolio Diversification?' discusses the potential role of Renewable Energy investments in an institutional investor's portfolio by applying the main concepts from Modern Portfolio Theory. Thereby, the empirical analysis uses a unique data set from one of the largest institutional investors in the field of Renewable Energies, including several wind and solar parks. The study received the Science Award 2012 of the German Alternative Investments Association ('Bundesverband Alternative Investments e.V.').

  8. Economic growth, energy demand and the environment. Empirical insights using time series and decomposition analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, Dirk C.

    2011-07-01

    Industrialization and increasing mobility in developing countries like China and India are resulting in growing energy demand. This hunger for energy is largely satisfied by fossil fuels and thus accompanied by rising emissions. This book aims at empirically giving insights about the relationship between energy consumption, economic growth and CO2 emissions using recent panel cointegration and decomposition methods. The investigation is carried out for the top energy consumers and CO2 emitters worldwide with a special emphasis on the European Union and some focus countries for the detailed analysis of the industry and transport sector. The results confirm the need for a more sustainable energy system by implementing measures of energy efficiency and reducing carbon intensity of energy supply by shifting from fossil fuels to renewable energy sources. (orig.)

  9. Empirical analysis of collective human behavior for extraordinary events in blogosphere

    CERN Document Server

    Sano, Yukie; Watanabe, Hayafumi; Takayasu, Hideki; Takayasu, Misako

    2011-01-01

    To explain collective human behavior in the blogosphere, we survey more than 1.8 billion entries and observe statistical properties of word appearance. We first estimate the basic properties of the number fluctuations of ordinary words that appear almost uniformly. Then, we focus on words that show dynamic growth with a tendency to diverge on a certain day, and on news words (typical keywords for natural disasters) that grow suddenly with the occurrence of events and decay gradually with time. In both cases, the functional forms of growth and decay are generally approximated by power laws with exponents around -1 for a period of about 80 days. Our empirical analysis can be applied to the prediction of word frequency in the blogosphere.

  10. Empirical analysis of retirement pension and IFRS adoption effects on accounting information: glance at IT industry.

    Science.gov (United States)

    Kim, JeongYeon

    2014-01-01

    This study reviews the new pension accounting under K-IFRS and provides empirical evidence of changes in liability for retirement allowances with the adoption of K-IFRS. It helps in understanding the effect of pension accounting on an individual firm's financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to their previous financial reports not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in big changes in retirement-related liability. Firms within the IT industry also show similar behavior, which means that additional financial regulations for pension accounting are recommended.

  11. Empirical Comparison of Visualization Tools for Larger-Scale Network Analysis

    Directory of Open Access Journals (Sweden)

    Georgios A. Pavlopoulos

    2017-01-01

    Full Text Available Gene expression, signal transduction, protein/chemical interactions, biomedical literature co-occurrences, and other concepts are often captured in biological network representations, where nodes represent a certain bioentity and edges the connections between them. While many tools to manipulate, visualize, and interactively explore such networks already exist, only a few of them can scale up and follow today’s indisputable information growth. In this review, we shortly list a catalog of available network visualization tools and, from a user-experience point of view, we identify four candidate tools suitable for larger-scale network analysis, visualization, and exploration. We comment on their strengths and their weaknesses and empirically discuss their scalability, user friendliness, and post-visualization capabilities.

  12. Condition monitoring of a wind turbine gearbox using the empirical mode decomposition method and outlier analysis

    Energy Technology Data Exchange (ETDEWEB)

    Antoniadou, Ifigeneia; Manson, G.; Dervilis, N.; Worden, K. [Sheffield Univ. (United Kingdom); Barszcz, T.; Staszewski, W. [AGH Univ. of Science and Technology, Krakow (Poland)

    2012-07-01

    Wind turbines are subject to variable aerodynamic loads and extreme environmental conditions. Wind turbine components fail frequently, resulting in high maintenance costs. For this reason, gearbox condition monitoring becomes important since gearboxes are among the wind turbine components with the most frequent failure observations. The major challenge here is the detection of faults under the time varying operating conditions prevailing in wind turbine systems. This paper analyses wind turbine gearbox vibration data using the empirical mode decomposition method and the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. The instantaneous characteristics of the signals are obtained with the application of the Hilbert transform. The lowest level of fault detection, the threshold value, is considered and Mahalanobis squared-distance is calculated for the novelty detection problem. (orig.)
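
    A minimal sketch of the outlier-analysis step the abstract describes: feature vectors from healthy gearbox data define a mean and covariance, and the Mahalanobis squared distance of new feature vectors is compared with a threshold learned from the healthy set. The feature extraction itself (EMD plus Hilbert transform) is not shown, and the features below are simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    healthy = rng.normal(size=(500, 3))          # training features, one row per window
    mu = healthy.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(healthy, rowvar=False))

    def mahalanobis_sq(x):
        d = x - mu
        return d @ cov_inv @ d

    train_d2 = np.array([mahalanobis_sq(f) for f in healthy])
    threshold = np.quantile(train_d2, 0.99)      # novelty threshold from healthy data

    test = rng.normal(size=3) + np.array([4.0, 0.0, 0.0])  # a shifted, "faulty" window
    print(mahalanobis_sq(test) > threshold)      # True: flagged as novel/damaged
    ```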

  13. Comparative Analysis of Empirical Path Loss Model for Cellular Transmission in Rivers State

    Directory of Open Access Journals (Sweden)

    B.O.H Akinwole, Biebuma J.J

    2013-08-01

    Full Text Available This paper presents a comparative analysis of three empirical path loss models with measured data for urban, suburban, and rural areas in Rivers State. The three models investigated were the COST 231 Hata, SUI, and ECC-33 models. Downlink data were collected at an operating frequency of 2100 MHz using a drive test procedure, consisting of test mobile phones used to determine the received signal power (RSCP) at specified receiver distances from Globacom Node Bs located at several sites in the State. This test was carried out to investigate the effectiveness of the commonly used existing models for cellular transmission. The results were analysed based on Mean Square Error (MSE) and Standard Deviation (SD) and were simulated in MATLAB (7.5.0). The results show that the COST 231 Hata model gives better predictions and is therefore recommended for path loss predictions in Rivers State.
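
    For reference, a sketch of the COST 231 Hata median path loss in its urban form, one of the three models compared; the mobile-antenna correction below is the small/medium-city variant. Note the model is nominally specified for 1500-2000 MHz, so applying it at 2100 MHz, as in the study, is already an extrapolation.

    ```python
    import math

    def cost231_hata(f_mhz, d_km, h_base, h_mobile, metropolitan=False):
        """Median path loss (dB) for carrier f_mhz, distance d_km,
        base antenna height h_base (m), mobile height h_mobile (m)."""
        a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile \
             - (1.56 * math.log10(f_mhz) - 0.8)        # small/medium-city term
        C = 3.0 if metropolitan else 0.0
        return (46.3 + 33.9 * math.log10(f_mhz)
                - 13.82 * math.log10(h_base) - a_hm
                + (44.9 - 6.55 * math.log10(h_base)) * math.log10(d_km) + C)

    print(cost231_hata(f_mhz=2100, d_km=1.0, h_base=30, h_mobile=1.5))
    ```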

  14. Empirical Analysis on the Determinants of Economic Growth in Shaanxi Province, China

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The status of economic development in Shaanxi Province is analyzed, showing that Shaanxi Province has achieved fast and stable economic growth, and that total GDP and fixed assets investment have shown sustainable growth. Using the time series statistics of Shaanxi Province for the years 1978-2008, a Cobb-Douglas function is used to carry out an empirical analysis of the contribution of fixed assets investment and labor input to the economic growth of Shaanxi Province, China. Results show that capital and labor input are the major driving forces of the economic growth of Shaanxi Province; in other words, the economic growth mode of Shaanxi Province is still extensive. Economic growth of Shaanxi Province is increasingly dependent on capital investment and technological progress. Contribution rates of capital and labor to economic growth are 66.9% and 33.1%, respectively. Therefore, investment has been a source of economic growth in Shaanxi Province through the reform and opening up of the last three decades.
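
    A minimal sketch of the regression behind such growth-accounting studies: taking logs of Y = A·K^α·L^β gives a linear model whose coefficients are the output elasticities of capital and labor. The series below are simulated stand-ins for the 1978-2008 provincial data, not the study's figures.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    T = 31                                           # 1978-2008, annual
    lnK = np.cumsum(rng.normal(0.08, 0.02, T))       # fake log capital stock
    lnL = np.cumsum(rng.normal(0.02, 0.01, T))       # fake log labor input
    lnY = 0.3 + 0.67 * lnK + 0.33 * lnL + rng.normal(0, 0.02, T)

    X = np.column_stack([np.ones(T), lnK, lnL])
    coef, *_ = np.linalg.lstsq(X, lnY, rcond=None)   # OLS on the log-linear form
    lnA, alpha, beta = coef
    print(alpha, beta)   # elasticities; cf. the ~0.67/0.33 split reported above
    ```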

  15. Empirical Analysis of Retirement Pension and IFRS Adoption Effects on Accounting Information: Glance at IT Industry

    Directory of Open Access Journals (Sweden)

    JeongYeon Kim

    2014-01-01

    Full Text Available This study reviews the new pension accounting under K-IFRS and provides empirical evidence of changes in liability for retirement allowances with the adoption of K-IFRS. It helps in understanding the effect of pension accounting on an individual firm’s financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to their previous financial reports not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in big changes in retirement-related liability. Firms within the IT industry also show similar behavior, which means that additional financial regulations for pension accounting are recommended.

  16. An empirical analysis to study the cyclical trends on stock exchange using wavelet methods

    Directory of Open Access Journals (Sweden)

    Shapour Mohammadi

    2011-01-01

    Full Text Available During the past few decades, much evidence has accumulated to suggest that stock markets around the world follow cyclical trends. In this paper, we study these cyclical trends using wavelet functions over various time windows on some major stock market indices. We use two wavelet families, Daubechies and reverse biorthogonal, and determine the optimal settings for both. The results are applied to the Tehran Stock Exchange, using the most recent ten years of daily data as an empirical study. The details of our analysis of the TEDPIX index for the last decade indicate that there are at least four cyclical trends, weekly, monthly, quarterly and yearly, and the cycles would be expected to repeat in the future.
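
    A minimal sketch of extracting cyclical components from an index series with PyWavelets, using the two wavelet families named in the abstract (Daubechies and reverse biorthogonal); the series here is synthetic, with planted weekly-to-yearly cycles.

    ```python
    import numpy as np
    import pywt

    t = np.arange(2500)                      # roughly ten years of daily observations
    index = (np.sin(2 * np.pi * t / 5) + np.sin(2 * np.pi * t / 21)
             + np.sin(2 * np.pi * t / 63) + np.sin(2 * np.pi * t / 250))

    for wavelet in ("db4", "rbio3.1"):       # Daubechies and reverse biorthogonal
        coeffs = pywt.wavedec(index, wavelet, level=6)
        energies = [np.sum(c ** 2) for c in coeffs[1:]]   # per-detail-level energy
        print(wavelet, int(np.argmax(energies)))  # detail level with dominant cycle
    ```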

  17. Regional Informatization and Economic Growth in Japan: An Empirical Study Based on Spatial Econometric Analysis

    Directory of Open Access Journals (Sweden)

    Chuan Sun

    2014-10-01

    Full Text Available Research opinion on informatization is divided between two opposite poles—that it promotes or inhibits the spillover of regional economies. These conflicting viewpoints are called “the paradoxical geographies of the digital economy”. Information-based investment and diffusion of informatization contribute to breaking the economic space constraints caused by distance, leading to interregional spillover effects, according to the results of the Durbin model of spatial lag applied to Japanese regional data. Clearly, the local direct effects and the perimeter region’s indirect effects of informatization are both positive. This proves the existence of network externality, which causes increasing returns to scale. Extensive diffusion of information technology plays a significant role in the process, in addition to rapid accumulation and infiltration of information resources, which strengthens the information-based investment spillover effect. In this empirical analysis, evidence seems to support the view that informatization promotes economic development in Japan.

  18. An Empirical Analysis of Economic and Socio-Demographic Determinants of Entrepreneurship Across German Regions

    Directory of Open Access Journals (Sweden)

    Mrożewski Matthias

    2014-11-01

    Full Text Available Entrepreneurship is fundamental for a country's economic development through its positive effect on innovation, productivity growth, and job creation. In entrepreneurial research, one of the most important problems is to define the factors that actually determine entrepreneurial action. This study analyzes that question in the case of Germany by taking an aggregated approach that focuses on socio-demographic and economic determinants of regional entrepreneurship. Based on a literature review of German and international regional-level research, six hypotheses are developed and empirically tested using the most recent available data on 385 German regions as units of analysis. The results are surprising. In the case of household income, unemployment, education and marital status the relationship is significant but contrary to earlier research. Only regional age structure seems to be a stable predictor of regional entrepreneurship. The results indicate that in recent years there was a major shift in the determinants and characteristics of entrepreneurship in Germany.

  19. Competencies in Higher Education System: an Empirical Analysis of Employers' Perceptions

    Directory of Open Access Journals (Sweden)

    Adela Deaconu

    2014-08-01

    Full Text Available This study offers insight into the European Qualifications Framework (EQF), as agreed and detailed by the Romanian qualifications framework, applied to the economic sector. By means of a survey conducted on 92 employing companies, it validates the importance of competencies for the Romanian labour market and employers' degree of satisfaction with the competencies of business graduates. In terms of typology, employers attach more importance to transversal competencies than to professional competencies, both at conceptual level and as degree of acquirement following higher education. The empirical analysis provides data on employers' ranking of transversal and professional competencies and identifies the classes of competencies deemed to be in short supply on the labour market. Through its results, the study enhances the relationship between the higher education system and the labour market, providing key information for an efficient implementation of the competence-based education system.

  20. A Meta-Analysis of Empirically Tested School-Based Dating Violence Prevention Programs

    Directory of Open Access Journals (Sweden)

    Sarah R. Edwards

    2014-05-01

    Full Text Available Teen dating violence prevention programs implemented in schools and empirically tested were subjected to meta-analysis. Eight studies met criteria for inclusion, consisting of both within and between designs. Overall, the weighted mean effect size (ES) across studies was significant, ESr = .11; 95% confidence interval (CI) = [.08, .15], p < .0001, showing an overall positive effect of the studied prevention programs. However, 25% of the studies showed an effect in the negative direction, meaning students appeared to be more supportive of dating violence after participating in a dating violence prevention program. This heightens the need for thorough program evaluation as well as the need for decision makers to have access to data about the effectiveness of programs they are considering implementing. Further implications of the results and recommendations for future research are discussed.

  1. Spectral Analysis of Surface Wave for Empirical Elastic Design of Anchored Foundations

    Directory of Open Access Journals (Sweden)

    S. E. Chen

    2012-01-01

    Full Text Available Helical anchors are vital support components for power transmission lines. Failure of a single anchor can lead to the loss of an entire transmission line structure, which results in the loss of power for the downstream community. Despite their importance, it is not practical to use the conventional borehole method of subsurface exploration, which is labor intensive and costly, for estimating soil properties and anchor holding capacity. This paper describes the use of an empirical, elasticity-based design technique coupled with the spectral analysis of surface waves (SASW) technique to provide subsurface information for anchor foundation designs. Based on small-strain wave propagation, SASW determines the shear wave velocity profile, which is then correlated to anchor holding capacity. A pilot project involving over 400 anchor installations has been performed and demonstrated that the technique is reliable and can be implemented in transmission line structure designs.

  2. Empirical Analysis of Retirement Pension and IFRS Adoption Effects on Accounting Information: Glance at IT Industry

    Science.gov (United States)

    2014-01-01

    This study reviews the new pension accounting under K-IFRS and provides empirical evidence of changes in liability for retirement allowances with the adoption of K-IFRS. It helps in understanding the effect of pension accounting on an individual firm's financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to their previous financial reports not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in big changes in retirement-related liability. Firms within the IT industry also show similar behavior, which means that additional financial regulations for pension accounting are recommended. PMID:25013868

  3. Nonparametric dark energy reconstruction from supernova data.

    Science.gov (United States)

    Holsclaw, Tracy; Alam, Ujjaini; Sansó, Bruno; Lee, Herbert; Heitmann, Katrin; Habib, Salman; Higdon, David

    2010-12-10

    Understanding the origin of the accelerated expansion of the Universe poses one of the greatest challenges in physics today. Lacking a compelling fundamental theory to test, observational efforts are targeted at a better characterization of the underlying cause. If a new form of mass-energy, dark energy, is driving the acceleration, the redshift evolution of the equation of state parameter w(z) will hold essential clues as to its origin. To best exploit data from observations it is necessary to develop a robust and accurate reconstruction approach, with controlled errors, for w(z). We introduce a new, nonparametric method for solving the associated statistical inverse problem based on Gaussian process modeling and Markov chain Monte Carlo sampling. Applying this method to recent supernova measurements, we reconstruct the continuous history of w out to redshift z=1.5.

  4. Nonparametric k-nearest-neighbor entropy estimator.

    Science.gov (United States)

    Lombardi, Damiano; Pant, Sanjay

    2016-01-01

    A nonparametric k-nearest-neighbor-based entropy estimator is proposed. It improves on the classical Kozachenko-Leonenko estimator by considering nonuniform probability densities in the region of k-nearest neighbors around each sample point. It aims to improve the classical estimators in three situations: first, when the dimensionality of the random variable is large; second, when near-functional relationships leading to high correlation between components of the random variable are present; and third, when the marginal variances of random variable components vary significantly with respect to each other. Heuristics on the error of the proposed and classical estimators are presented. Finally, the proposed estimator is tested for a variety of distributions in successively increasing dimensions and in the presence of a near-functional relationship. Its performance is compared with a classical estimator, and a significant improvement is demonstrated.
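
    A minimal sketch of the classical Kozachenko-Leonenko k-NN entropy estimator that the paper improves upon: H ≈ ψ(N) − ψ(k) + log(V_d) + (d/N) Σ_i log(r_i), with r_i the distance from sample i to its k-th nearest neighbour and V_d the volume of the d-dimensional unit ball.

    ```python
    import numpy as np
    from scipy.special import digamma, gammaln
    from scipy.spatial import cKDTree

    def kl_entropy(x, k=3):
        """Kozachenko-Leonenko entropy estimate (nats) for samples x of shape (n, d)."""
        n, d = x.shape
        r, _ = cKDTree(x).query(x, k=k + 1)  # k+1: the query point itself comes first
        log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log volume of unit ball
        return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r[:, k]))

    x = np.random.default_rng(6).normal(size=(5000, 2))
    print(kl_entropy(x))   # cf. true 2D Gaussian entropy (d/2)*log(2*pi*e) ≈ 2.838
    ```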

  5. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.

    2012-12-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal assumptions regarding the form of the distribution functions of X and Y. We discuss an approach to the estimation problem that is based on asymptotic likelihood considerations. Our results enable us to provide a methodology that can be implemented easily and which yields estimators that are often near optimal when compared to fully parametric methods. We evaluate the performance of the estimators in a series of Monte Carlo simulations. © 2012 Elsevier B.V. All rights reserved.

  6. Nonparametric Maximum Entropy Estimation on Information Diagrams

    CERN Document Server

    Martin, Elliot A; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn

    2016-01-01

    Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We furthe...

  7. Nonparametric estimation of employee stock options

    Institute of Scientific and Technical Information of China (English)

    FU Qiang; LIU Li-an; LIU Qian

    2006-01-01

    We propose a new model to price employee stock options (ESOs). The model is based on nonparametric statistical methods with market data. It incorporates a kernel estimator and employs a three-step method to modify the Black-Scholes formula. The model overcomes the limits of the Black-Scholes formula in handling option prices with varied volatility. It accounts for the effects of ESOs' own characteristics, such as non-tradability, longer terms to expiration, the early exercise feature, restrictions on short selling, and the employee's risk aversion, on the risk-neutral pricing condition, and can be applied to ESO valuation whether the explanatory variable is deterministic or random.

  8. On Parametric (and Non-Parametric Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles–and–Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  9. An Empirical Bayes Enhancement of Mantel-Haenszel DIF Analysis for Computer-Adaptive Tests. LSAC Research Report Series.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy T.

    This study investigated the applicability to computerized adaptive testing (CAT) data of a differential item functioning (DIF) analysis that involves an empirical Bayes (EB) enhancement of the popular Mantel Haenszel (MH) DIF analysis method. The computerized Law School Admission Test (LSAT) assumed for this study was similar to that currently…

  10. Ensemble Empirical Mode Decomposition: Image Data Analysis with White-noise Reflection

    Directory of Open Access Journals (Sweden)

    M. Kopecký

    2010-01-01

    Full Text Available During the last decade, Zhaohua Wu and Norden E. Huang announced a new improvement of the original Empirical Mode Decomposition method (EMD). Ensemble Empirical Mode Decomposition, abbreviated EEMD, represents a major improvement with great versatility and robustness in noisy data filtering. EEMD consists of sifting an ensemble of white noise-added signals and treating the ensemble mean as the final true result. This is due to the use of a finite, not infinitesimal, amplitude of white noise, which forces the ensemble to exhaust all possible solutions in the sifting process. These steps collate signals of different scale in a proper intrinsic mode function (IMF) dictated by the dyadic filter bank. As EEMD is a time-space analysis method, the added white noise is averaged out with a sufficient number of trials. Here, the only persistent part that survives the averaging process is the signal component (original data), which is then treated as the true and more physically meaningful answer. The main purpose of adding white noise is to provide a uniform reference frame in the time-frequency space. The added noise collates the portion of the signal of comparable scale in a single IMF. Image data taken as a time series constitutes a non-stationary and nonlinear process to which the newly proposed EEMD method can be applied. This paper reviews the new approach of using EEMD and demonstrates its use on the example of image data analysis, making use of some advantages of the statistical characteristics of white noise. This approach helps to deal with omnipresent noise.
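
    A minimal sketch of the EEMD procedure just described, assuming the PyEMD package (pip name EMD-signal) exposes an EEMD class with trials/noise_width parameters and an eemd() method as sketched below; the signal is synthetic.

    ```python
    import numpy as np
    from PyEMD import EEMD   # assumed API of the EMD-signal package

    t = np.linspace(0, 1, 1000)
    signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

    # Ensemble of 100 noise-added copies, each sifted into IMFs; the per-IMF
    # ensemble mean is returned, so the added white noise averages out.
    eemd = EEMD(trials=100, noise_width=0.05)
    imfs = eemd.eemd(signal, t)              # rows: IMFs from fine to coarse scale
    print(imfs.shape)
    ```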

  11. A nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  12. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    S. Gugushvili; F. van der Meulen; P. Spreij

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context, whic

  13. Empirical Markov Chain Monte Carlo Bayesian analysis of fMRI data.

    Science.gov (United States)

    de Pasquale, F; Del Gratta, C; Romani, G L

    2008-08-01

    In this work an Empirical Markov Chain Monte Carlo Bayesian approach to analyse fMRI data is proposed. The Bayesian framework is appealing since complex models can be adopted in the analysis both for the image and noise model. Here, the noise autocorrelation is taken into account by adopting an AutoRegressive model of order one and a versatile non-linear model is assumed for the task-related activation. Model parameters include the noise variance and autocorrelation, activation amplitudes and the hemodynamic response function parameters. These are estimated at each voxel from samples of the Posterior Distribution. Prior information is included by means of a 4D spatio-temporal model for the interaction between neighbouring voxels in space and time. The results show that this model can provide smooth estimates from low SNR data while important spatial structures in the data can be preserved. A simulation study is presented in which the accuracy and bias of the estimates are addressed. Furthermore, some results on convergence diagnostic of the adopted algorithm are presented. To validate the proposed approach a comparison of the results with those from a standard GLM analysis, spatial filtering techniques and a Variational Bayes approach is provided. This comparison shows that our approach outperforms the classical analysis and is consistent with other Bayesian techniques. This is investigated further by means of the Bayes Factors and the analysis of the residuals. The proposed approach applied to Blocked Design and Event Related datasets produced reliable maps of activation.

  14. Empirical evidence about inconsistency among studies in a pair-wise meta-analysis.

    Science.gov (United States)

    Rhodes, Kirsty M; Turner, Rebecca M; Higgins, Julian P T

    2016-12-01

    This paper investigates how inconsistency (as measured by the I² statistic) among studies in a meta-analysis may differ, according to the type of outcome data and effect measure. We used hierarchical models to analyse data from 3873 binary, 5132 continuous and 880 mixed outcome meta-analyses within the Cochrane Database of Systematic Reviews. Predictive distributions for inconsistency expected in future meta-analyses were obtained, which can inform priors for between-study variance. Inconsistency estimates were highest on average for binary outcome meta-analyses of risk differences and continuous outcome meta-analyses. For a planned binary outcome meta-analysis in a general research setting, the predictive distribution for inconsistency among log odds ratios had median 22% and 95% CI: 12% to 39%. For a continuous outcome meta-analysis, the predictive distribution for inconsistency among standardized mean differences had median 40% and 95% CI: 15% to 73%. Levels of inconsistency were similar for binary data measured by log odds ratios and log relative risks. Fitted distributions for inconsistency expected in continuous outcome meta-analyses using mean differences were almost identical to those using standardized mean differences. The empirical evidence on inconsistency gives guidance on which outcome measures are most likely to be consistent in particular circumstances and facilitates Bayesian meta-analysis with an informative prior for heterogeneity. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd.
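
    A minimal sketch of the inconsistency statistic the paper models: Cochran's Q from inverse-variance weights, then I² = (Q − df)/Q, here for made-up study log odds ratios and their standard errors.

    ```python
    import numpy as np

    y = np.array([0.2, 0.5, -0.1, 0.4, 0.3])     # study log odds ratios (invented)
    se = np.array([0.15, 0.20, 0.25, 0.10, 0.30])

    w = 1 / se**2                                # inverse-variance weights
    pooled = np.sum(w * y) / np.sum(w)           # fixed-effect pooled estimate
    Q = np.sum(w * (y - pooled) ** 2)            # Cochran's Q
    df = len(y) - 1
    I2 = max(0.0, (Q - df) / Q) * 100
    print(round(I2, 1))                          # percent inconsistency across studies
    ```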

  15. Network Analysis Approach to Stroke Care and Assistance Provision: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Szczygiel Nina

    2017-06-01

    Full Text Available To model and analyse stroke care and assistance provision in the Portuguese context from the network perspective. We used the network theory as a theoretical foundation for the study. The model proposed by Frey et al. (2006 was used to elicit and comprehend possible interactions and relations between organisations expected to be involved in the provision of care and assistance to stroke patients in their pathway to rehabilitation. Providers were identified and contacted to evaluate the nature and intensity of relationships. Network analysis was performed with the NodeXL software package. Analysis of 509 entities based on about 260 000 entries indicates that stroke care provision in the evaluated context is best captured in the coalition-collaboration setting, which appears to best demonstrate the character of the network. Information from analysis of the collaboration stage was not sufficient to determine the network dynamics. Application of the network theory to understand interorganisational dynamics of the complex health care context. Empirical validation of the model proposed by Frey et al. (2006 in terms of its operationalisation and the way it actually reflects the practical context. Examination and analysis of interorganisational relationships and its contribution to management of compound health care context involving actors from various sectors.

  16. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  17. An empirical study to determine the critical success factors of export industry

    Directory of Open Access Journals (Sweden)

    Masoud Babakhani

    2011-01-01

    Exporting goods and services plays an important role in the economies of developing countries. There are many countries whose economies are based solely on exporting raw materials such as oil and gas. Many believe that countries cannot develop their economies as long as they rely on exporting a single group of raw materials. Therefore, there is a need to help other industrial sectors build the infrastructure required for exporting diversified products. In this paper, we perform an empirical analysis to determine the critical success factors for exporting different goods. The results are analyzed using non-parametric statistical methods, and some useful guidelines are suggested.

  18. Comparative empirical analysis of temporal relationships between construction investment and economic growth in the United States

    Directory of Open Access Journals (Sweden)

    Navid Ahmadi

    2017-09-01

    The majority of policymakers believe that investments in construction infrastructure would boost the economy of the United States (U.S.). They also assume that construction investment in infrastructure has a similar impact on the economies of different U.S. states. In contrast, there have been studies showing a negative impact of construction activities on the economy. However, there has not been any research attempt to empirically test the temporal relationships between construction investment and economic growth in the U.S. states, to determine the longitudinal impact of construction investment on the economy of each state. The objective of this study is to investigate whether Construction Value Added (CVA) is a leading (or lagging) indicator of real Gross Domestic Product (real GDP) for every individual state of the U.S. using empirical time series tests. The results of Granger causality tests showed that CVA is a leading indicator of state real GDP in 18 states and the District of Columbia; real GDP is a leading indicator of CVA in 10 states and the District of Columbia. There is a bidirectional relationship between CVA and real GDP in 5 states and the District of Columbia. In 8 states and the District of Columbia, not only do CVA and real GDP have leading/lagging relationships, but they are also cointegrated. These results highlight the important role of the construction industry in these states. The results also show that leading (or lagging) lengths vary for different states. The results of the comparative empirical analysis reject the hypothesis that CVA is a leading indicator of real GDP in the states with the highest shares of construction in real GDP. The findings of this research contribute to the state of knowledge by quantifying the temporal relationships between construction investment and economic growth in the U.S. states. It is expected that the results help policymakers better understand the impact of construction investment
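
    A hedged sketch of the kind of Granger test involved, using the statsmodels package on synthetic series (the variable names and data below are hypothetical; as in the paper, a real application would first check stationarity and cointegration, e.g. with ADF and Johansen tests):

        # Does CVA help predict real GDP beyond GDP's own lags? Toy example.
        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(1)
        cva = rng.normal(size=120).cumsum()            # stand-in for Construction Value Added
        gdp = np.roll(cva, 2) + rng.normal(size=120)   # GDP lags CVA by construction

        # column order: [effect, cause]; H0 is "CVA does not Granger-cause GDP"
        data = np.column_stack([gdp, cva])[2:]         # drop wrap-around from np.roll
        res = grangercausalitytests(data, maxlag=4)
        for lag, out in res.items():
            print(lag, round(out[0]["ssr_ftest"][1], 4))   # p-value per lag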

  19. Exploring the Multi-Scale Statistical Analysis of Ionospheric Scintillation via Wavelets and Empirical Mode Decomposition

    Science.gov (United States)

    Piersanti, Mirko; Materassi, Massimo; Spogli, Luca; Cicone, Antonio; Alberti, Tommaso

    2016-04-01

    Highly irregular fluctuations of the power of trans-ionospheric GNSS signals, namely radio power scintillation, are, at least to a large extent, the effect of ionospheric plasma turbulence, a by-product of the non-linear and non-stationary evolution of the plasma fields defining the Earth's upper atmosphere. One could expect the characteristics of ionospheric turbulence, inter-scale coupling, local randomness and high time variability, to be inherited by the scintillation of radio signals crossing the medium. On this basis, remote sensing of local features of the turbulent plasma can be expected to be feasible through the study of radio scintillation. The dependence of the statistical properties of the medium's fluctuations on the space- and time-scale is the distinctive character of intermittent turbulent media. In this paper, a multi-scale statistical analysis of some samples of GPS radio scintillation is presented: the idea is that assessing how the statistics of signal fluctuations vary with time scale under different helio-geophysical conditions will help in understanding the corresponding multi-scale statistics of the turbulent medium causing the scintillation. In particular, two techniques are tested as multi-scale decomposition schemes for the signals: discrete wavelet analysis and Empirical Mode Decomposition. The results of the two analyses are discussed and compared, highlighting the benefits and limits of each scheme, also under suitably different helio-geophysical conditions.
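
    As a minimal illustration of the wavelet branch of such a multi-scale analysis, the sketch below (assuming the PyWavelets and SciPy packages; the data are synthetic, not GNSS power samples) decomposes a signal and reports fluctuation statistics per scale:

        # Per-scale variance and kurtosis of wavelet detail coefficients.
        import numpy as np
        import pywt
        from scipy.stats import kurtosis

        rng = np.random.default_rng(2)
        signal = np.cumsum(rng.standard_t(df=3, size=4096))   # heavy-tailed toy series

        coeffs = pywt.wavedec(signal, "db4", level=6)
        for j, d in enumerate(coeffs[1:], start=1):           # details, coarsest to finest
            print(f"scale {j}: var={np.var(d):.3g}, kurtosis={kurtosis(d):.2f}")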

  20. Empirical analysis and modeling of errors of atmospheric profiles from GPS radio occultation

    Directory of Open Access Journals (Sweden)

    B. Scherllin-Pirscher

    2011-05-01

    The utilization of radio occultation (RO) data in atmospheric studies requires precise knowledge of error characteristics. We present results of an empirical error analysis of GPS radio occultation (RO) bending angle, refractivity, dry pressure, dry geopotential height, and dry temperature. We find very good agreement between the data characteristics of different missions (CHAMP, GRACE-A, and Formosat-3/COSMIC (F3C)). In the global mean, observational errors (standard deviation from "true" profiles at mean tangent point location) agree within 0.3 % in bending angle, 0.1 % in refractivity, and 0.2 K in dry temperature at all altitude levels between 4 km and 35 km. Above ≈20 km, the observational errors show a strong seasonal dependence at high latitudes. Larger errors occur in hemispheric wintertime and are associated mainly with background data used in the retrieval process. The comparison between UCAR and WEGC results (both data centers have independent inversion processing chains) reveals different magnitudes of observational errors in atmospheric parameters, which are attributable to the different background fields used. Based on the empirical error estimates, we provide a simple analytical error model for GPS RO atmospheric parameters that accounts for vertical, latitudinal, and seasonal variations. In the model, which spans the altitude range from 4 km to 35 km, a constant error is adopted around the tropopause region, amounting to 0.8 % for bending angle, 0.35 % for refractivity, 0.15 % for dry pressure, 10 m for dry geopotential height, and 0.7 K for dry temperature. Below this region the observational error increases following an inverse height power-law, and above it increases exponentially. The observational error model is the same for UCAR and WEGC data, but due to somewhat different error characteristics below about 10 km and above about 20 km some parameters have to be adjusted. Overall, the observational error model is easily applicable and
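
    The three-regime shape of the proposed error model can be sketched as follows for one parameter (the 0.8 % core value for bending angle is quoted from the abstract; the region boundaries, power-law exponent and e-folding scale below are hypothetical placeholders, since the paper fits these per parameter):

        # Illustrative shape of the three-regime RO observational error model.
        import numpy as np

        def bending_angle_error(z, e0=0.8, z_lo=10.0, z_hi=20.0, p=1.5, H=7.0):
            z = np.asarray(z, dtype=float)
            err = np.full_like(z, e0)                       # constant core region
            low, high = z < z_lo, z > z_hi
            err[low] = e0 * (z_lo / z[low]) ** p            # inverse height power-law below
            err[high] = e0 * np.exp((z[high] - z_hi) / H)   # exponential growth above
            return err                                      # percent error

        print(bending_angle_error([5, 15, 30]))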

  1. Genetic diversity analysis of highly incomplete SNP genotype data with imputations: an empirical assessment.

    Science.gov (United States)

    Fu, Yong-Bi

    2014-03-13

    Genotyping by sequencing (GBS) has recently emerged as a promising genomic approach for assessing genetic diversity on a genome-wide scale. However, concerns remain about the uniquely large proportion of missing observations in GBS genotype data. Although genotype imputation methods have been proposed to infer missing observations, little is known about the reliability of a genetic diversity analysis of GBS data with up to 90% of observations missing. Here we performed an empirical assessment of accuracy in the genetic diversity analysis of highly incomplete single nucleotide polymorphism genotypes with imputations. Three large single nucleotide polymorphism genotype data sets for corn, wheat, and rice were acquired; data sets with up to 90% of observations missing were randomly generated and then imputed with three map-independent imputation methods. Estimating heterozygosity and inbreeding coefficients from original, missing, and imputed data revealed variable patterns of bias across the assessed levels of missingness and genotype imputation, but the estimation biases were smaller for missing data without genotype imputation. The estimates of genetic differentiation were rather robust up to 90% of missing observations but became substantially biased when missing genotypes were imputed. The estimates of topology accuracy for four representative samples of the groups of interest generally decreased with increased levels of missing genotypes. Imputation based on probabilistic principal component analysis performed better in terms of topology accuracy than analyses of missing data without genotype imputation. These findings are not only significant for understanding the reliability of genetic diversity analysis with respect to large amounts of missing data and genotype imputation but are also instructive for performing a proper genetic diversity analysis of highly incomplete GBS or other genotype data.

  2. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    Science.gov (United States)

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
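
    A bare-bones sketch of the propensity-score matching step, not the authors' code (variable names are hypothetical; scikit-learn assumed): model the probability of receiving inadequate empirical therapy from covariates, then match each treated episode to its nearest control on the score.

        # Propensity-score estimation and 1:1 nearest-neighbour matching (toy data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(3)
        X = rng.normal(size=(801, 5))                  # covariates (age, Pitt score, ...)
        # treatment (1 = inadequate therapy) depends on the first covariate
        t = (rng.uniform(size=801) < 1.0 / (1.0 + np.exp(-0.8 * X[:, 0]))).astype(int)

        ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
        treated, control = np.where(t == 1)[0], np.where(t == 0)[0]

        nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
        _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
        matched_controls = control[idx.ravel()]        # one control per treated episode
        # outcome comparison (e.g. 30-day mortality) would follow on the matched pairs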

  3. Influence Analysis of a Non-parametric Regression Model with Random Right Censoring

    Institute of Scientific and Technical Information of China (English)

    王淑玲; 冯予; 刘刚

    2012-01-01

    In this paper, the randomly right-censored non-parametric fixed-design regression model is first transformed into a standard non-parametric regression model. Local influence analysis is then carried out for this model, yielding concise formulas for the influence matrix and the direction of maximum influence curvature. Finally, a worked example verifies the effectiveness of the proposed method.

  4. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening, called NIS, is a specific member of the sure independence screening family. Several closely related variable screening procedures are proposed. It is shown that, under general nonparametric models and some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, a data-driven thresholding and an iterative nonparametric independence screening (INIS) are also proposed to enhance the finite sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods.
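
    A toy version of the screening idea follows, with a cubic polynomial standing in for the B-spline smoother of the paper; the n/log n cut-off and all names are illustrative: rank each predictor by the fit of a flexible marginal regression on the response and keep the top-ranked ones.

        # Marginal nonparametric screening sketch (cubic fit as a crude smoother).
        import numpy as np

        rng = np.random.default_rng(4)
        n, p = 200, 1000
        X = rng.normal(size=(n, p))
        y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.5, size=n)

        def marginal_r2(x, y):
            coef = np.polyfit(x, y, deg=3)             # flexible marginal fit
            resid = y - np.polyval(coef, x)
            return 1.0 - resid.var() / y.var()

        scores = np.array([marginal_r2(X[:, j], y) for j in range(p)])
        keep = np.argsort(scores)[::-1][:int(n / np.log(n))]   # illustrative cut-off
        print(sorted(keep[:5]))                        # true signals 0 and 1 should rank high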

  5. EVALUATION OF RUTTING DEPTH IN FLEXIBLE PAVEMENTS BY USING FINITE ELEMENT ANALYSIS AND LOCAL EMPIRICAL MODEL

    Directory of Open Access Journals (Sweden)

    Alaa H. Abed

    2012-01-01

    The objective of this research is to predict rut depth in local flexible pavements. Prediction modelling of pavement performance is the process used to estimate parameter values related to pavement structure, environmental conditions and traffic loading. Different local empirical models, incorporating environmental and traffic conditions, have been used to calculate permanent deformation. Finite element analysis with the ANSYS software is used to analyze a two-dimensional linear elastic plane-strain problem with Plane 82 elements. A standard axle load (ESAL) of 18 kip (80 kN) on an axle with a dual set of tires is used, with a wheel spacing of 13.5 in (343 mm) and a tire contact pressure of 87 psi (0.6 MPa). The pavement system is assumed to be an elastic multi-layer system, with each layer isotropic and homogeneous with a specified resilient modulus and Poisson ratio. Each layer extends to infinity in the horizontal direction and has a finite thickness, except the bottom layer. The analysis shows that, although the stress level decreases by 14% in the leveling course and 27% in the base course, the rut depth increases by 12% and 28% in those layers, respectively, because the material properties change.

  6. Web-based application for Data INterpolation Empirical Orthogonal Functions (DINEOF) analysis

    Science.gov (United States)

    Tomazic, Igor; Alvera-Azcarate, Aida; Barth, Alexander; Beckers, Jean-Marie

    2014-05-01

    DINEOF (Data INterpolating Empirical Orthogonal Functions) is a powerful tool based on EOF decomposition, developed at the University of Liege/GHER, for the reconstruction of missing data in satellite datasets, as well as for the reduction of noise and detection of outliers. DINEOF is openly available as a series of Fortran routines to be compiled by the user, and as binaries (that can be run directly without any compilation) for both Windows and Linux platforms. In order to facilitate the use of DINEOF and increase the number of interested users, we developed a web-based interface for DINEOF with the necessary parameters available to run a high-quality DINEOF analysis. This includes choosing a variable within the selected dataset, defining a domain and time range, setting filtering criteria based on variables available in the dataset (e.g. quality flag, satellite zenith angle …) and defining the necessary DINEOF parameters. Results, including reconstructed data and calculated EOF modes, will be disseminated in NetCDF format using OpenDAP and a WMS server, allowing easy visualisation and analysis. Initially, we will include several satellite datasets of sea surface temperature and chlorophyll concentration obtained from the MyOcean data centre and already remapped to a regular grid (L3C). Later, based on users' requests, we plan to extend the number of datasets available for reconstruction.
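
    The core of a DINEOF-style reconstruction can be sketched in a few lines (illustrative only, not the GHER Fortran code): initialise the gaps, then iterate a truncated-SVD (EOF) reconstruction until the infilled values stabilise; in practice the rank k is chosen by cross-validation rather than fixed.

        # Iterative EOF gap filling in the spirit of DINEOF.
        import numpy as np

        def dineof_fill(field, k=3, n_iter=50):
            X = field.copy()
            miss = np.isnan(X)
            X[miss] = np.nanmean(field)                   # crude initial guess
            for _ in range(n_iter):
                U, s, Vt = np.linalg.svd(X, full_matrices=False)
                recon = (U[:, :k] * s[:k]) @ Vt[:k]       # rank-k EOF reconstruction
                X[miss] = recon[miss]                     # update only the gaps
            return X

        rng = np.random.default_rng(5)
        truth = np.outer(np.sin(np.linspace(0, 3, 60)), np.cos(np.linspace(0, 6, 40)))
        obs = truth + 0.05 * rng.normal(size=truth.shape)
        obs[rng.uniform(size=obs.shape) < 0.3] = np.nan   # 30% missing
        print(np.abs(dineof_fill(obs, k=1) - truth)[np.isnan(obs)].mean())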

  7. Empirical Analysis on Chinese Enterprise-enterprise Patent Cooperation in Food Industry

    Directory of Open Access Journals (Sweden)

    Lanqing Liu

    2015-03-01

    This study investigates the multidisciplinary knowledge network formed by Chinese enterprise-enterprise patent cooperation in the food industry. The multidisciplinary knowledge network that emerges from enterprise-enterprise patent cooperation is an important platform for improving innovation performance and implementing multidisciplinary, multi-sectoral cooperation. The dynamic variation and structural characteristics of the network, as well as the distribution of disciplines within it, are analyzed using social network analysis, based on 36,731 enterprise-enterprise cooperation patents in the food industry obtained from the State Intellectual Property Office of the People's Republic of China for 1985 to 2010. The results show that the structure of the multidisciplinary knowledge network is comparatively well developed and that its carriers, the enterprises, play a positive role in knowledge flow and innovation performance in food development. The method and results of this study provide both government and enterprises with a theoretical reference and empirical material for their innovation strategies.

  8. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    Science.gov (United States)

    Wen-Bo, Wang; Xiao-Dong, Zhang; Yuchan, Chang; Xiang-Li, Wang; Zhao, Wang; Xi, Chen; Lei, Zheng

    2016-01-01

    In this paper, a new method to reduce noise within chaotic signals based on ICA (independent component analysis) and EMD (empirical mode decomposition) is proposed. The basic idea is first to decompose the chaotic signal and construct multidimensional input vectors, based on EMD and its translation invariance. Secondly, independent component analysis is performed on the input vectors, so that self-adaptive denoising is carried out for the intrinsic mode functions (IMFs) of the chaotic signal. Finally, the IMFs are recombined into the denoised chaotic signal. Experiments were carried out on a Lorenz chaotic signal contaminated with Gaussian noise at different levels and on the monthly observed chaotic sunspot series. The results show that the proposed method is effective for denoising chaotic signals. Moreover, it effectively corrects the centre point in phase space, bringing it closer to the true track of the chaotic attractor. Project supported by the National Science and Technology, China (Grant No. 2012BAJ15B04), the National Natural Science Foundation of China (Grant Nos. 41071270 and 61473213), the Natural Science Foundation of Hubei Province, China (Grant No. 2015CFB424), the State Key Laboratory Foundation of Satellite Ocean Environment Dynamics, China (Grant No. SOED1405), the Hubei Provincial Key Laboratory Foundation of Metallurgical Industry Process System Science, China (Grant No. Z201303), and the Hubei Key Laboratory Foundation of Transportation Internet of Things, Wuhan University of Technology, China (Grant No. 2015III015-B02).
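
    A greatly simplified sketch of the EMD-plus-ICA idea follows (assuming the PyEMD and scikit-learn packages; the paper's circulate-translating step and its component-selection rule are omitted, and the noise-identification heuristic below, dropping the component with the smallest lag-1 autocorrelation, is an assumption):

        # Decompose into IMFs, run ICA across them, zero the most noise-like
        # component, and rebuild the signal.
        import numpy as np
        from PyEMD import EMD
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(6)
        t = np.linspace(0, 8 * np.pi, 2000)
        clean = np.sin(t) + 0.5 * np.sin(3.3 * t)
        noisy = clean + 0.4 * rng.normal(size=t.size)

        imfs = EMD().emd(noisy)                        # (n_imfs, n_samples)
        ica = FastICA(n_components=imfs.shape[0], random_state=0)
        S = ica.fit_transform(imfs.T)                  # independent components

        lag1 = [abs(np.corrcoef(S[:-1, k], S[1:, k])[0, 1]) for k in range(S.shape[1])]
        S[:, int(np.argmin(lag1))] = 0.0               # drop the most noise-like component
        denoised = ica.inverse_transform(S).T.sum(axis=0)
        print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))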

  9. Empirical analysis of Macedonian export structure: The role of metal industry

    Directory of Open Access Journals (Sweden)

    Lazarov Darko

    2017-01-01

    The paper investigates the Macedonian export structure and export performance using several indexes: product complexity, export sophistication, export diversity, and export ubiquity, for the period 1995-2014. The empirical analysis applies the product space methodology (based on network theory) developed by Hidalgo et al. (2007), with data from the UN Comtrade database. Moreover, the paper analyzes the performance and productivity of the metal industry, one of the biggest exporting sectors, and investigates its capacity for further product diversification. The estimated results indicate that the Macedonian export basket is highly concentrated (the top ten products account for nearly half of the country's total exports, and more than 70 percent of exports go to just a few countries) and, more importantly, that the export structure is composed of products with a low complexity level. Additionally, the results show that the export structure is poorly diversified, making the economy very sensitive to external shocks. On the other hand, the analysis of metal industry performance indicates that this industry has a comparative advantage (RCA = 2.16) and, more importantly, a strong capacity for further diversification by building new capabilities for the production of more complex products. The suggestion to policymakers in Macedonia is therefore to create an active industrial policy that will stimulate the structural transformation of the metal industry towards products with higher value added.
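
    The quoted RCA figure is a Balassa revealed comparative advantage index (RCA > 1 signals comparative advantage), which is straightforward to compute; the trade values below are hypothetical, chosen only to land near the paper's RCA = 2.16 for metals.

        # Balassa RCA: country's export share in a product over the world's share.
        def rca(country_export_i, country_export_total,
                world_export_i, world_export_total):
            country_share = country_export_i / country_export_total
            world_share = world_export_i / world_export_total
            return country_share / world_share

        print(round(rca(450e6, 4.0e9, 0.9e12, 17.3e12), 2))   # hypothetical figures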

  10. Promoting Sustainability Transparency in European Local Governments: An Empirical Analysis Based on Administrative Cultures

    Directory of Open Access Journals (Sweden)

    Andrés Navarro-Galera

    2017-03-01

    Nowadays, the transparency of governments with respect to the sustainability of public services is an issue of great interest to stakeholders and academics. It has led previous research and international organisations (EU, IMF, OECD, United Nations, IFAC, G-20, World Bank) to recommend promoting the online dissemination of economic, social and environmental information. Based on previous studies of e-government and of the influence of administrative cultures on governmental accountability, this paper seeks to identify policy actions useful for improving transparency practices on economic, social and environmental sustainability in European local governments. We perform a comparative analysis of sustainability information published on the websites of 72 local governments in 10 European countries, grouped into three main cultural contexts (Anglo-Saxon, Southern European and Nordic). Using international sustainability reporting guidelines, our results reveal significant differences in local government transparency across these contexts. The most transparent local governments are the Anglo-Saxon ones, followed by Southern European and Nordic governments. Based on individualized empirical results for each administrative style, our conclusions propose useful policy interventions to enhance sustainability transparency within each cultural tradition, such as the development of legal rules on transparency and sustainability, tools to motivate local managers to disseminate sustainability information online, and analysis of stakeholders' information needs.

  11. Research on Browsing Behavior in the Libraries: An Empirical Analysis of Consequences, Success and Influences

    Directory of Open Access Journals (Sweden)

    Shan-Ju L. Chang

    2000-12-01

    Browsing, as an important part of human information behavior, has been observed and investigated in the context of information seeking in libraries in general and has assumed greater importance in human-machine interaction in particular. However, the nature and consequences of browsing are not well understood, and little is known about the success rate of such behavior. In this research, exploratory empirical case studies in three types of libraries were conducted, using questionnaires, observation logs, interviews, and computer search logs, to derive empirical evidence for understanding, from the user's point of view, the consequences of browsing, what constitutes successful browsing, and what factors influence the extent of browsing. Content analysis and statistical analysis were conducted to analyze and synthesize the data. The research results show: (1) There are nine categories of consequences of browsing, including accidental findings, modification of the information need, finding the desired information, learning, relaxation/recreation, information gathering, keeping updated, satisfying curiosity, and not finding what is needed. (2) Four factors produce successful browsing: intention, the amount or quality of information, the utility of what is found, and help in solving a problem or making a judgment. (3) There are three types of reasons for unsuccessful browsing experiences: not finding what one wanted, inadequate volume or quality of information, and not finding anything useful or interesting. (4) There are three types of reasons for partial success: finding the intended object but being unhappy with the quality or amount of information in it; not finding what one wanted but discovering new or potentially useful information; and not accomplishing one purpose but achieving another, given multiple purposes. (5) The influential factors that affect the extent to which one engages in browsing include the browser's time, the scheme of information organization, proximity to

  12. On the relationship between fiscal plans in the European Union: an empirical analysis based on real-time data

    NARCIS (Netherlands)

    M. Giuliodori; R. Beetsma

    2008-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public i

  13. Unpredictable bias when using the missing indicator method or complete case analysis for missing confounder values: an empirical example.

    NARCIS (Netherlands)

    Knol, M.J.; Janssen, K.J.; Donders, A.R.T.; Egberts, A.C.G.; Heerdink, E.R.; Grobbee, D.E.; Moons, K.G.; Geerlings, M.I.

    2010-01-01

    OBJECTIVE: Missing indicator method (MIM) and complete case analysis (CC) are frequently used to handle missing confounder data. Using empirical data, we demonstrated the degree and direction of bias in the effect estimate when using these methods compared with multiple imputation (MI). STUDY DESIGN

  14. Bayesian nonparametric adaptive control using Gaussian processes.

    Science.gov (United States)

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become ineffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.
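
    The modelling core of GP-MRAC, a Gaussian process as a nonparametric model of the uncertainty, can be illustrated with scikit-learn (a stand-in sketch only; the adaptive-control loop, online sparsification and stability machinery of the paper are not shown):

        # GP regression of an unknown uncertainty Delta(x), with predictive std.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(7)
        X = rng.uniform(-3, 3, size=(40, 1))
        y = np.sin(2 * X).ravel() + 0.1 * rng.normal(size=40)   # noisy Delta(x) samples

        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)
        mu, sd = gp.predict(np.array([[0.5]]), return_std=True)
        print(mu[0], sd[0])           # point estimate plus uncertainty, no fixed centers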

  15. Nonparametric Detection of Geometric Structures Over Networks

    Science.gov (United States)

    Zou, Shaofeng; Liang, Yingbin; Poor, H. Vincent

    2017-10-01

    Nonparametric detection of the existence of an anomalous structure over a network is investigated. Nodes corresponding to the anomalous structure (if one exists) receive samples generated by a distribution q, which is different from the distribution p generating samples for the other nodes. If an anomalous structure does not exist, all nodes receive samples generated by p. It is assumed that the distributions p and q are arbitrary and unknown. The goal is to design statistically consistent tests with probability of error converging to zero as the network size becomes asymptotically large. Kernel-based tests are proposed based on the maximum mean discrepancy, which measures the distance between mean embeddings of distributions in a reproducing kernel Hilbert space. Detection of an anomalous interval over a line network is studied first. Sufficient conditions on the minimum and maximum sizes of candidate anomalous intervals are characterized in order to guarantee that the proposed test is consistent. It is also shown that certain necessary conditions must hold to guarantee that any test is universally consistent. Comparison of the sufficient and necessary conditions shows that the proposed test is order-level optimal in terms of the minimum size, and nearly optimal in terms of the maximum size, of candidate anomalous intervals. Generalization of the results to other networks is further developed. Numerical results are provided to demonstrate the performance of the proposed tests.
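
    The statistic at the heart of such tests, an unbiased estimate of the squared maximum mean discrepancy with a Gaussian kernel, is easy to state in code (a bare-bones sketch; bandwidth selection and the scan over candidate anomalous intervals are omitted):

        # Unbiased MMD^2 estimate between two 1-D samples.
        import numpy as np

        def mmd2_unbiased(x, y, bw=1.0):
            def k(a, b):
                d2 = (a[:, None] - b[None, :]) ** 2
                return np.exp(-d2 / (2 * bw ** 2))
            m, n = len(x), len(y)
            Kxx, Kyy, Kxy = k(x, x), k(y, y), k(x, y)
            np.fill_diagonal(Kxx, 0.0)                 # drop i = j terms (unbiased form)
            np.fill_diagonal(Kyy, 0.0)
            return (Kxx.sum() / (m * (m - 1)) + Kyy.sum() / (n * (n - 1))
                    - 2.0 * Kxy.mean())

        rng = np.random.default_rng(8)
        p = rng.normal(0, 1, 500)                      # typical nodes
        q = rng.normal(1, 1, 500)                      # anomalous nodes
        print(mmd2_unbiased(p, q), mmd2_unbiased(p, rng.normal(0, 1, 500)))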

  16. Nonparametric Bayesian inference of the microcanonical stochastic block model

    Science.gov (United States)

    Peixoto, Tiago P.

    2017-01-01

    A principled approach to characterize the hidden modular structure of networks is to formulate generative models and then infer their parameters from data. When the desired structure is composed of modules or "communities," a suitable choice for this task is the stochastic block model (SBM), where nodes are divided into groups, and the placement of edges is conditioned on the group memberships. Here, we present a nonparametric Bayesian method to infer the modular structure of empirical networks, including the number of modules and their hierarchical organization. We focus on a microcanonical variant of the SBM, where the structure is imposed via hard constraints, i.e., the generated networks are not allowed to violate the patterns imposed by the model. We show how this simple model variation allows simultaneously for two important improvements over more traditional inference approaches: (1) deeper Bayesian hierarchies, with noninformative priors replaced by sequences of priors and hyperpriors, which not only remove limitations that seriously degrade the inference on large networks but also reveal structures at multiple scales; (2) a very efficient inference algorithm that scales well not only for networks with a large number of nodes and edges but also with an unlimited number of modules. We show also how this approach can be used to sample modular hierarchies from the posterior distribution, as well as to perform model selection. We discuss and analyze the differences between sampling from the posterior and simply finding the single parameter estimate that maximizes it. Furthermore, we expose a direct equivalence between our microcanonical approach and alternative derivations based on the canonical SBM.
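
    In practice this method is exposed through the author's graph-tool library; a minimal usage sketch follows (call names as in recent graph-tool versions; they may vary across releases):

        # Nested microcanonical SBM inference on a small example network.
        import graph_tool.all as gt

        g = gt.collection.data["polbooks"]             # bundled example network
        state = gt.minimize_nested_blockmodel_dl(g)    # fit the nested SBM
        state.print_summary()                          # modules per hierarchy level
        print(state.entropy())                         # description length of the fit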

  17. Analysis of surface soil moisture patterns in agricultural landscapes using empirical orthogonal functions

    Directory of Open Access Journals (Sweden)

    W. Korres

    2009-08-01

    Soil moisture is one of the fundamental variables in hydrology, meteorology and agriculture. Nevertheless, its spatio-temporal patterns in agriculturally used landscapes, affected by multiple natural (rainfall, soil, topography, etc.) and agronomic (fertilisation, soil management, etc.) factors, are often not well known. The aim of this study is to determine the dominant factors governing the spatio-temporal patterns of surface soil moisture in a grassland and an arable land test site within the Rur catchment in Western Germany. Surface soil moisture (0–6 cm) was measured on an approx. 50×50 m grid at 14 and 17 dates (May 2007 to November 2008) in the two test sites. To analyse spatio-temporal patterns of surface soil moisture, an Empirical Orthogonal Function (EOF) analysis was applied, and the results were correlated with parameters derived from topography, soil, vegetation and land management to connect the patterns to related factors and processes. For the grassland test site, the analysis yields one significant spatial structure (first EOF), which explains about 57.5% of the spatial variability and is connected to soil properties and topography. The weight of the first spatial EOF is stronger on wet days. The highest temporal variability is found in locations with a high percentage of soil organic carbon (SOC). For the arable land test site, the analysis yields two significant spatial structures: the first EOF, explaining 38.4% of the spatial variability, shows a highly significant correlation with soil properties, namely soil texture; the second EOF, explaining 28.3% of the spatial variability, is connected to differences in land management. Soil moisture in the arable land test site varies more during dry and wet periods at locations with low porosity.

  18. Weak Form Efficiency of the Nigerian Stock Market: An Empirical Analysis (1984 – 2009

    Directory of Open Access Journals (Sweden)

    Pyemo Afego

    2012-01-01

    This paper examines the weak form of the efficient markets hypothesis for the Nigerian Stock Exchange (NSE) by testing for random walks in monthly index returns over the period 1984-2009. The results of the non-parametric runs test show that index returns on the NSE display a predictable component, suggesting that traders can earn superior returns by employing trading rules. Statistically significant deviations from randomness are also suggestive of sub-optimal allocation of investment capital within the economy. The findings, in general, contradict the weak form of the efficient markets hypothesis, and a range of policy strategies for improving the allocative capacity and quality of the information environment of the NSE are discussed.
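
    The Wald-Wolfowitz runs test used here is simple to implement; the sketch below applies it to synthetic returns classified by sign relative to the median (the data are illustrative, not NSE returns):

        # Runs test for randomness of a return series.
        import numpy as np
        from math import erf, sqrt

        def runs_test(returns):
            signs = returns > np.median(returns)
            n1, n2 = signs.sum(), (~signs).sum()
            runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
            mu = 2 * n1 * n2 / (n1 + n2) + 1                  # expected runs
            var = (mu - 1) * (mu - 2) / (n1 + n2 - 1)         # runs variance
            z = (runs - mu) / sqrt(var)
            return z, 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p

        rng = np.random.default_rng(9)
        z, p = runs_test(rng.normal(size=312))      # ~312 monthly returns, 1984-2009
        print(round(z, 2), round(p, 3))             # randomness not rejected for iid noise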

  19. A Comparison of Parametric and Non-Parametric Methods Applied to a Likert Scale.

    Science.gov (United States)

    Mircioiu, Constantin; Atkinson, Jeffrey

    2017-05-10

    A trenchant and passionate dispute over the use of parametric versus non-parametric methods for the analysis of Likert scale ordinal data has raged for the past eight decades. The answer is not a simple "yes" or "no" but is related to hypotheses, objectives, risks, and paradigms. In this paper, we took a pragmatic approach. We applied both types of methods to the analysis of actual Likert data on responses from different professional subgroups of European pharmacists regarding competencies for practice. Results obtained show that with "large" (>15) numbers of responses and similar (but clearly not normal) distributions from different subgroups, parametric and non-parametric analyses give in almost all cases the same significant or non-significant results for inter-subgroup comparisons. Parametric methods were more discriminant in the cases of non-similar conclusions. Considering that the largest differences in opinions occurred in the upper part of the 4-point Likert scale (ranks 3 "very important" and 4 "essential"), a "score analysis" based on this part of the data was undertaken. This transformation of the ordinal Likert data into binary scores produced a graphical representation that was visually easier to understand as differences were accentuated. In conclusion, in this case of Likert ordinal data with high response rates, restraining the analysis to non-parametric methods leads to a loss of information. The addition of parametric methods, graphical analysis, analysis of subsets, and transformation of data leads to more in-depth analyses.
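
    The paper's core comparison can be reproduced in miniature: run a parametric and a non-parametric test on the same ordinal responses and compare verdicts (synthetic 4-point Likert data for two professional subgroups; SciPy assumed):

        # t-test versus Mann-Whitney U on the same ordinal data.
        import numpy as np
        from scipy.stats import ttest_ind, mannwhitneyu

        rng = np.random.default_rng(10)
        group_a = rng.choice([1, 2, 3, 4], size=60, p=[0.1, 0.2, 0.4, 0.3])
        group_b = rng.choice([1, 2, 3, 4], size=60, p=[0.2, 0.3, 0.3, 0.2])

        t_p = ttest_ind(group_a, group_b).pvalue            # parametric
        u_p = mannwhitneyu(group_a, group_b).pvalue         # non-parametric
        print(round(t_p, 4), round(u_p, 4))                 # usually the same verdict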

  20. Development of Items for a Pedagogical Content Knowledge Test Based on Empirical Analysis of Pupils' Errors

    Science.gov (United States)

    Jüttner, Melanie; Neuhaus, Birgit J.

    2012-05-01

    In view of the lack of instruments for measuring biology teachers' pedagogical content knowledge (PCK), this article reports on a study about the development of PCK items for measuring teachers' knowledge of pupils' errors and ways for dealing with them. This study investigated 9th and 10th grade German pupils' (n = 461) drawings in an achievement test about the knee-jerk in biology, which were analysed by using the inductive qualitative analysis of their content. The empirical data were used for the development of the items in the PCK test. The validation of the items was determined with think-aloud interviews of German secondary school teachers (n = 5). If the item was determined, the reliability was tested by the results of German secondary school biology teachers (n = 65) who took the PCK test. The results indicated that these items are satisfactorily reliable (Cronbach's alpha values ranged from 0.60 to 0.65). We suggest a larger sample size and American biology teachers be used in our further studies. The findings of this study about teachers' professional knowledge from the PCK test could provide new information about the influence of teachers' knowledge on their pupils' understanding of biology and their possible errors in learning biology.

  1. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    Science.gov (United States)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
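
    The empirical hazard-curve step reduces to counting exceedances: with a catalogue of simulated events spanning an equivalent duration, the mean annual rate of exceeding a given inundation depth is the exceedance count divided by that duration. All numbers below are hypothetical.

        # Empirical hazard curve from simulated inundation depths.
        import numpy as np

        rng = np.random.default_rng(11)
        T_eq = 10000.0                                 # equivalent catalogue length (yr)
        depths = rng.lognormal(mean=0.0, sigma=1.0, size=2500)   # simulated depths (m)

        levels = np.linspace(0.5, 8.0, 16)
        rates = [(depths > h).sum() / T_eq for h in levels]      # annual exceedance rate
        for h, r in zip(levels, rates):
            rp = 1 / r if r else float("inf")                    # return period
            print(f"h = {h:4.1f} m  rate = {r:.5f} /yr  (return period {rp:.0f} yr)")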

  2. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  3. An Empirical Analysis of Performance Evaluation of University Teachers Based on KPI

    Directory of Open Access Journals (Sweden)

    Zhu Chuanshu

    2015-01-01

    Beyond producing influential graduates, the core development of universities requires improving the scientific allocation of teacher resources, and the element of teacher human resource management most relevant to this is the salary incentive system. Since salary incentives rest on performance appraisal, a sound and well-designed performance appraisal system for university teachers is conducive to the strategic development of the university. This article uses KPI theory to design indices for the qualities of university teachers, aiming to abstract the key qualities, construct a KPI performance appraisal system complying with the 80/20 principle, and provide a theoretical basis for improving human resource systems in colleges and promoting the development of universities. The article identifies five key types of quality of college teachers, along with refinable, quantifiable and behavioural key qualities, and empirically determines the weights of the corresponding indices through questionnaire surveys, expert interviews and the AHP analysis method.

  4. An empirical analysis of the impact of renewable energy deployment on local sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Del Rio, Pablo [Institute for Public Goods and Policies (IPP), Centro de Ciencias Humanas y Sociales, Consejo Superior de Investigaciones Cientificas (CSIC), C/Albasanz 26-28, 28037 Madrid (Spain); Burguillo, Mercedes [Facultad de Ciencias Economicas y Empresariales, Universidad de Alcala, Pza. de la Victoria 3, 28802 Alcala de Henares, Madrid (Spain)

    2009-08-15

    It is usually mentioned that renewable energy sources (RES) have a large potential to contribute to the sustainable development of specific territories by providing them with a wide variety of socioeconomic benefits, including diversification of energy supply, enhanced regional and rural development opportunities, creation of a domestic industry and employment opportunities. The analysis of these benefits has usually been too general (i.e., mostly at the national level) and a focus on the regional and especially the local level has been lacking. This paper empirically analyses those benefits, by applying a conceptual and methodological framework previously developed by the authors to three renewable energy technologies in three different places in Spain. With the help of case studies, the paper shows that the contribution of RES to the economic and social dimensions of sustainable development might be significant. Particularly important is employment creation in these areas. Although, in absolute terms, the number of jobs created may not be high, it may be so with respect to the existing jobs in the areas considered. Socioeconomic benefits depend on several factors, and not only on the type of renewable energy, as has usually been mentioned. The specific socioeconomic features of the territories, including the productive structure of the area, the relationships between the stakeholders and the involvement of the local actors in the renewable energy project may play a relevant role in this regard. Furthermore, other local (socioeconomic) sustainability aspects beyond employment creation should be considered. (author)

  5. Review of U.S. ESCO industry market trends: An empirical analysis of project data

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles A.; Hopper, Nicole C.; Osborn, Julie G.; Singer, Terry E.

    2003-03-01

    This article summarizes a comprehensive empirical analysis of U.S. Energy Service Company (ESCO) industry trends and performance. We employ two parallel analytical approaches: a comprehensive survey of firms to estimate total industry size and a database of ≈1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US $2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling "energy solutions," with energy efficiency part of a package. We conclude that a private sector energy-efficiency services industry that targets large commercial and industrial customers is viable and self-sustaining with appropriate policy support, both financial and non-financial.

  6. The Determinants of the Global Mobile Telephone Deployment: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Sheikh Taher ABU

    2010-01-01

    This study analyzes global mobile telephony by examining the instruments stimulating its diffusion pattern. A rigorous demand model is estimated using a global mobile telecommunications panel dataset comprising 51 countries, classified according to World Bank income categories, from 1990 to 2007. In particular, the paper examines which factors contribute the most to the deployment of mobile telephony worldwide. To construct the econometric model, the number of mobile phone subscribers per 100 inhabitants is taken as the dependent variable, while the following groups of variables are selected as independent variables: (1) GDP per capita and charges, (2) competition policies, (3) telecom infrastructure, (4) technological innovations, and (5) others. Estimation results reveal substantial disparity among groups. Income and own-price elasticities (with respect to call rates and subscription charges) are also reported. The analysis of impulse responses for price, competition policies, and technological innovations such as digitalization of the mobile network and mobile network coverage indicates that substantial mobile telephone growth is yet to be realized, especially in developing countries. A new and important empirical finding is that many opportunities remain for mobile phone development in the world's poorer nations through the provision of better telecom infrastructure.

  7. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    Science.gov (United States)

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes it difficult to characterize and evaluate the approach. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions of EMD tend to perform poorly and are very time consuming. In this paper, an extension of the PDE-based approach to 2-D space is therefore described in detail. The approach has been applied to both signal and image decomposition. The results obtained confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, and some results are provided for the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.

  8. Anterior temporal face patches: A meta-analysis and empirical study

    Directory of Open Access Journals (Sweden)

    Rebecca J. Von Der Heide

    2013-02-01

    Studies of nonhuman primates have reported face-sensitive patches in the ventral anterior temporal lobes (ATL). In humans, ATL resection or damage causes an associative prosopagnosia in which face perception is intact but face memory is compromised. Some fMRI studies have extended these findings using famous and familiar faces. However, it is unclear whether these regions in the human ATL are in locations comparable to those reported in non-human primates, which typically used unfamiliar faces. We present the results of two studies of person memory: a meta-analysis of existing fMRI studies and an empirical fMRI study using optimized imaging parameters. Both studies showed left-lateralized ATL activations to familiar individuals, while novel faces activated the right ATL. Activations to famous faces were quite ventral, similar to what has been reported in monkeys. These findings suggest that face-memory-sensitive patches in the human ATL lie in the ventral/polar ATL.

  9. Differences in the association between empirically derived dietary patterns and cancer: a meta-analysis.

    Science.gov (United States)

    Bella, Francesca; Godos, Justyna; Ippolito, Antonella; Di Prima, Alessia; Sciacca, Salvatore

    2017-06-01

    Plant-based dietary patterns have been associated with decreased cancer risk. The aim of the present study was to perform a meta-analysis of studies comparing empirically derived dietary patterns in relation to cancer risk. A systematic search of the PubMed and EMBASE electronic databases was conducted. Eligible studies had an observational design and evaluated the association between a posteriori derived dietary patterns and cancer risk. Random-effects models were applied to calculate relative risks (RRs) of cancer between diets. Statistical heterogeneity and publication bias were explored. An increased risk of cancer was found for the adoption of high-meat compared to plant-based dietary patterns (RR = 1.64, 95% CI: 1.02, 2.63). A lower risk of cancer was found for individuals adopting a plant-based dietary pattern over a mixed one (RR = 0.88, 95% CI: 0.82, 0.95). In conclusion, plant-based dietary patterns can be considered a healthy choice over meat-based dietary patterns.

  10. Operational Practices and Financial Performance: an Empirical Analysis of Brazilian Manufacturing Companies

    Directory of Open Access Journals (Sweden)

    André Luís de Castro Moura Duarte

    2011-10-01

    In the operations management field, operational practices such as total quality management or just in time have been seen as ways to improve operational performance and, ultimately, financial performance. Empirical support for this effect of operational practices on financial performance has, however, been limited due to research design and the inherent difficulties of using performance as a dependent variable. In this paper, we tested the relationship between selected operational practices (quality management, just in time, ISO certification and services outsourcing) and the financial performance outcomes of profitability and growth. A sample of 1200 firms operating in São Paulo, Brazil, was used. Analysis using multiple regression explored the direct effect of practices and their interaction with industry dummies. Results did not support the existence of a positive relationship with financial performance. A negative relationship of outsourcing with both profitability and growth was found, supporting some critical views of the outsourcing practice. A weaker negative relationship between ISO certification and growth was also found. Some interactions between practices and industries were also significant, with mixed results, indicating that the effect of practices on performance might be context dependent.

  11. An empirical analysis of the required management skills in the core employees' identification

    Directory of Open Access Journals (Sweden)

    Natalia García Carbonell

    2016-01-01

    The current study empirically analyses the influence of top management team human capital attributes on one of the most relevant stages in the formulation of human resource management strategy: the identification of core employees. Drawing on recent calls from the strategic human resource management literature, this study proposes a "process" perspective instead of the traditional "content" analysis, with the intention of going a step further into the internal dynamics of these strategic processes. Applying structural equation modeling via Partial Least Squares (PLS) to a sample of 120 Spanish firms, the results reveal that critical human resource identification processes demand mixed cognitive skills, both rational and creative, in order to complete the different steps of the process efficiently. Consequently, to reach a balanced combination of these skills, collectivistic dynamics are needed, fostering cooperative and collaborative decision-making processes. In this context, HR managers participate by improving the process with their expert power and by carrying out technical HR activities; subsequently, the HR information is integrated into the strategic decision-making process with the rest of the team. In addition, interesting professional implications arise from the study in relation to the presence of cognitive diversity in top management teams.

  12. Regional Financial Development and Regional Economic Growth: An Empirical Analysis of Suzhou City, China

    Institute of Scientific and Technical Information of China (English)

    LIU Yong; LI Weiping

    2010-01-01

    There are many defects in existing research on the relationship between regional financial development (FD) and economic growth in China, such as simply assuming the direction of causality, not highlighting financial institutions, and using incomplete financial indicators. Taking Suzhou City of Jiangsu Province, China as a case, this article builds a simple model to study the level of FD from the three aspects of financial scale, structure and institution. Three original indicators, PRIVY (private investment/aggregate investment), DEPTH (aggregate loans/GDP) and FDIVG (FDI/GDP), are used to construct a composite FD indicator through a principal component analysis approach. Granger causality tests are then used to analyze the relationship between FD and the economic growth of Suzhou. Empirical test results show that FD in Suzhou Granger-causes economic growth, while economic growth does not Granger-cause FD, because the relationship between FD and economic growth in Suzhou is still in the "supply-leading" period. Based on the Suzhou experience, local government should strengthen the protection of private investment, improve the institutional environment, and establish a reasonable financial structure. It can be concluded that FD can play a great role in promoting economic growth at the economic takeoff stage.

  13. Investigating properties of the cardiovascular system using innovative analysis algorithms based on ensemble empirical mode decomposition.

    Science.gov (United States)

    Yeh, Jia-Rong; Lin, Tzu-Yu; Chen, Yun; Sun, Wei-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2012-01-01

    The cardiovascular system is known to be nonlinear and nonstationary. Traditional linear algorithms for assessing arterial stiffness and systemic resistance of the cardiac system suffer from this nonstationarity or are inconvenient in practical applications. In this pilot study, two new assessment methods were developed: the first is an ensemble empirical mode decomposition based reflection index (EEMD-RI), while the second is based on the phase shift between ECG and BP on cardiac oscillation. Both methods utilise the EEMD algorithm, which is suitable for nonlinear and nonstationary systems. These methods were used to investigate the properties of arterial stiffness and systemic resistance for a pig's cardiovascular system via ECG and blood pressure (BP). This experiment simulated a sequence of continuous changes of blood pressure from a steady condition to high blood pressure by clamping the artery, and the inverse by relaxing the artery. As a hypothesis, the arterial stiffness and systemic resistance should vary with the blood pressure due to clamping and relaxing the artery. The results show statistically significant correlations between BP, the EEMD-based RI, and the phase shift between ECG and BP on cardiac oscillation. The two sets of assessment results demonstrate the merits of the EEMD for signal analysis.
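
    A minimal sketch of the decomposition step underlying both indices, assuming the PyEMD package (published on PyPI as EMD-signal); the blood-pressure trace is a toy signal, and selecting the cardiac component by dominant frequency is our simplification, not the authors' exact procedure:

```python
import numpy as np
from PyEMD import EEMD  # pip install EMD-signal

fs = 100.0                                   # toy sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
bp = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)  # toy BP trace

eemd = EEMD(trials=50, noise_width=0.2)      # noise-assisted ensemble of EMDs
imfs = eemd.eemd(bp, t)                      # rows are intrinsic mode functions

# Pick the IMF whose dominant frequency lies closest to the cardiac band;
# indices such as the reflection index would then be computed from it.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dom = [freqs[np.argmax(np.abs(np.fft.rfft(m)))] for m in imfs]
cardiac_imf = imfs[int(np.argmin([abs(f - 1.2) for f in dom]))]
print(imfs.shape, cardiac_imf.shape)
```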

  14. Empirical Analysis of the Impact of Adjustment of Agricultural Structure on Agricultural Economic Growth in Xinjiang

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    We conduct an empirical analysis of the contribution of the various sectors of agriculture in Xinjiang to agricultural economic growth, and of the impact of adjustment of these sectors on agricultural economic growth. The results show that the growth of farming has the greatest force in driving the growth of total agricultural output in Xinjiang, followed by animal husbandry; the rate of contribution of these two production sectors, farming and animal husbandry, not only shows high-frequency fluctuation, but also shows reverse fluctuation; the effect arising from adjustment of farming is gradually spreading to animal husbandry, forestry and fishery one by one, but the spreading rate is low. Finally, countermeasures and proposals are put forward to further adjust the agricultural structure and promote the agricultural economic growth of Xinjiang under the Twelfth Five-Year Plan, as follows: adjust the industrial structure steadily, address industry convergence and broaden the income-increase channels for farmers; strengthen the input to adjustment of the agricultural structure; stick to the combination of internal and external adjustment of the agricultural industry.

  15. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Directory of Open Access Journals (Sweden)

    Rianne M. Bijlsma

    2011-03-01

    Full Text Available Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further the insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how actors' ways of handling uncertainty influence policy development. Our findings suggest that the wider frame adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. Firstly, the participatory policy was less adaptive than the expert-based policy. The participants' observed low tolerance for process uncertainty made them opt for a rigorous "once and for all" settling of the conflict. Secondly, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This

  16. Empirical analysis and modeling of errors of atmospheric profiles from GPS radio occultation

    Directory of Open Access Journals (Sweden)

    U. Foelsche

    2011-09-01

    Full Text Available The utilization of radio occultation (RO data in atmospheric studies requires precise knowledge of error characteristics. We present results of an empirical error analysis of GPS RO bending angle, refractivity, dry pressure, dry geopotential height, and dry temperature. We find very good agreement between data characteristics of different missions (CHAMP, GRACE-A, and Formosat-3/COSMIC (F3C. In the global mean, observational errors (standard deviation from "true" profiles at mean tangent point location agree within 0.3% in bending angle, 0.1% in refractivity, and 0.2 K in dry temperature at all altitude levels between 4 km and 35 km. Above 35 km the increase of the CHAMP raw bending angle observational error is more pronounced than that of GRACE-A and F3C leading to a larger observational error of about 1% at 42 km. Above ≈20 km, the observational errors show a strong seasonal dependence at high latitudes. Larger errors occur in hemispheric wintertime and are associated mainly with background data used in the retrieval process particularly under conditions when ionospheric residual is large. The comparison between UCAR and WEGC results (both data centers have independent inversion processing chains reveals different magnitudes of observational errors in atmospheric parameters, which are attributable to different background fields used. Based on the empirical error estimates, we provide a simple analytical error model for GPS RO atmospheric parameters for the altitude range of 4 km to 35 km and up to 50 km for UCAR raw bending angle and refractivity. In the model, which accounts for vertical, latitudinal, and seasonal variations, a constant error is adopted around the tropopause region amounting to 0.8% for bending angle, 0.35% for refractivity, 0.15% for dry pressure, 10 m for dry geopotential height, and 0.7 K for dry temperature. Below this region the observational error increases following an inverse height power-law and above it increases
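
    The piecewise structure of the analytical error model can be written compactly; the sketch below uses our own notation (the record's description of the functional form above the constant region is truncated, so only the stated parts are shown):

```latex
% s(z): relative observational error at altitude z; s0: the constant error
% adopted around the tropopause (e.g., 0.35% for refractivity); zTP, zS:
% lower and upper bounds of that region; b > 0: a power-law exponent.
% Notation is ours, introduced only to summarize the model described above.
\[
s(z) =
\begin{cases}
  s_0 \,\bigl(z_{\mathrm{TP}}/z\bigr)^{b}, & z < z_{\mathrm{TP}}
    \quad \text{(inverse height power-law below the region)},\\[0.5ex]
  s_0, & z_{\mathrm{TP}} \le z \le z_{\mathrm{S}}
    \quad \text{(constant around the tropopause)}.
\end{cases}
\]
```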

  17. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)

    Science.gov (United States)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.

    2013-12-01

    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  18. EMPIRICAL ANALYSIS OF CRISIS MANAGEMENT PRACTICES IN TOURISM ENTERPRISES IN TERMS OF ORGANIZATIONAL LEARNING

    Directory of Open Access Journals (Sweden)

    Gülsel ÇİFTÇİ

    2017-04-01

    Full Text Available In this study, it is aimed to empirically analyze the crisis management practices of tourism enterprises in terms of their effects on organizational learning. It is aimed to determine the ways in which hotels in Turkey responded to previous crises, what kinds of precautions were taken, and whether these crises taught anything regarding the operation and performance of the enterprise. It is also aimed to contribute to the related literature and to offer suggestions that will guide businesses and future studies. Within this context, taking account of the 2016 (October) data of the Ministry of Culture and Tourism of Turkey, Antalya, which has the most certified 5-star hotels and the highest bed capacity and number of rooms in the resort category, and Istanbul, which has the most certified 5-star hotels and the highest bed capacity and number of rooms in the urban hotels category, are included within the scope of this study. It was decided to conduct this study on hotels in view of the effects of the tourism industry on the world economy. A comprehensive literature review was conducted in the first and second parts of this three-part study; the concept of crisis management in enterprises was examined and applications in tourism enterprises were discussed. The last part of the study contains information on testing and analyzing the hypotheses. The data obtained from the questionnaires were analyzed in SPSS (Statistical Package for the Social Sciences) and LISREL (LInear Structural RELationships). A Pearson correlation analysis was conducted to examine the relationship between

  19. Essays in economics of energy efficiency in residential buildings - An empirical analysis [Dissertation 17157]

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, M.

    2007-07-01

    Energy efficiency in the building sector is a key element of cost-effective climate change and energy policies in most countries throughout the world. (...) However, a gap between the cost-effectiveness of energy efficiency measures, their benefits, the necessities from a societal point of view on the one hand and the actual investments in the building stock - particularly in the moment of re-investment and refurbishing - on the other hand became more and more evident. The research questions that arose against this background were whether this gap and the low energy efficiency levels and rates could be confirmed empirically and if yes, how the gap could be explained and how it could be overcome by adequate policy measures. To address these questions, the multi-functional character of buildings (i.e. well conditioned and quiet living rooms and working space) had to be considered. Associated benefits arise on the societal level (ancillary benefits) and on the private level (co-benefits), the latter being increasingly addressed by different building labels such as 'Minergie', 'Passive House', and others. It was assumed that these co-benefits are of economic relevance, but empirical evidence regarding their economic value was missing. Thus, putting these benefits into an appropriate economic appraisal framework was at stake to make use of them in market information and policy instruments, preventing uninformed and biased cost benefit analyses and decisions on the private and on the societal level. The research presented in this PhD thesis had the goal to provide a sound empirical basis about costs and benefits of energy efficiency investments in residential buildings, with a special emphasis on the economic valuation of their co-benefits from a building user perspective (owner-occupiers, purchasers and tenants). In view of long time-horizons in the building sector, the techno-economic dynamics should also be addressed. The results should be useful

  20. Incentives at the counter: An empirical analysis of surcharging card payment and payment behaviour in the Netherlands

    OpenAIRE

    Wilko Bolt; Nicole Jonker; Corry van Renselaar

    2008-01-01

    In card payment systems, no-surcharge rules prohibit merchants from charging consumers extra for card payments. However, Dutch retailers are allowed to surcharge consumers for their debit card use. This allows an empirical analysis of the impact of surcharging on the demand for debit card services, and the effect of removing the no-surcharge rule on card acceptance by retailers and on consumer payment choice. Based on consumer and retailer survey data, our analysis shows that surcharging stee...

  1. Nonparametric Bayesian drift estimation for multidimensional stochastic differential equations

    NARCIS (Netherlands)

    Gugushvili, S.; Spreij, P.

    2014-01-01

    We consider nonparametric Bayesian estimation of the drift coefficient of a multidimensional stochastic differential equation from discrete-time observations on the solution of this equation. Under suitable regularity conditions, we establish posterior consistency in this context.

  2. A non-parametric approach to investigating fish population dynamics

    National Research Council Canada - National Science Library

    Cook, R.M; Fryer, R.J

    2001-01-01

    .... Using a non-parametric model for the stock-recruitment relationship it is possible to avoid defining specific functions relating recruitment to stock size while also providing a natural framework to model process error...

  3. Non-parametric approach to the study of phenotypic stability.

    Science.gov (United States)

    Ferreira, D F; Fernandes, S B; Bruzi, A T; Ramalho, M A P

    2016-02-19

    The aim of this study was to undertake the theoretical derivation of non-parametric methods, which use linear regressions based on rank order, for stability analyses. These methods are extensions of different parametric methods used for stability analyses, and the results were compared with a standard non-parametric method. Intensive computational methods (e.g., bootstrap and permutation) were applied, and data from the plant-breeding program of the Biology Department of UFLA (Minas Gerais, Brazil) were used to illustrate and compare the tests. The non-parametric stability methods were effective for the evaluation of phenotypic stability. In the presence of variance heterogeneity, the non-parametric methods exhibited greater power of discrimination when determining the phenotypic stability of genotypes.

  4. Cycling empirical antibiotic therapy in hospitals: meta-analysis and models.

    Directory of Open Access Journals (Sweden)

    Pia Abel zur Wiesch

    2014-06-01

    Full Text Available The rise of resistance together with the shortage of new broad-spectrum antibiotics underlines the urgency of optimizing the use of available drugs to minimize disease burden. Theoretical studies suggest that coordinating empirical usage of antibiotics in a hospital ward can contain the spread of resistance. However, theoretical and clinical studies came to different conclusions regarding the usefulness of rotating first-line therapy (cycling. Here, we performed a quantitative pathogen-specific meta-analysis of clinical studies comparing cycling to standard practice. We searched PubMed and Google Scholar and identified 46 clinical studies addressing the effect of cycling on nosocomial infections, of which 11 met our selection criteria. We employed a method for multivariate meta-analysis using incidence rates as endpoints and find that cycling reduced the incidence rate/1000 patient days of both total infections by 4.95 [9.43-0.48] and resistant infections by 7.2 [14.00-0.44]. This positive effect was observed in most pathogens despite a large variance between individual species. Our findings remain robust in uni- and multivariate metaregressions. We used theoretical models that reflect various infections and hospital settings to compare cycling to random assignment to different drugs (mixing. We make the realistic assumption that therapy is changed when first line treatment is ineffective, which we call "adjustable cycling/mixing". In concordance with earlier theoretical studies, we find that in strict regimens, cycling is detrimental. However, in adjustable regimens single resistance is suppressed and cycling is successful in most settings. Both a meta-regression and our theoretical model indicate that "adjustable cycling" is especially useful to suppress emergence of multiple resistance. While our model predicts that cycling periods of one month perform well, we expect that too long cycling periods are detrimental. Our results suggest that

  5. Risk and protective factors of internet addiction: a meta-analysis of empirical studies in Korea.

    Science.gov (United States)

    Koo, Hoon Jung; Kwon, Jung-Hye

    2014-11-01

    A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and pathological, problematic, and excessive Internet use. Only original research papers using Korean samples published from 1999 to 2012 and officially reviewed by peers were included for analysis. Ninety-five studies meeting the inclusion criteria were identified. The magnitude of the overall effect size of the intrapersonal variables associated with Internet addiction was significantly higher than that of interpersonal variables. Specifically, IA demonstrated a medium to strong association with "escape from self" and "self-identity" as self-related variables. "Attention problem", "self-control", and "emotional regulation" as control- and regulation-related variables; "addiction and absorption traits" as temperament variables; "anger" and "aggression" as emotion and mood variables; and "negative stress coping" as a coping variable were also associated with comparably larger effect sizes. Contrary to our expectation, the magnitudes of the correlations between relational ability and quality, parental relationships and family functionality, and IA were found to be small. The strength of the association between IA and the risk and protective factors was found to be higher in younger age groups. The findings highlight a need for closer examination of psychosocial factors, especially intrapersonal variables, when assessing high-risk individuals and designing intervention strategies for both general IA and Internet game addiction.

  6. Credit Market Development and Economic Growth an Empirical Analysis for United Kingdom

    Directory of Open Access Journals (Sweden)

    Athanasios Vazakidis

    2011-01-01

    Full Text Available Problem statement: This study investigated the causal relationship between credit market development and economic growth for the UK for the period 1975-2007 using a Vector Error Correction Model (VECM). Questions were raised as to whether economic growth spurs credit market development, taking into account the negative effect of the inflation rate on credit market development. This study aimed to investigate the short-run and the long-run relationship between bank lending, gross domestic product and the inflation rate, applying Johansen cointegration analysis. Approach: To achieve this objective, classical and panel unit root tests were carried out for all time series data in their levels and their first differences. Johansen cointegration analysis was applied to examine whether the variables are cointegrated of the same order, taking into account the maximum eigenvalue and trace statistics tests. Finally, a vector error correction model was selected to investigate the long-run relationship between economic growth and credit market development. Results: A short-run increase of economic growth per 1% induces an increase of bank lending of 0.006%, while an increase of the inflation rate per 1% induces a relative decrease of bank lending of 1.05% in the UK. The estimated coefficient of the error correction term is statistically significant and has a negative sign, which confirms that there is no problem in the long-run equilibrium between the examined variables. Conclusion: The empirical results indicated that there is a unidirectional causal relationship between economic growth and credit market development, with direction from economic growth to credit market development, and a bilateral causality between inflation and credit market development for the United Kingdom. Bank development is determined by the size of bank lending directed to the private sector at times of low inflation rates, leading to higher economic growth rates.
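
    A minimal sketch of the Johansen-plus-VECM workflow described above, using statsmodels on toy data; the series names, toy values and deterministic-term choice are our assumptions, not taken from the paper:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Toy stand-ins for log bank lending, log GDP and inflation, 1975-2007.
rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(0.02, 0.05, 33))
data = pd.DataFrame({
    "LEND": trend + rng.normal(0, 0.02, 33),
    "GDP": trend + rng.normal(0, 0.02, 33),
    "INF": rng.normal(5, 1, 33),
})

# Johansen test: compare trace / max-eigenvalue statistics to critical values.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print(jres.lr1, jres.cvt)    # trace statistics vs. 90/95/99% critical values
print(jres.lr2, jres.cvm)    # max-eigenvalue statistics vs. critical values

# VECM with one cointegrating relation; a negative, significant loading
# (alpha) on the error-correction term indicates a stable long-run equilibrium.
res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(res.alpha)
```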

  7. Self-image and Missions of Universities: An Empirical Analysis of Japanese University Executives

    Directory of Open Access Journals (Sweden)

    Masataka Murasawa

    2014-05-01

    Full Text Available As universities in Japan gain institutional autonomy in managing internal organizations, independent of governmental control as a result of deregulation and decentralizing reforms, it is becoming increasingly important that the executives and administrators of each institution demonstrate clear and strategic vision and ideas to external stakeholders, in order to maintain financially robust operations and the attractiveness of their institutions. This paper considers whether and how the self-image, mission, and vision of universities are perceived and internalized by the management of Japanese universities, and empirically examines the determinants shaping such individual perceptions. The results of our descriptive analysis indicate that the recent government policy to internationalize domestic universities has not shown much progress in the view of university executives in Japan. Instead, these universities increasingly emphasize the roles of serving local needs in research and teaching. Individual perceptions among Japanese university executives with regard to the missions and functional roles to be played by their institutions are influenced by managerial rank as well as the field of their academic training. A multiple regression analysis reveals that the economy of scale brought about by an expanded undergraduate student enrollment gradually slows down, decelerating executive perceptions with regard to establishing a globally recognized status in research and teaching. Moreover, Japanese universities with a small proportion of graduate student enrollment likely opted out of competition for gaining greater respect in the global community of higher education between 2005 and 2012. Finally, the management in universities granted the same amount of external research funds in both studied years responded more passively in 2012 than they did in 2005 on the self-assessment of whether they had established a status as a global

  8. Liquidity Risk Management: An Empirical Analysis on Panel Data Analysis and ISE Banking Sector

    Directory of Open Access Journals (Sweden)

    Sibel ÇELİK

    2012-06-01

    Full Text Available In this paper, we test the factors affecting liquidity risk management in the banking sector in Turkey by using panel regression analysis. We use data for 9 commercial banks traded on the Istanbul Stock Exchange for the period 1998-2008. In conclusion, we find that the risky liquid assets and return on equity variables are negatively related to liquidity risk, while the external financing and return on assets variables are positively related to liquidity risk. This finding is important for banks since it underlines the critical factors in liquidity risk management.
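
    A minimal sketch of the kind of fixed-effects panel regression described above, using the linearmodels package on toy data; the variable names are hypothetical stand-ins for the paper's measures:

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Toy balanced panel: 9 banks x 11 years (1998-2008), hypothetical variables.
rng = np.random.default_rng(0)
idx = pd.MultiIndex.from_product([range(9), range(1998, 2009)],
                                 names=["bank", "year"])
df = pd.DataFrame(rng.normal(size=(99, 5)), index=idx,
                  columns=["liq_risk", "risky_liquid", "roe", "roa", "ext_fin"])

# Fixed-effects panel regression with entity-clustered standard errors.
model = PanelOLS.from_formula(
    "liq_risk ~ 1 + risky_liquid + roe + roa + ext_fin + EntityEffects",
    data=df)
res = model.fit(cov_type="clustered", cluster_entity=True)
print(res)
```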

  9. Nonparametric Bayesian Modeling for Automated Database Schema Matching

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Laska, Jason A [ORNL

    2015-01-01

    The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.
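
    A simplified sketch of the comparison step described above: score a candidate field pair by the Bayes factor for "one model generated both fields" against "separate models". A Dirichlet-multinomial stands in here for the paper's nonparametric field models, so this illustrates the scoring idea only:

```python
import numpy as np
from scipy.special import gammaln

def log_dm(counts, alpha=1.0):
    """Log marginal likelihood of a count vector under a symmetric Dirichlet."""
    counts = np.asarray(counts, dtype=float)
    k, n = counts.size, counts.sum()
    return (gammaln(k * alpha) - gammaln(k * alpha + n)
            + np.sum(gammaln(counts + alpha) - gammaln(alpha)))

def match_score(c1, c2):
    """Log Bayes factor: single shared model vs. independent field models."""
    return log_dm(np.add(c1, c2)) - (log_dm(c1) + log_dm(c2))

# Toy value histograms over a shared vocabulary for two candidate fields.
print(match_score([40, 5, 5], [38, 7, 5]))   # similar fields: higher score
print(match_score([40, 5, 5], [2, 3, 45]))   # dissimilar fields: lower score
```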

  10. PV power forecast using a nonparametric PV model

    OpenAIRE

    Almeida, Marcelo Pinho; Perpiñan Lamigueiro, Oscar; Narvarte Fernández, Luis

    2015-01-01

    Forecasting the AC power output of a PV plant accurately is important both for plant owners and electric system operators. Two main categories of PV modeling are available: the parametric and the nonparametric. In this paper, a methodology using a nonparametric PV model is proposed, using as inputs several forecasts of meteorological variables from a Numerical Weather Forecast model, and actual AC power measurements of PV plants. The methodology was built upon the R environment and uses Quant...

  11. Empirical Analysis of the CO2 Emissions Embodied in Exports of China

    Institute of Scientific and Technical Information of China (English)

    Zhu Qirong

    2011-01-01

    In this paper, using the input-output model, the author first calculates the CO2 emissions embodied in the exports of China in 2002 and 2007. Then, the author empirically analyzes problems existing in the composition of exported products and their possible causes. The research results of this paper are as follows: since China's entry into the WTO, the CO2 emissions embodied in China's exports have been increasing rapidly; the value of exported products from high-carbon-emissions industries accounts for a relatively high proportion of China's total export value because China's carbon-intensive products have a certain competitive advantage. Additionally, this paper puts forward relevant suggestions based on these results.
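
    A minimal sketch of the standard input-output accounting behind such estimates: emissions embodied in exports equal sectoral emission intensities times the Leontief inverse times the export vector. All numbers below are toy values, not the paper's data:

```python
import numpy as np

A = np.array([[0.2, 0.3],      # technical coefficients (inter-industry flows)
              [0.1, 0.4]])
f = np.array([1.5, 0.8])       # CO2 intensity per unit of sectoral output
e = np.array([100.0, 50.0])    # sectoral export vector

L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse (I - A)^-1
embodied = f @ L @ e               # total CO2 embodied in exports
print(embodied)
```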

  12. Storm-coverage effect on dynamic flood-frequency analysis: empirical data analysis

    Science.gov (United States)

    Moon, Jangwon; Kim, Joong-Hoon; Yoo, Chulsang

    2004-01-01

    In this study, a dynamic flood-frequency analysis model considering the storm coverage effect is proposed and applied to six sub-basins in the Pyungchang River basin, Korea. The model proposed is composed of the rectangular pulse Poisson process model for rainfall, the Soil Conservation Service curve number method for infiltration and the geomorphoclimatic instantaneous unit hydrograph for runoff estimation. Also, the model developed by Marco and Valdes is adopted for quantifying the storm-coverage characteristics. By comparing the results from the same model with and without the storm-coverage effect consideration, we could quantify the storm-coverage effect on the flood-frequency analysis. As a result of that, we found the storm-coverage effect was so significant that overestimation of the design flood was unavoidable without its consideration. This also becomes more serious for larger basins where the probability of complete storm coverage is quite low. However, for smaller basins, the limited number of rain gauges is found to hamper the proper quantification of the storm-coverage characteristics. Provided with a relationship curve between the basin size and the storm coverage (as in this study), this problem could be overcome with an acceptable accuracy level.

  13. The Revealed Preference Approach to Collective Consumption Behavior : Testing, Recovery, and Welfare Analysis (Replaced by DP 2009-68)

    NARCIS (Netherlands)

    Cherchye, L.J.H.; de Rock, B.; Vermeulen, F.M.P.

    2007-01-01

    We extend the nonparametric 'revealed preference' methodology for analyzing collective consumption behavior (with consumption externalities and public consumption), to render it useful for empirical applications that deal with welfare-related questions. First, we provide a nonparametric necessary

  14. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  15. An empirical analysis of ERP adoption by oil and gas firms

    Science.gov (United States)

    Romero, Jorge

    2005-07-01

    Despite the growing popularity of enterprise-resource-planning (ERP) systems for the information technology infrastructure of large and medium-sized businesses, there is limited empirical evidence on the competitive benefits of ERP implementations. Case studies of individual firms provide insights but do not provide sufficient evidence to draw reliable inferences and cross-sectional studies of firms in multiple industries provide a broad-brush perspective of the performance effects associated with ERP installations. To narrow the focus to a specific competitive arena, I analyze the impact of ERP adoption on various dimensions of performance for firms in the Oil and Gas Industry. I selected the Oil and Gas Industry because several companies installed a specific type of ERP system, SAP R/3, during the period from 1990 to 2002. In fact, SAP was the dominant provider of enterprise software to oil and gas companies during this period. I evaluate performance of firms that implemented SAP R/3 relative to firms that did not adopt ERP systems in the pre-implementation, implementation and post-implementation periods. My analysis takes two different approaches, the first from a financial perspective and the second from a strategic perspective. Using the Sloan (General Motors) model commonly applied in financial statement analysis, I examine changes in performance for ERP-adopting firms versus non-adopting firms along the dimensions of asset utilization and return on sales. Asset utilization is more closely aligned with changes in leanness of operations, and return on sales is more closely aligned with customer-value-added. I test hypotheses related to the timing and magnitude of the impact of ERP implementation with respect to leanness of operations and customer value added. I find that SAP-adopting companies performed relatively better in terms of asset turnover than non-SAP-adopting companies during both the implementation and post-implementation periods and that SAP

  16. Voluntary Participation in Community Economic Development in Canada: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Laura Lamb

    2011-01-01

    Full Text Available This article is an empirical analysis of an individual's decision to participate in community economic development (CED) initiatives in Canada. The objective of the analysis is to better understand how individuals make decisions to volunteer time toward CED initiatives and to determine whether the determinants of participation in CED are unique when compared to those of participation in volunteer activities in general. The dataset employed is Statistics Canada's 2004 Canada Survey of Giving, Volunteering and Participating (CSGVP). To date, there has been no prior econometric analysis of the decision to participate in community economic development initiatives in Canada. Results suggest a role for both public policymakers and practitioners in influencing participation in CED.

  17. Analysis of surface soil moisture patterns in agricultural landscapes using Empirical Orthogonal Functions

    Directory of Open Access Journals (Sweden)

    W. Korres

    2010-05-01

    Full Text Available Soil moisture is one of the fundamental variables in hydrology, meteorology and agriculture. Nevertheless, its spatio-temporal patterns in agriculturally used landscapes that are affected by multiple natural (rainfall, soil, topography etc. and agronomic (fertilisation, soil management etc. factors are often not well known. The aim of this study is to determine the dominant factors governing the spatio-temporal patterns of surface soil moisture in a grassland and an arable test site that are located within the Rur catchment in Western Germany. Surface soil moisture (0–6 cm was measured in an approx. 50×50 m grid during 14 and 17 measurement campaigns (May 2007 to November 2008 in both test sites. To analyse the spatio-temporal patterns of surface soil moisture, an Empirical Orthogonal Function (EOF analysis was applied and the results were correlated with parameters derived from topography, soil, vegetation and land management to link the patterns to related factors and processes. For the grassland test site, the analysis resulted in one significant spatial structure (first EOF, which explained 57.5% of the spatial variability connected to soil properties and topography. The statistical weight of the first spatial EOF is stronger on wet days. The highest temporal variability can be found in locations with a high percentage of soil organic carbon (SOC. For the arable test site, the analysis resulted in two significant spatial structures, the first EOF, which explained 38.4% of the spatial variability, and showed a highly significant correlation to soil properties, namely soil texture and soil stone content. The second EOF, which explained 28.3% of the spatial variability, is linked to differences in land management. The soil moisture in the arable test site varied more strongly during dry and wet periods at locations with low porosity. The method applied is capable of identifying the dominant parameters controlling spatio-temporal patterns of
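
    A minimal sketch of the EOF decomposition used above, via the singular value decomposition of the space-time anomaly matrix; the grid size and data are toy values:

```python
import numpy as np

sm = np.random.rand(17, 2500)              # toy data: 17 campaigns, 50x50 grid
anom = sm - sm.mean(axis=0)                # remove the temporal mean per location

U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt                                  # rows: spatial patterns (EOF1, EOF2, ...)
pcs = U * s                                # columns: temporal expansion coefficients
explained = s**2 / np.sum(s**2)            # fraction of variance per EOF
print(explained[:3])

# Significant EOFs (e.g., EOF1 here) can then be correlated with soil,
# topography and management covariates, as done in the study.
```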

  18. Capital structure and value firm: an empirical analysis of abnormal returns

    Directory of Open Access Journals (Sweden)

    Faris Nasif AL-SHUBIRI

    2010-12-01

    Full Text Available This study investigates whether capital structure is value relevant for the equity investor. In this sense, the paper links empirical corporate finance issues with investment analysis. This study also integrates the Miller-Modigliani (MM) framework (1958) into an investment approach by estimating abnormal returns on leverage portfolios in the time series for different risk classes. For most risk classes, abnormal returns decline in firm leverage. Descriptive statistics and simple and multiple regressions are used to test the significance of the designed indicators. The results suggest that the negative relationship between returns and leverage could also be due to the market's pricing of the firm's ability to raise funds if need be. Further avenues for research in this area include examining the stock return performance of companies based on changes in the leverage of firms relative to their risk classes. It would be particularly noteworthy to examine the rate at which the information content of such changes is incorporated in the share prices of companies, as well as in their long-run returns. This study encompasses all non-financial firms across the five sectors that cover all the various classes of risk. This study investigates neither the determinants of multiple capital structure choices nor changes in capital structures over time. Our main goal is to explore the effect of capital structure on cumulative abnormal returns. This study also examines a firm's cumulative average abnormal returns by measuring leverage at the firm level and at the average level for the firm's industry, and also examines other factors, such as size, the price-earnings ratio, the market-to-book ratio and betas.

  19. Financial incentives and psychiatric services in Australia: an empirical analysis of three policy changes.

    Science.gov (United States)

    Doessel, D P; Scheurer, Roman W; Chant, David C; Whiteford, Harvey

    2007-01-01

    Australia has a national, compulsory and universal health insurance scheme, called Medicare. In 1996 the Government changed the Medicare Benefit Schedule Book in such a way as to create different financial incentives for consumers or producers of out-of-hospital private psychiatric services, once an individual consumer had received 50 such services in a 12-month period. The Australian Government introduced a new Item (319) to cover some special cases that were affected by the policy change. At the same time, the Commonwealth introduced a 'fee-freeze' for all medical services. The purpose of this study is two-fold. First, it is necessary to describe the three policy interventions (the constraints on utilization, the operation of the new Item and the general 'fee-freeze'). The new Item policy was essentially a mechanism to 'dampen' the effect of the 'constraint' policy, and these two policy changes are consequently analysed as a single intervention. The second objective is to evaluate the policy intervention in terms of the (stated) Australian purpose of reducing utilization of psychiatric services, and thus reducing financial outlays. Thus, it is important to separate out the different effects of the three policies that were introduced at much the same time in November 1996 and January 1997. The econometric results indicate that the composite policy change (constraining services and the new 319 Item) had a statistically significant effect. The analysis of the Medicare Benefit (in constant prices) indicates that the 'fee-freeze' policy also had a statistically significant effect. This enables separate determination of the several policy changes. In fact, the empirical results indicate that the Commonwealth Government underestimated the 'savings' that would arise from the 'constraint' policy.

  20. Analysis of entropies based on empirical mode decomposition in amnesic mild cognitive impairment of diabetes mellitus

    Directory of Open Access Journals (Sweden)

    Dong Cui

    2015-09-01

    Full Text Available EEG characteristics that correlate with cognitive functions are important in detecting mild cognitive impairment (MCI) in T2DM. To investigate the difference in complexity between an aMCI group and an age-matched non-aMCI control group in T2DM, six entropies combined with empirical mode decomposition (EMD), including approximate entropy (ApEn), sample entropy (SaEn), fuzzy entropy (FEn), permutation entropy (PEn), power spectrum entropy (PsEn) and wavelet entropy (WEn), were used in the study. A feature extraction technique based on maximization of the area under the curve (AUC) and a support vector machine (SVM) were subsequently used for feature selection and classification. Finally, Pearson's linear correlation was employed to study associations between these entropies and cognitive functions. Compared to the other entropies, FEn had a higher classification accuracy, sensitivity and specificity of 68%, 67.1% and 71.9%, respectively. The top 43 salient features achieved a classification accuracy, sensitivity and specificity of 73.8%, 72.3% and 77.9%, respectively. P4, T4 and C4 were the highest-ranking salient electrodes. Correlation analysis showed that FEn based on EMD was positively correlated with memory at electrodes F7, F8 and P4, and PsEn based on EMD was positively correlated with the Montreal Cognitive Assessment (MoCA) and memory at electrode T4. In sum, FEn based on EMD in the right temporal and occipital regions may be more suitable for early diagnosis of MCI with T2DM.
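
    A self-contained sketch of one of the entropies listed above, sample entropy (SaEn); in the study the measure would be applied to EMD-derived intrinsic mode functions rather than to the raw EEG:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn of a 1-D series x with embedding m and tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    n = x.size

    def count_matches(dim):
        # Embed the series in `dim` dimensions and count template pairs
        # whose Chebyshev distance is below the tolerance (no self-matches).
        emb = np.array([x[i:i + dim] for i in range(n - dim + 1)])
        count = 0
        for i in range(len(emb) - 1):
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += np.sum(d < tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(1000)))  # higher for irregular series
```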

  1. Empirical Analysis of Virtual Carrier Sense Flooding Attacks Over Wireless Local Area Network

    Directory of Open Access Journals (Sweden)

    Mina Malekzadeh

    2009-01-01

    Full Text Available Problem statement: Wireless Local Area Networks (WLANs) are subject to different types of vulnerabilities. The Denial of Service (DoS) attack is currently the most challenging issue for WLANs. The objectives of the study were to (i) provide an empirical analysis by conducting a series of wireless virtual carrier sense DoS attacks that exploit wireless control frame vulnerabilities, (ii) design a testbed to compare and analyze the damage these attacks can impose on wireless networks, and (iii) evaluate the effectiveness of such attacks on WLAN performance in terms of data transmission rate. Approach: The testbed employed an Ubuntu distribution along with a network analyzer, an Atheros chipset, and frame injection into the tested WLAN. All experiments were conducted in two phases: targeting the wireless access point and targeting the wireless client. Each phase presents the results of experiments under three circumstances: before, during, and after the attacks. Results: Even when virtual carrier sense communication was disabled in the tested WLAN, the target nodes still answered the forged frames, which made the attacks easier. Attacks on the wireless clients were more effective than those on the access point. In VCS-RTS-C the data transmission rate decreased from 3547.384 to 9.185 B sec⁻¹; in VCS-CTS-C it decreased from 4959.887 to 44.740 B sec⁻¹; and in VCS-ACK-C it decreased from 7057.401 to 136.96 B sec⁻¹. The obtained results demonstrated that during the attacks the target clients were completely disconnected from the wireless network and unable to carry out any communication. Conclusion: The influence of wireless virtual carrier sense attacks on the performance of the wireless network was analyzed. The data transmission rate of the tested WLAN under attack was compared with the transmission rate of the WLAN operated under normal conditions. The obtained results confirmed that the attacks could easily overwhelm and shut down the wireless network.

  2. Multiscale Detrended Cross-Correlation Analysis of Traffic Time Series Based on Empirical Mode Decomposition

    Science.gov (United States)

    Yin, Yi; Shang, Pengjian

    2015-04-01

    In this paper, we propose multiscale detrended cross-correlation analysis (MSDCCA) to detect the long-range power-law cross-correlation of considered signals in the presence of nonstationarity. To improve performance and obtain better robustness, we further introduce empirical mode decomposition (EMD) to eliminate noise effects and propose the MSDCCA method combined with EMD, called the MS-EDXA method, and then systematically investigate the multiscale cross-correlation structure of real traffic signals. We apply the MSDCCA and MS-EDXA methods to study the cross-correlations in three situations: velocity and volume on one lane, velocities at the present and the next moment, and velocities on adjacent lanes, and further compare their spectra respectively. Where the difference between the spectra of MSDCCA and MS-EDXA becomes unremarkable, there is a crossover which denotes the turning point of the difference. The crossover results from the competition between the noise effects in the original signals and the intrinsic fluctuation of the traffic signals, and divides the plot of the spectra into two regions. In all three cases, the MS-EDXA method increases the average of the local scaling exponents, decreases their standard deviation, and provides relatively stable persistent scaling cross-correlated behavior, which makes the analysis more precise and more robust and improves performance after the noise is removed. Applying the MS-EDXA method avoids the inaccurate characterization of the multiscale cross-correlation structure at short scales, including the spectrum minimum, the range of spectrum fluctuation and the general trend, which are caused by the noise in the original signals. We conclude that traffic velocity and volume are long-range cross-correlated, which accords with their actual evolution, while velocities at the present and the next moment and velocities on adjacent lanes reflect strong cross-correlations both in temporal and

  3. Contribution of Rural Women to Family Income Through Participation in Microcredit: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Ferdoushi Ahmed

    2011-01-01

    Full Text Available Problem statement: Rural women in Bangladesh have a lower socio-economic status and very limited access to income-generating activities due to a number of social, cultural and religious barriers. Consequently, they have less opportunity to contribute to their family income. Rural women are economically dependent and vulnerable and socially discriminated against. Microcredit programmes provide loans to poor rural women so that they can undertake small financial and business activities that allow them to generate income. This income-earning opportunity helps rural women contribute to their family income and achieve a level of independence. Approach: In the present study, an attempt has been made to assess the impact of the microcredit programme on rural women's contribution to household income. The study is based on empirical data collected through interviews with two groups of rural women, i.e., with-credit and without-credit rural women. The with-credit respondents represent the rural women who have taken loans from the Grameen Bank's microcredit programme. The results show that the proportion of with-credit rural women who contributed to family income is much higher (19%) than that of without-credit rural women (10%). A multiple regression analysis was conducted to identify the factors influencing the respondents' contribution to total monthly family income. Results: The multiple regression analysis shows strong positive effects of the respondent's age, level of education, family size, number of earning members, occupation and monthly income, while marital status has a strong negative effect. It was found that the majority of with-credit respondents contribute much more to family income than the without-credit respondents. It was also found that with-credit rural women have improved their socio-economic status and income-generating activities by participating

  4. Cycling Empirical Antibiotic Therapy in Hospitals: Meta-Analysis and Models

    Science.gov (United States)

    Abel, Sören; Viechtbauer, Wolfgang; Bonhoeffer, Sebastian

    2014-01-01

    The rise of resistance together with the shortage of new broad-spectrum antibiotics underlines the urgency of optimizing the use of available drugs to minimize disease burden. Theoretical studies suggest that coordinating empirical usage of antibiotics in a hospital ward can contain the spread of resistance. However, theoretical and clinical studies came to different conclusions regarding the usefulness of rotating first-line therapy (cycling). Here, we performed a quantitative pathogen-specific meta-analysis of clinical studies comparing cycling to standard practice. We searched PubMed and Google Scholar and identified 46 clinical studies addressing the effect of cycling on nosocomial infections, of which 11 met our selection criteria. We employed a method for multivariate meta-analysis using incidence rates as endpoints and find that cycling reduced the incidence rate/1000 patient days of both total infections by 4.95 [9.43–0.48] and resistant infections by 7.2 [14.00–0.44]. This positive effect was observed in most pathogens despite a large variance between individual species. Our findings remain robust in uni- and multivariate metaregressions. We used theoretical models that reflect various infections and hospital settings to compare cycling to random assignment to different drugs (mixing). We make the realistic assumption that therapy is changed when first line treatment is ineffective, which we call “adjustable cycling/mixing”. In concordance with earlier theoretical studies, we find that in strict regimens, cycling is detrimental. However, in adjustable regimens single resistance is suppressed and cycling is successful in most settings. Both a meta-regression and our theoretical model indicate that “adjustable cycling” is especially useful to suppress emergence of multiple resistance. While our model predicts that cycling periods of one month perform well, we expect that too long cycling periods are detrimental. Our results suggest that

  5. Empirical Analysis of Value-at-Risk Estimation Methods Using Extreme Value Theory

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper investigates methods of value-at-risk (VaR) estimation using extreme value theory (EVT). It compares two different estimation methods, the "two-step subsample bootstrap" based on moment estimation and maximum likelihood estimation (MLE), according to their theoretical bases and computation procedures. Then, the estimation results are analyzed together with those of the normal method and the empirical method. Empirical research on foreign exchange data shows that the EVT methods have good properties for estimating VaR under extreme conditions, and that the "two-step subsample bootstrap" method is preferable to MLE.
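
    A minimal peaks-over-threshold sketch of EVT-based VaR with the MLE variant (the subsample-bootstrap estimator discussed above is not reproduced here); the data, threshold and confidence level are toy choices:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
losses = rng.standard_t(df=4, size=5000)       # toy heavy-tailed daily losses

u = np.quantile(losses, 0.95)                  # threshold
exceed = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceed, floc=0)  # MLE of GPD shape and scale

# Standard POT formula: VaR_p = u + (beta/xi) * [((n/Nu) * (1-p))^(-xi) - 1].
p, n, nu = 0.99, losses.size, exceed.size
var_p = u + (beta / xi) * ((n / nu * (1 - p)) ** (-xi) - 1)
print(f"99% VaR: {var_p:.3f}")
```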

  6. Empire Redux

    DEFF Research Database (Denmark)

    Mercau, Ezequiel

    The Falklands War was shrouded in symbolism and permeated with imperial rhetoric, imagery and memories, bringing to the fore divergent conceptions of Britishness, kinship and belonging. The current dispute, moreover, is frequently framed in terms of national identity, and accusations of imperialism...... and neo-colonialism persist. Thus in dealing with the conflict, historians and commentators alike have often made references to imperial legacies, yet this has rarely been afforded much careful analysis. Views on this matter continue to be entrenched, either presenting the war as a throwback to nineteenth...... from the legacies of empire. Taking decolonization as a starting point, this thesis demonstrates how the idea of a ‘British world’ gained a new lease of life vis-à-vis the Falklands, as the Islanders adopted the rhetorical mantle of ‘abandoned Britons’. Yet this new momentum was partial and fractured...

  7. An Empirical Analysis of the Export Competitiveness of Agricultural Products in Hubei Province Based on Inter-provincial Comparison

    Institute of Scientific and Technical Information of China (English)

    Ling WANG

    2016-01-01

    With the six provinces of Central China and China's six major agricultural-product-exporting provinces as reference objects, this paper uses the revealed comparative advantage index and the export growth advantage index to perform an empirical analysis and comparison of the export competitiveness of agricultural products in Hubei Province, and finally makes corresponding policy recommendations for enhancing the export competitiveness of agricultural products in Hubei Province.
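
    A minimal sketch of the revealed comparative advantage (RCA) index named above: a region's export share of a product relative to the reference group's share, with RCA > 1 read as a comparative advantage. The figures are hypothetical, not the paper's data:

```python
def rca(x_ij, x_i_total, x_nj, x_n_total):
    """RCA = (x_ij / x_i_total) / (x_nj / x_n_total); > 1 means advantage."""
    return (x_ij / x_i_total) / (x_nj / x_n_total)

# Hypothetical: Hubei agricultural exports vs. national totals (USD bn).
print(rca(x_ij=2.1, x_i_total=35.0, x_nj=80.0, x_n_total=2300.0))  # ~1.73
```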

  8. The IT Impact on the Productivity and the Organizational Performance of Firms in Romania. A model of Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Cristina ENACHE

    2011-11-01

    Full Text Available The paper proposes an analysis based on an empirical model of the impact of IT on the performance of firms in Romania. The model, its equations and the results of the statistical processing are presented. All of these show that the impact of ICT on firm performance is greater and positive when the information technologies are accompanied by a proactive management policy and an organizational culture.

  9. On the Data Mining Technology Applied to Active Marketing Model of International Luxury Marketing Strategy in China— An Empirical Analysis

    OpenAIRE

    Qishen Zhou; Shanhui Wang; Zuowei Yin

    2013-01-01

    This paper emphasizes the importance of active marketing in customer relationship management. In particular, data mining technology is applied to establish an active marketing model to empirically analyze the condition of the AH Jewelry Company. Michael Porter's Five Forces Model is employed to assess and calculate the similarity in the active marketing model. Then, a questionnaire analysis of the customer relationship management model is carried out to explain the target market and t...

  10. Empirical Bayes ranking and selection methods via semiparametric hierarchical mixture models in microarray studies.

    Science.gov (United States)

    Noma, Hisashi; Matsui, Shigeyuki

    2013-05-20

    The main purpose of microarray studies is the screening of differentially expressed genes as candidates for further investigation. Because of limited resources at this stage, prioritizing genes is a relevant statistical task in microarray studies. For effective gene selection, parametric empirical Bayes methods for ranking and selection of genes with the largest effect sizes have been proposed (Noma et al., 2010; Biostatistics 11: 281-289). The hierarchical mixture model incorporates differential and non-differential components and allows information borrowing across differential genes with separation from nuisance, non-differential genes. In this article, we develop empirical Bayes ranking methods via a semiparametric hierarchical mixture model. A nonparametric prior distribution, rather than a parametric prior distribution, for effect sizes is specified and estimated using the "smoothing by roughening" approach of Laird and Louis (1991; Computational Statistics and Data Analysis 12: 27-37). We present applications to childhood and infant leukemia clinical studies with microarrays for exploring genes related to prognosis or disease progression.

  11. Nonparametric discriminant model

    Institute of Scientific and Technical Information of China (English)

    谢斌锋; 梁飞豹

    2011-01-01

    This paper puts forward a new class of discriminant method whose main idea is to extend the nonparametric regression model to discriminant analysis, forming a corresponding nonparametric discriminant model. Compared with traditional discriminant methods on a worked example, the nonparametric discriminant method shows wider applicability and a higher correct back-substitution rate.
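
    A minimal sketch of the general idea, under the assumption that "nonparametric regression applied to discriminant analysis" means kernel-regressing each class indicator and assigning a new point to the class with the largest smoothed value (the paper's exact model may differ):

      import numpy as np

      def np_discriminant(X_train, y_train, X_new, h=1.0):
          # Nadaraya-Watson regression of each class indicator; classify a
          # new point by the class whose smoothed indicator is largest.
          X_train, X_new = np.atleast_2d(X_train), np.atleast_2d(X_new)
          classes = np.unique(y_train)
          d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
          W = np.exp(-0.5 * d2 / h ** 2)              # Gaussian kernel weights
          scores = np.stack([(W * (y_train == c)).sum(1) / W.sum(1)
                             for c in classes], axis=1)
          return classes[scores.argmax(axis=1)]

    The back-substitution rate the abstract mentions would then be the accuracy of np_discriminant(X, y, X, h) against y.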

  12. Floating Car Data Based Nonparametric Regression Model for Short-Term Travel Speed Prediction

    Institute of Scientific and Technical Information of China (English)

    WENG Jian-cheng; HU Zhong-wei; YU Quan; REN Fu-tian

    2007-01-01

    A K-nearest neighbor (K-NN) based nonparametric regression model was proposed to predict travel speed on Beijing expressways. Using historical traffic data collected from detectors on the Beijing expressways, a specially designed database was developed through data filtering, wavelet analysis and clustering. A relativity-based weighted Euclidean distance was used as the distance metric to identify the K nearest data series. A K-NN nonparametric regression model was then built to predict average travel speeds up to 6 min into the future. Several randomly selected travel speed data series, collected from the floating car data (FCD) system, were used to validate the model. The results indicate that, using the FCD, the model can predict average travel speeds with an accuracy above 90%, and hence is feasible and effective.
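
    A minimal sketch of the K-NN forecasting step (the database construction, wavelet filtering and the exact relativity-based weighting are beyond the abstract, so the names and the inverse-distance averaging below are illustrative assumptions):

      import numpy as np

      def knn_speed_forecast(X_hist, y_hist, current, k=10, feat_w=None):
          # X_hist: (n, m) historical speed sequences; y_hist: (n,) speed that
          # followed each sequence; current: (m,) the latest observed sequence.
          w = np.ones(X_hist.shape[1]) if feat_w is None else feat_w
          d = np.sqrt((((X_hist - current) ** 2) * w).sum(axis=1))  # weighted Euclidean
          nn = np.argsort(d)[:k]                                    # K nearest series
          inv = 1.0 / (d[nn] + 1e-9)                                # nearer = heavier
          return float((inv * y_hist[nn]).sum() / inv.sum())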

  13. Bayesian nonparametric clustering in phylogenetics: modeling antigenic evolution in influenza.

    Science.gov (United States)

    Cybis, Gabriela B; Sinsheimer, Janet S; Bedford, Trevor; Rambaut, Andrew; Lemey, Philippe; Suchard, Marc A

    2017-01-18

    Influenza is responsible for up to 500,000 deaths every year, and antigenic variability represents much of its epidemiological burden. To visualize antigenic differences across many viral strains, antigenic cartography methods use multidimensional scaling on binding assay data to map influenza antigenicity onto a low-dimensional space. Analysis of such assay data ideally leads to natural clustering of influenza strains of similar antigenicity that correlate with sequence evolution. To understand the dynamics of these antigenic groups, we present a framework that jointly models genetic and antigenic evolution by combining multidimensional scaling of binding assay data, Bayesian phylogenetic machinery and nonparametric clustering methods. We propose a phylogenetic Chinese restaurant process that extends the current process to incorporate the phylogenetic dependency structure between strains in the modeling of antigenic clusters. With this method, we are able to use the genetic information to better understand the evolution of antigenicity throughout epidemics, as shown in applications of this model to H1N1 influenza. Copyright © 2017 John Wiley & Sons, Ltd.
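
    For reference, the plain Chinese restaurant process that the paper extends (the phylogenetic dependency structure itself is not reproduced here) can be simulated in a few lines; names are illustrative:

      import numpy as np

      def crp_partition(n, alpha, rng=None):
          # Seat n items at tables under a Chinese restaurant process with
          # concentration alpha: an item joins an existing cluster with
          # probability proportional to its size, or opens a new cluster
          # with probability proportional to alpha.
          rng = rng or np.random.default_rng()
          sizes, labels = [], []
          for i in range(n):
              probs = np.array(sizes + [alpha], dtype=float) / (i + alpha)
              k = rng.choice(len(probs), p=probs)
              if k == len(sizes):
                  sizes.append(1)          # open a new cluster
              else:
                  sizes[k] += 1
              labels.append(k)
          return np.array(labels)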

  14. Nonparametric identification of structural modifications in Laplace domain

    Science.gov (United States)

    Suwała, G.; Jankowski, Ł.

    2017-02-01

    This paper proposes and experimentally verifies a Laplace-domain method for the identification of structural modifications which, (1) unlike time-domain formulations, allows the identification to focus on those parts of the frequency spectrum that have a high signal-to-noise ratio, and (2) unlike frequency-domain formulations, decreases the influence of numerical artifacts related to the particular choice of the FFT exponential window decay. In comparison with the time-domain approach proposed earlier, the advantages of the method are a smaller computational cost and higher accuracy, which leads to reliable performance in more difficult identification cases. Analytical formulas for first- and second-order sensitivity analysis are derived. The approach is based on a reduced nonparametric model in the form of a set of selected structural impulse responses. Such a model can be collected purely experimentally, which obviates the need to design and laboriously update a parametric model, such as a finite element model. The approach is verified experimentally using a 26-node lab 3D truss structure and 30 identification cases of a single mass modification or two concurrent mass modifications.
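
    The core Laplace-domain ingredient can be sketched directly: evaluating the transform of a measured impulse response on a vertical line s = σ + iω, where the analyst's choice of σ plays the role that the FFT exponential window decay plays in frequency-domain formulations. A sketch under those assumptions, not the authors' code:

      import numpy as np

      def laplace_response(h, dt, sigma, omegas):
          # One-sided Laplace transform of a sampled impulse response h[n]:
          # H(s) ~ dt * sum_n h[n] * exp(-s * n * dt), evaluated at
          # s = sigma + i*omega for each requested omega.
          n = np.arange(h.size)
          s = sigma + 1j * np.asarray(omegas)[:, None]
          return dt * (h[None, :] * np.exp(-s * (n * dt)[None, :])).sum(axis=1)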

  15. Adaptive Neural Network Nonparametric Identifier With Normalized Learning Laws.

    Science.gov (United States)

    Chairez, Isaac

    2016-04-05

    This paper addresses the design of a normalized convergent learning law for neural networks (NNs) with continuous dynamics. The NN is used here to obtain a nonparametric model for uncertain systems described by a set of ordinary differential equations. The sources of uncertainty are external perturbations and poor knowledge of the nonlinear function describing the system dynamics. A new adaptive algorithm based on normalized algorithms was used to adjust the weights of the NN. The adaptive algorithm was derived by means of a nonstandard logarithmic Lyapunov function (LLF). Two identifiers were designed using two variations of LLFs, leading to a normalized learning law for the first identifier and a variable-gain normalized learning law for the second. In the case of the second identifier, the inclusion of normalized learning laws reduces the size of the convergence region obtained as the solution of the practical stability analysis. Moreover, the convergence speed of the learning laws depends inversely on the norm of the errors; this avoids peaking transients in the time evolution of the weights and accelerates the convergence of the identification error. A numerical example demonstrates the improvements achieved by the proposed algorithm compared with classical schemes using non-normalized continuous learning methods. A comparison of the identification performance achieved by the non-normalized identifier and the identifiers developed in this paper shows the benefits of the proposed learning law.
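
    The flavor of a normalized learning law can be conveyed with a generic normalized-LMS-style update, in which dividing by 1 + ||φ||² bounds each weight correction regardless of the size of the regressor (a toy stand-in, not the paper's Lyapunov-derived law):

      import numpy as np

      def normalized_step(w, phi, e, eta=0.5):
          # Normalized weight update: the correction is divided by
          # 1 + ||phi||^2, bounding each step and avoiding peaking in the
          # weight trajectories when the regressor phi is large.
          return w + eta * e * phi / (1.0 + phi @ phi)

      # Toy identification of y = w_true . phi using the normalized law
      rng = np.random.default_rng(1)
      w_true = np.array([2.0, -1.0, 0.5])
      w = np.zeros(3)
      for _ in range(2000):
          phi = rng.normal(size=3)          # regressor (e.g., NN basis output)
          e = (w_true - w) @ phi            # identification error
          w = normalized_step(w, phi, e)
      print(np.round(w, 3))                 # close to w_true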

  16. Essays in economics of energy efficiency in residential buildings - An empirical analysis [Dissertation 17157]

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, M.

    2007-07-01

    Energy efficiency in the building sector is a key element of cost-effective climate change and energy policies in most countries throughout the world. (...) However, a gap became more and more evident between the cost-effectiveness of energy efficiency measures, their benefits and the necessities from a societal point of view on the one hand, and the actual investments in the building stock - particularly at the moment of re-investment and refurbishing - on the other. The research questions that arose against this background were whether this gap and the low energy efficiency levels and rates could be confirmed empirically and, if so, how the gap could be explained and how it could be overcome by adequate policy measures. To address these questions, the multi-functional character of buildings (i.e. well-conditioned and quiet living rooms and working space) had to be considered. Associated benefits arise at the societal level (ancillary benefits) and at the private level (co-benefits), the latter being increasingly addressed by building labels such as 'Minergie', 'Passive House', and others. It was assumed that these co-benefits are of economic relevance, but empirical evidence of their economic value was missing. Putting these benefits into an appropriate economic appraisal framework was therefore needed in order to use them in market information and policy instruments, preventing uninformed and biased cost-benefit analyses and decisions at the private and the societal level. The research presented in this PhD thesis had the goal of providing a sound empirical basis on the costs and benefits of energy efficiency investments in residential buildings, with special emphasis on the economic valuation of their co-benefits from a building user perspective (owner-occupiers, purchasers and tenants). In view of the long time horizons in the building sector, the techno-economic dynamics also had to be addressed. The results should be useful

  17. A Non-parametric Approach to Measuring the $K^{-}\pi^{+}$ Amplitudes in $D^{+} \to K^{-}K^{+}\pi^{+}$ Decay

    CERN Document Server

    Link, J M; Alimonti, G; Anjos, J C; Arena, V; Barberis, S; Bediaga, I; Benussi, L; Bianco, S; Boca, G; Bonomi, G; Boschini, M; Butler, J N; Carrillo, S; Casimiro, E; Castromonte, C; Cawlfield, C; Cerutti, A; Cheung, H W K; Chiodini, G; Cho, K; Chung, Y S; Cinquini, L; Cuautle, E; Cumalat, J P; D'Angelo, P; Davenport, T F; De Miranda, J M; Di Corato, M; Dini, P; Dos Reis, A C; Edera, L; Engh, D; Erba, S; Fabbri, F L; Frisullo, V; Gaines, I; Garbincius, P H; Gardner, R; Garren, L A; Gianini, G; Gottschalk, E; Göbel, C; Handler, T; Hernández, H; Hosack, M; Inzani, P; Johns, W E; Kang, J S; Kasper, P H; Kim, D Y; Ko, B R; Kreymer, A E; Kryemadhi, A; Kutschke, R; Kwak, J W; Lee, K B; Leveraro, F; Liguori, G; Lopes-Pegna, D; Luiggi, E; López, A M; Machado, A A; Magnin, J; Malvezzi, S; Massafferri, A; Menasce, D; Merlo, M M; Mezzadri, M; Mitchell, R; Moroni, L; Méndez, H; Nehring, M; O'Reilly, B; Otalora, J; Pantea, D; Paris, A; Park, H; Pedrini, D; Pepe, I M; Polycarpo, E; Pontoglio, C; Prelz, F; Quinones, J; Rahimi, A; Ramírez, J E; Ratti, S P; Reyes, M; Riccardi, C; Rovere, M; Sala, S; Segoni, I; Sheaff, M; Sheldon, P D; Stenson, K; Sánchez-Hernández, A; Uribe, C; Vaandering, E W; Vitulo, P; Vázquez, F; Wang, M; Webster, M; Wilson, J R; Wiss, J; Yager, P M; Zallo, A; Zhang, Y

    2007-01-01

    Using a large sample of $D^{+} \to K^{-}K^{+}\pi^{+}$ decays collected by the FOCUS photoproduction experiment at Fermilab, we present the first non-parametric analysis of the $K^{-}\pi^{+}$ amplitudes in $D^{+} \to K^{-}K^{+}\pi^{+}$ decay. The technique is similar to that used for our non-parametric measurements of the $D^{+} \to \bar{K}^{*0}\mu^{+}\nu$ form factors. Although these results are in rough agreement with those of E687, we observe a wider S-wave contribution for the $K^{*}_{0}(1430)$ than the standard PDG Breit-Wigner parameterization. We have some weaker evidence for the existence of a new, D-wave component at low values of the $K^{-}\pi^{+}$ mass.

  18. Nonparametric Kernel Smoothing Methods. The sm library in Xlisp-Stat

    Directory of Open Access Journals (Sweden)

    Luca Scrucca

    2001-06-01

    Full Text Available In this paper we describe the Xlisp-Stat version of the sm library, software for applying nonparametric kernel smoothing methods. The original version of the sm library was written by Bowman and Azzalini in S-Plus, and it is documented in their book Applied Smoothing Techniques for Data Analysis (1997), which is also the main reference for a complete description of the statistical methods implemented. The sm library provides kernel smoothing methods for obtaining nonparametric estimates of density functions and regression curves for different data structures. Smoothing techniques may be employed as a descriptive graphical tool for exploratory data analysis. They can also serve inferential purposes, for instance when a nonparametric estimate is used to check a proposed parametric model. The Xlisp-Stat version includes some extensions to the original sm library, mainly in the area of local likelihood estimation for generalized linear models. It has been written following an object-oriented approach, which should allow experienced Xlisp-Stat users to easily implement their own methods and new research ideas within the built-in prototypes.
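
    The kind of estimate the sm library's density routines return can be mimicked in a few lines of numpy; a minimal Gaussian kernel density estimate with the bandwidth h left to the user (the sm library's actual bandwidth selectors and data structures are richer):

      import numpy as np

      def kde(x_eval, data, h):
          # Gaussian kernel density estimate at the points x_eval:
          # f_hat(x) = (1 / (n*h)) * sum_i phi((x - data_i) / h)
          u = (np.asarray(x_eval)[:, None] - np.asarray(data)[None, :]) / h
          return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))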

  19. Managing Human Resource Capabilities for Sustainable Competitive Advantage: An Empirical Analysis from Indian Global Organisations

    Science.gov (United States)

    Khandekar, Aradhana; Sharma, Anuradha

    2005-01-01

    Purpose: The purpose of this article is to examine the role of human resource capability (HRC) in organisational performance and sustainable competitive advantage (SCA) in Indian global organisations. Design/Methodology/Approach: To carry out the present study, an empirical research on a random sample of 300 line or human resource managers from…

  20. Empirical analysis of an in-car speed, headway and lane use advisory system

    NARCIS (Netherlands)

    Schakel, W.J.; Van Arem, B.; Van Lint, J.W.C.

    2014-01-01

    For a recently developed in-car speed, headway and lane use advisory system, this paper empirically investigates advice validity (advice given in correct traffic circumstances), credibility (advice logical to drivers) and frequency. The system has been developed to optimize traffic flow by giving