WorldWideScience

Sample records for nonparametric empirical analysis

  1. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  2. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers

    Directory of Open Access Journals (Sweden)

    Stochl Jan

    2012-06-01

Background: Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than, for example, the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Methods: Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12-item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. Results and conclusions: After an initial analysis example in which we select items by phrasing (six positively versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12), when binary scored, were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis

  3. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers.

    Science.gov (United States)

    Stochl, Jan; Jones, Peter B; Croudace, Tim J

    2012-06-11

    Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12 item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. After an initial analysis example in which we select items by phrasing (six positive versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12)--when binary scored--were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis confirmed that all 14 positively worded items of the Warwick-Edinburgh Mental
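The scalability analysis described in this record rests on Loevinger's H coefficient, which compares observed Guttman errors against the number expected under marginal independence of the items. Below is a minimal illustrative sketch for dichotomous items; it is not the authors' implementation (they used dedicated Mokken scaling software), just the core computation under that definition of H:

```python
import itertools

def loevinger_h(data):
    """Loevinger's H scalability coefficient for dichotomous items.

    data: list of response patterns (lists of 0/1), one per subject.
    Returns scale-level H = 1 - (observed Guttman errors) /
    (errors expected under marginal independence of the items).
    """
    n = len(data)
    k = len(data[0])
    # item popularities (proportion endorsing each item)
    p = [sum(row[j] for row in data) / n for j in range(k)]
    obs, exp = 0.0, 0.0
    for i, j in itertools.combinations(range(k), 2):
        # order the pair so item a is the more popular ("easier") one
        a, b = (i, j) if p[i] >= p[j] else (j, i)
        # Guttman error: harder item endorsed while easier item is not
        obs += sum(1 for row in data if row[a] == 0 and row[b] == 1)
        exp += n * (1 - p[a]) * p[b]
    return 1.0 - obs / exp

# A perfect Guttman pattern contains no errors, so H = 1
patterns = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(loevinger_h(patterns))  # 1.0
```

In practice, values of H above roughly 0.3 are conventionally taken as the minimum for a usable Mokken scale; the double monotonicity model mentioned in the abstract imposes additional ordering checks beyond H itself.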

  4. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data to include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet Process Mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.
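The combined CS + AR covariance structure described in this abstract can be sketched directly: a constant between-measurement covariance term plus a GP term that decays exponentially with the time gap. The parameter names below are illustrative placeholders, not the paper's notation:

```python
import math

def cs_ar_cov(times, var_cs, var_ar, length_scale, var_noise):
    """Combined compound-symmetry + AR (exponential GP) covariance matrix.

    Entry (i, j) = var_cs + var_ar * exp(-|t_i - t_j| / length_scale),
    plus var_noise on the diagonal.  Observations closer in time are
    more strongly correlated through the exponentially decaying AR term.
    """
    n = len(times)
    return [[var_cs
             + var_ar * math.exp(-abs(times[i] - times[j]) / length_scale)
             + (var_noise if i == j else 0.0)
             for j in range(n)]
            for i in range(n)]

times = [0.0, 1.0, 3.0]  # hypothetical visit times
cov = cs_ar_cov(times, var_cs=0.5, var_ar=1.0, length_scale=2.0, var_noise=0.1)
```

The paper's contribution is to place a Dirichlet process mixture prior over parameters like `var_ar` and `length_scale` rather than fixing them; the matrix construction itself is the same.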

  5. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  6. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

This thesis seeks to develop methodologies for the assessment of agricultural efficiency and apply them to Lithuanian family farms. In particular, we focus on three objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend...... to the Multi-Directional Efficiency Analysis approach when the proposed models were employed to analyse empirical data on Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input...... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...

  7. Nonparametric analysis of blocked ordered categories data: some examples revisited

    Directory of Open Access Journals (Sweden)

    O. Thas

    2006-08-01

Nonparametric analysis for general block designs can be given by using the Cochran-Mantel-Haenszel (CMH) statistics. We demonstrate this with four examples and note that several well-known nonparametric statistics are special cases of CMH statistics.

  8. A nonparametric empirical Bayes framework for large-scale multiple testing.

    Science.gov (United States)

    Martin, Ryan; Tokdar, Surya T

    2012-07-01

    We propose a flexible and identifiable version of the 2-groups model, motivated by hierarchical Bayes considerations, that features an empirical null and a semiparametric mixture model for the nonnull cases. We use a computationally efficient predictive recursion (PR) marginal likelihood procedure to estimate the model parameters, even the nonparametric mixing distribution. This leads to a nonparametric empirical Bayes testing procedure, which we call PRtest, based on thresholding the estimated local false discovery rates. Simulations and real data examples demonstrate that, compared to existing approaches, PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.
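The final step of the two-groups procedure described here, thresholding estimated local false discovery rates fdr(z) = pi0 * f0(z) / f(z), can be sketched as follows. Note this is only a stand-in: the marginal density f is estimated with a simple Gaussian kernel density estimate rather than the paper's predictive recursion, and pi0 and the bandwidth are illustrative values:

```python
import math

def local_fdr(z, zs, pi0=0.9, bandwidth=0.5):
    """Two-groups local false discovery rate fdr(z) = pi0 * f0(z) / f(z).

    f0 is the theoretical N(0, 1) null density; the marginal density f
    is estimated by a Gaussian kernel density estimate of the z-values
    (a simple stand-in for the predictive recursion estimate).
    """
    phi = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
    f = sum(phi((z - zi) / bandwidth) for zi in zs) / (len(zs) * bandwidth)
    return min(1.0, pi0 * phi(z) / f)

# mostly null scores near 0, a few clear signals at 4
zs = [0.0] * 50 + [4.0] * 10
print(local_fdr(0.0, zs), local_fdr(4.0, zs))
```

Cases with fdr below a cutoff (commonly 0.2) would be flagged as non-null; the abstract's point is that a careful estimate of the non-null tail makes these fdr values far more realistic than crude parametric fits.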

  9. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...

  10. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture, despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the
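In general, DEA scores each decision-making unit by solving a small linear program against the observed frontier. In the special case of one input and one output under constant returns to scale, the score reduces to the unit's output/input ratio divided by the best observed ratio, which makes the idea easy to sketch without an LP solver (the farm data below are purely hypothetical):

```python
def dea_crs_efficiency(inputs, outputs):
    """Constant-returns DEA technical efficiency, one input, one output.

    In this special case each unit's score is its output/input ratio
    divided by the best observed ratio; the frontier unit scores 1.
    (With multiple inputs/outputs, DEA instead solves a linear program
    per unit.)
    """
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# hypothetical pesticide use (input) vs crop output for four farms
scores = dea_crs_efficiency(inputs=[10, 20, 30, 40], outputs=[5, 12, 12, 16])
print(scores)  # the second farm (20, 12) defines the frontier and scores 1.0
```

A score below 1 is read as the proportional input contraction the unit could achieve while keeping output, which is the sense in which the paper measures pesticide over- or under-utilization.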

  11. Weak Disposability in Nonparametric Production Analysis with Undesirable Outputs

    NARCIS (Netherlands)

    Kuosmanen, T.K.

    2005-01-01

Weak disposability of outputs means that firms can abate harmful emissions by decreasing the activity level. Modeling weak disposability in nonparametric production analysis has caused some confusion.

  12. Non-parametric analysis of production efficiency of poultry egg ...

    African Journals Online (AJOL)

    Non-parametric analysis of production efficiency of poultry egg farmers in Delta ... analysis of factors affecting the output of poultry farmers showed that stock ... should be put in place for farmers to learn the best farm practices carried out on the ...

  13. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  14. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb...... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used...... by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function is consistent with the "true...

  15. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  16. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb...... results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used...

  17. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required
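The entry-by-entry screening step described above (flagging out-of-range, missing, or extreme values before analysis) is easy to sketch. The function and data below are hypothetical illustrations of the idea, not part of STATCAT itself:

```python
def screen_column(values, low, high, missing=None):
    """Flag out-of-range and missing entries in one data-bank column,
    in the spirit of STATCAT's entry-by-entry input checks.

    Returns a report mapping each problem type to the offending indices.
    """
    report = {"missing": [], "out_of_range": []}
    for idx, v in enumerate(values):
        if v is None or v == missing:
            report["missing"].append(idx)
        elif not (low <= v <= high):
            report["out_of_range"].append(idx)
    return report

# hypothetical column with one missing entry and two range violations
rep = screen_column([12, 99, None, 7, -3], low=0, high=50)
print(rep)  # {'missing': [2], 'out_of_range': [1, 4]}
```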

  18. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    Science.gov (United States)

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  19. Glaucoma Monitoring in a Clinical Setting: Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  20. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    Science.gov (United States)

    Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.
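The CADDIS scripts referenced here are written in R; the core of a non-parametric regression of a biological response on an environmental gradient is the Nadaraya-Watson kernel-weighted average, which a short Python analogue can illustrate (the gradient/response data below are placeholders):

```python
import math

def nadaraya_watson(x, xs, ys, bandwidth=1.0):
    """Nadaraya-Watson kernel regression estimate m(x).

    m(x) is the kernel-weighted average of the responses, so nearby
    observations dominate; no functional form is assumed for the
    species-environment relationship.
    """
    k = lambda u: math.exp(-0.5 * u * u)  # Gaussian kernel
    w = [k((x - xi) / bandwidth) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]   # hypothetical environmental gradient
ys = [0.0, 1.0, 2.0, 3.0, 4.0]   # hypothetical biological response
print(nadaraya_watson(2.0, xs, ys))  # ~2.0 by symmetry of the weights
```

The bandwidth controls the bias-variance trade-off: small values track local wiggles, large values approach a global mean.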

  1. Discrete non-parametric kernel estimation for global sensitivity analysis

    International Nuclear Information System (INIS)

    Senga Kiessé, Tristan; Ventura, Anne

    2016-01-01

This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently only the continuous kernel approach has been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. Now the discrete kernel estimation is known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of sensitivity indices is also presented with its asymptotic convergence rate. Some simulations on a test function analysis and a real case study from agriculture have shown that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderate or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.
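The quantity being estimated, a first-order ANOVA sensitivity index S_i = Var(E[Y|X_i]) / Var(Y), can be illustrated for a discrete input with a simple plug-in grouping estimator. This is not the paper's discrete-kernel estimator (which smooths across levels), just the definition made concrete:

```python
from collections import defaultdict

def first_order_index(samples, ys, i):
    """Plug-in first-order sensitivity index S_i = Var(E[Y|X_i]) / Var(Y)
    for a discrete input X_i, estimated by grouping runs on its levels."""
    mean = sum(ys) / len(ys)
    var = sum((y - mean) ** 2 for y in ys) / len(ys)
    groups = defaultdict(list)
    for x, y in zip(samples, ys):
        groups[x[i]].append(y)
    # variance of the conditional means, weighted by group size
    var_cond = sum(len(g) * (sum(g) / len(g) - mean) ** 2
                   for g in groups.values()) / len(ys)
    return var_cond / var

# full factorial grid of a toy model in which Y depends on X1 only
grid = [(x1, x2) for x1 in (0, 1, 2) for x2 in (0, 1)]
ys = [x1 for x1, x2 in grid]
print(first_order_index(grid, ys, 0), first_order_index(grid, ys, 1))  # 1.0 0.0
```

With sparse or noisy data per level, this naive grouping estimator degrades, which is the situation where the smoothed discrete-kernel estimator of the paper is intended to help.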

  2. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlight the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  3. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression... and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences within a nonparametric panel data regression framework. The fourth paper analyses the technical efficiency of dairy farms with environmental output using nonparametric kernel regression in a semiparametric stochastic frontier analysis. The results provided in this PhD thesis show that nonparametric...

  4. Non-Parametric Analysis of Rating Transition and Default Data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move b...

  5. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations in small samples. We used a pooled resampling method in the nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method to parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except for Cauchy and extreme-variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
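The pooled resampling idea described here, drawing both bootstrap samples from the combined data so that the null hypothesis of a common distribution is imposed, can be sketched as follows. This is a simplified mean-difference version for illustration, not the authors' full t-statistic procedure:

```python
import random

def pooled_bootstrap_pvalue(x, y, n_boot=2000, seed=1):
    """Two-sided bootstrap test for a difference in means using pooled
    resampling: both bootstrap samples are drawn from the combined data,
    which imposes the null hypothesis of a common distribution.
    """
    rng = random.Random(seed)
    pool = x + y
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    hits = 0
    for _ in range(n_boot):
        bx = [rng.choice(pool) for _ in x]
        by = [rng.choice(pool) for _ in y]
        if abs(sum(bx) / len(bx) - sum(by) / len(by)) >= observed:
            hits += 1
    # add-one correction keeps the p-value strictly positive
    return (hits + 1) / (n_boot + 1)

p = pooled_bootstrap_pvalue([10, 11, 12, 13, 14], [20, 21, 22, 23, 24])
print(p)  # clearly separated groups give a small p-value
```

Because each resample uses the full pool, the procedure borrows strength across groups, which is the mechanism the paper credits for its better small-sample behavior relative to separate-group bootstrapping.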

  6. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  7. Nonparametric and group-based person-fit statistics : a validity study and an empirical example

    NARCIS (Netherlands)

    Meijer, R.R.

    1994-01-01

    In person-fit analysis, the object is to investigate whether an item score pattern is improbable given the item score patterns of the other persons in the group or given what is expected on the basis of a test model. In this study, several existing group-based statistics to detect such improbable

  8. Nonparametric Bayesian inference for mean residual life functions in survival analysis.

    Science.gov (United States)

    Poynor, Valerie; Kottas, Athanasios

    2018-01-19

Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life (MRL) function, which provides the expected remaining lifetime given that a subject has survived (i.e. is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the MRL function characterizes the survival distribution. We develop general Bayesian nonparametric inference for MRL functions built from a Dirichlet process mixture model for the associated survival distribution. The resulting model for the MRL function admits a representation as a mixture of the kernel MRL functions with time-dependent mixture weights. This model structure allows for a wide range of shapes for the MRL function. Particular emphasis is placed on the selection of the mixture kernel, taken to be a gamma distribution, to obtain desirable properties for the MRL function arising from the mixture model. The inference method is illustrated with a data set of two experimental groups and a data set involving right censoring. The supplementary material available at Biostatistics online provides further results on empirical performance of the model, using simulated data examples. © The Author 2018. Published by Oxford University Press. All rights reserved.
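The MRL function itself, m(t) = E[T - t | T > t], has a simple empirical counterpart for uncensored data, which may help fix ideas; the paper's Dirichlet process mixture machinery is what extends this to censoring and full posterior uncertainty:

```python
def empirical_mrl(times, t):
    """Empirical mean residual life: the average remaining lifetime
    E[T - t | T > t] among subjects whose event time exceeds t.
    (No censoring handled here; this is only the naive plug-in version.)
    """
    residual = [x - t for x in times if x > t]
    if not residual:
        return 0.0  # no subject survives past t
    return sum(residual) / len(residual)

lifetimes = [1.0, 2.0, 3.0, 4.0]  # hypothetical event times
print(empirical_mrl(lifetimes, 0.0))  # 2.5 (mean lifetime from the start)
print(empirical_mrl(lifetimes, 2.0))  # 1.5 (average of 3-2 and 4-2)
```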

  9. Sensitivity of Technical Efficiency Estimates to Estimation Methods: An Empirical Comparison of Parametric and Non-Parametric Approaches

    OpenAIRE

    de-Graft Acquah, Henry

    2014-01-01

This paper highlights the sensitivity of technical efficiency estimates to estimation approaches using empirical data. Firm-specific technical efficiency and mean technical efficiency are estimated using the non-parametric Data Envelopment Analysis (DEA) and the parametric Corrected Ordinary Least Squares (COLS) and Stochastic Frontier Analysis (SFA) approaches. Mean technical efficiency is found to be sensitive to the choice of estimation technique. Analysis of variance and Tukey’s test sugge...

  10. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  11. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  12. Empirical Methods for Detecting Regional Trends and Other Spatial Expressions in Antrim Shale Gas Productivity, with Implications for Improving Resource Projections Using Local Nonparametric Estimation Techniques

    Science.gov (United States)

    Coburn, T.C.; Freeman, P.A.; Attanasi, E.D.

    2012-01-01

The primary objectives of this research were to (1) investigate empirical methods for establishing regional trends in unconventional gas resources as exhibited by historical production data and (2) determine whether or not incorporating additional knowledge of a regional trend in a suite of previously established local nonparametric resource prediction algorithms influences assessment results. Three different trend detection methods were applied to publicly available production data (well EUR aggregated to 80-acre cells) from the Devonian Antrim Shale gas play in the Michigan Basin. This effort led to the identification of a southeast-northwest trend in cell EUR values across the play that, in a very general sense, conforms to the primary fracture and structural orientations of the province. However, including this trend in the resource prediction algorithms did not lead to improved results. Further analysis indicated the existence of clustering among cell EUR values that likely dampens the contribution of the regional trend. The reason for the clustering, a somewhat unexpected result, is not completely understood, although the geological literature provides some possible explanations. With appropriate data, a better understanding of this clustering phenomenon may lead to important information about the factors and their interactions that control Antrim Shale gas production, which may, in turn, help establish a more general protocol for better estimating resources in this and other shale gas plays. © 2011 International Association for Mathematical Geology (outside the USA).

  13. Scale-Free Nonparametric Factor Analysis: A User-Friendly Introduction with Concrete Heuristic Examples.

    Science.gov (United States)

    Mittag, Kathleen Cage

    Most researchers using factor analysis extract factors from a matrix of Pearson product-moment correlation coefficients. A method is presented for extracting factors in a non-parametric way, by extracting factors from a matrix of Spearman rho (rank correlation) coefficients. It is possible to factor analyze a matrix of association such that…
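A minimal sketch of this rank-based extraction: build the Spearman (rank) correlation matrix instead of a Pearson matrix, then take principal-axis style loadings from its leading eigenvector. All data and variable names below are synthetic illustrations, not material from the paper.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
# Synthetic data: 200 observations of 4 variables driven by one latent factor.
latent = rng.normal(size=200)
X = np.column_stack([latent + rng.normal(scale=0.5, size=200) for _ in range(4)])

rho, _ = spearmanr(X)            # 4x4 Spearman rank-correlation matrix
eigvals, eigvecs = np.linalg.eigh(rho)   # eigenvalues in ascending order
# Loadings on the dominant factor (largest eigenvalue), principal-axis style.
loadings = eigvecs[:, -1] * np.sqrt(eigvals[-1])
```

Because rank correlations are unaffected by monotone rescaling of the variables, the resulting solution is scale-free in the sense the record's title suggests.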

  14. Data analysis with small samples and non-normal data nonparametrics and other strategies

    CERN Document Server

    Siebert, Carl F

    2017-01-01

Written in everyday language for non-statisticians, this book provides all the information needed to successfully conduct nonparametric analyses. This ideal reference book provides step-by-step instructions to lead the reader through each analysis, screenshots of the software and output, and case scenarios to illustrate all of the analytic techniques.

  15. Nonparametric inference in nonlinear principal components analysis : exploration and beyond

    NARCIS (Netherlands)

    Linting, Mariëlle

    2007-01-01

    In the social and behavioral sciences, data sets often do not meet the assumptions of traditional analysis methods. Therefore, nonlinear alternatives to traditional methods have been developed. This thesis starts with a didactic discussion of nonlinear principal components analysis (NLPCA),

  16. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  17. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  18. Multilevel Latent Class Analysis: Parametric and Nonparametric Models

    Science.gov (United States)

    Finch, W. Holmes; French, Brian F.

    2014-01-01

    Latent class analysis is an analytic technique often used in educational and psychological research to identify meaningful groups of individuals within a larger heterogeneous population based on a set of variables. This technique is flexible, encompassing not only a static set of variables but also longitudinal data in the form of growth mixture…

  19. EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...

    African Journals Online (AJOL)

empirical research has made to the process of congregational analysis. 1 Part of this ... contextual congregational analysis – meeting social and divine desires") at the IAPT .... methodology of a congregational analysis should be regarded as a process. ... essential to create space for a qualitative and quantitative approach.

  20. Driving Style Analysis Using Primitive Driving Patterns With Bayesian Nonparametric Approaches

    OpenAIRE

    Wang, Wenshuo; Xi, Junqiang; Zhao, Ding

    2017-01-01

    Analysis and recognition of driving styles are profoundly important to intelligent transportation and vehicle calibration. This paper presents a novel driving style analysis framework using the primitive driving patterns learned from naturalistic driving data. In order to achieve this, first, a Bayesian nonparametric learning method based on a hidden semi-Markov model (HSMM) is introduced to extract primitive driving patterns from time series driving data without prior knowledge of the number...

  1. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health....

  2. Nonparametric Bounds and Sensitivity Analysis of Treatment Effects

    Science.gov (United States)

    Richardson, Amy; Hudgens, Michael G.; Gilbert, Peter B.; Fine, Jason P.

    2015-01-01

    This paper considers conducting inference about the effect of a treatment (or exposure) on an outcome of interest. In the ideal setting where treatment is assigned randomly, under certain assumptions the treatment effect is identifiable from the observable data and inference is straightforward. However, in other settings such as observational studies or randomized trials with noncompliance, the treatment effect is no longer identifiable without relying on untestable assumptions. Nonetheless, the observable data often do provide some information about the effect of treatment, that is, the parameter of interest is partially identifiable. Two approaches are often employed in this setting: (i) bounds are derived for the treatment effect under minimal assumptions, or (ii) additional untestable assumptions are invoked that render the treatment effect identifiable and then sensitivity analysis is conducted to assess how inference about the treatment effect changes as the untestable assumptions are varied. Approaches (i) and (ii) are considered in various settings, including assessing principal strata effects, direct and indirect effects and effects of time-varying exposures. Methods for drawing formal inference about partially identified parameters are also discussed. PMID:25663743
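Approach (i) is easiest to see in the simplest case: worst-case (Manski-type) nonparametric bounds on a mean binary outcome when some outcomes are unobserved. The function and numbers below are a generic illustration under that minimal-assumptions setting, not the paper's estimators.

```python
# Worst-case nonparametric bounds on a mean binary outcome when some
# outcomes are unobserved; illustrative numbers, not data from the paper.
def worst_case_bounds(observed_outcomes, n_missing):
    """Bounds on the overall mean: missing outcomes imputed as all 0 or all 1."""
    n = len(observed_outcomes) + n_missing
    s = sum(observed_outcomes)
    lower = s / n                 # every missing outcome equals 0
    upper = (s + n_missing) / n   # every missing outcome equals 1
    return lower, upper

# 6 observed outcomes (4 successes), 4 unobserved.
lo, hi = worst_case_bounds([1, 0, 1, 1, 0, 1], n_missing=4)  # lo=0.4, hi=0.8
```

The width of the interval, `n_missing / n`, quantifies exactly how much the data fail to identify the parameter; sensitivity analysis (approach (ii)) then narrows it by adding untestable assumptions.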

  3. Bayesian nonparametric estimation of continuous monotone functions with applications to dose-response analysis.

    Science.gov (United States)

    Bornkamp, Björn; Ickstadt, Katja

    2009-03-01

In this article, we consider monotone nonparametric regression in a Bayesian framework. The monotone function is modeled as a mixture of shifted and scaled parametric probability distribution functions, and a general random probability measure is assumed as the prior for the mixing distribution. We investigate the choice of the underlying parametric distribution function and find that the two-sided power distribution function is well suited both from a computational and a mathematical point of view. The model is motivated by traditional nonlinear models for dose-response analysis, and provides possibilities to elicit informative prior distributions on different aspects of the curve. The method is compared with other recent approaches to monotone nonparametric regression in a simulation study and is illustrated on a data set from dose-response analysis.
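For context, a standard non-Bayesian baseline for monotone nonparametric regression is isotonic regression fitted by the pool-adjacent-violators algorithm (PAVA). The sketch below is that baseline, not the authors' mixture model; the dose-response values are made up.

```python
def pava(y):
    """Pool-adjacent-violators: least-squares fit constrained to be nondecreasing."""
    blocks = []                    # each block holds [mean, weight]
    for v in y:
        blocks.append([v, 1])
        # Merge backwards while adjacent block means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    fit = []
    for m, w in blocks:
        fit.extend([m] * w)
    return fit

# Noisy but broadly increasing dose-response values; PAVA restores monotonicity.
fit = pava([0.1, 0.3, 0.2, 0.6, 0.5, 0.9])
# → [0.1, 0.25, 0.25, 0.55, 0.55, 0.9] (up to float rounding)
```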

  4. An update on the "empirical turn" in bioethics: analysis of empirical research in nine bioethics journals.

    Science.gov (United States)

    Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S

    2018-02-07

A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what are the characteristics of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. The data obtained were analysed descriptively and using a non-parametric Chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two

  5. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend to not only give answers to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem however may be that the li...... for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material....... between tradition and tool is unclear. The main objective of this article is therefore to present Empirical Testing Thematic Analysis, a step by step approach to thematic text analysis; discussing strengths and weaknesses, so that others might assess its potential as an approach that they might utilize/develop...

  6. Empirical analysis of consumer behavior

    NARCIS (Netherlands)

    Huang, Yufeng

    2015-01-01

This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. In the first essay, the author investigates the role of a consumer's skill of product usage, and its imperfect transferability across brands, in her product choice. It shows that

  7. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns arising in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data for studying their association with disease and health.

  8. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    Science.gov (United States)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
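At the heart of such functional kernel estimators is Nadaraya-Watson style local weighting. Below is a deliberately simplified scalar sketch with a Gaussian kernel and a fixed bandwidth; the study itself uses an asymmetrical quadratic kernel, cross-validated bandwidths, and curve-valued observations.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x0, h):
    """Kernel-weighted average of y_train around x0 (Gaussian kernel, bandwidth h)."""
    w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)
    return np.sum(w * y_train) / np.sum(w)

# Noiseless toy signal: the estimator should closely recover sin(1.0) ≈ 0.841.
x = np.linspace(0.0, 3.0, 301)
y = np.sin(x)
yhat = nadaraya_watson(x, y, x0=1.0, h=0.2)
```

In the functional setting, the scalar distance `(x_train - x0)` is replaced by a semi-metric between whole daily curves, but the weighted-average structure is the same.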

  9. Nonparametric statistics with applications to science and engineering

    CERN Document Server

    Kvam, Paul H

    2007-01-01

    A thorough and definitive book that fully addresses traditional and modern-day topics of nonparametric statistics This book presents a practical approach to nonparametric statistical analysis and provides comprehensive coverage of both established and newly developed methods. With the use of MATLAB, the authors present information on theorems and rank tests in an applied fashion, with an emphasis on modern methods in regression and curve fitting, bootstrap confidence intervals, splines, wavelets, empirical likelihood, and goodness-of-fit testing. Nonparametric Statistics with Applications to Science and Engineering begins with succinct coverage of basic results for order statistics, methods of categorical data analysis, nonparametric regression, and curve fitting methods. The authors then focus on nonparametric procedures that are becoming more relevant to engineering researchers and practitioners. The important fundamental materials needed to effectively learn and apply the discussed methods are also provide...

  10. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.
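The flavor of an EM procedure for a mixture of regressions can be conveyed with a fully parametric two-component version (through-the-origin lines, synthetic data); the paper's method replaces these global coefficients with kernel-smoothed functional ones, so treat this only as a sketch of the estimation machinery.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(-1, 1, n)
z = rng.integers(0, 2, n)                 # true (hidden) component labels
slopes_true = np.array([2.0, -2.0])
y = slopes_true[z] * x + rng.normal(scale=0.2, size=n)

# EM for y = b_k * x + eps with two components and a shared noise scale.
b = np.array([1.0, -1.0])                 # initial slopes
pi = np.array([0.5, 0.5])                 # mixing proportions
sigma = 0.5
for _ in range(50):
    # E-step: posterior responsibilities of each component for each point.
    resid = y[:, None] - x[:, None] * b[None, :]
    dens = pi * np.exp(-0.5 * (resid / sigma) ** 2) / sigma
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted least squares, proportions, noise scale.
    b = (r * x[:, None] * y[:, None]).sum(axis=0) / (r * x[:, None] ** 2).sum(axis=0)
    pi = r.mean(axis=0)
    sigma = np.sqrt((r * resid ** 2).sum() / n)
# b should end up close to the true slopes (+2, -2).
```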

  11. CATDAT - A program for parametric and nonparametric categorical data analysis user's manual, Version 1.0

    International Nuclear Information System (INIS)

    Peterson, James R.; Haas, Timothy C.; Lee, Danny C.

    2000-01-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical responses data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network

  12. Trend Analysis of Pahang River Using Non-Parametric Analysis: Mann-Kendall Trend Test

    International Nuclear Information System (INIS)

    Nur Hishaam Sulaiman; Mohd Khairul Amri Kamarudin; Ahmad Dasuki Mustafa; Muhammad Azizi Amran; Fazureen Azaman; Ismail Zainal Abidin; Norsyuhada Hairoma

    2015-01-01

Flooding is common in Pahang, especially during the northeast monsoon season from November to February. Three river cross-stations, Lubuk Paku, Sg. Yap and Temerloh, were selected as the study area. The stream flow and water level data were gathered from DID records. The data set was analysed using a non-parametric method, the Mann-Kendall trend test. The results obtained from the stream flow and water level analysis indicate positive significant trends for Lubuk Paku (0.001) and Sg. Yap (<0.0001) from 1972-2011, with p-values < 0.05. The Temerloh (0.178) data from 1963-2011 showed no trend for the stream flow parameter but a negative trend for the water level parameter. Hydrological patterns and trends are strongly affected by external factors such as the northeast monsoon season, which forms over the South China Sea and affects Pahang from November to March. Other factors, such as development and management of the areas, may also have affected the data and results. Hydrological patterns are important indicators of river trends such as stream flow and water level, and can be used for flood mitigation by local authorities. (author)
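The Mann-Kendall statistic itself is short to implement: count concordant minus discordant pairs (S), normalize by the variance of S, and compare the resulting Z to a standard normal. A minimal sketch without the tie correction that real hydrological records usually require; the series below is synthetic, not the Pahang data.

```python
import math

def mann_kendall_z(series):
    """Mann-Kendall trend statistic Z (no tie correction)."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)          # sign of each pairwise difference
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)          # continuity correction
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# A gently rising synthetic record: z ≈ 3.58, significant at the 5% level (|z| > 1.96).
z = mann_kendall_z([2.1, 2.4, 2.3, 2.9, 3.1, 3.0, 3.6, 3.8, 4.0, 4.4])
```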

  13. Comparative analysis of automotive paints by laser induced breakdown spectroscopy and nonparametric permutation tests

    International Nuclear Information System (INIS)

    McIntee, Erin; Viglino, Emilie; Rinke, Caitlin; Kumor, Stephanie; Ni Liqiang; Sigman, Michael E.

    2010-01-01

Laser-induced breakdown spectroscopy (LIBS) has been investigated for the discrimination of automobile paint samples. Paint samples from automobiles of different makes, models, and years were collected and separated into sets based on the color, presence or absence of effect pigments and the number of paint layers. Twelve LIBS spectra were obtained for each paint sample, each an average of five single-shot 'drill down' spectra from consecutive laser ablations in the same spot on the sample. Analyses by a nonparametric permutation test and a parametric Wald test were performed to determine the extent of discrimination within each set of paint samples. The discrimination power and Type I error were assessed for each data analysis method. Conversion of the spectral intensity to a log-scale (base 10) resulted in a higher overall discrimination power while observing the same significance level. Working on the log-scale, the nonparametric permutation tests gave an overall discrimination power of 89.83%, with a Type I error of 4.44% at the nominal significance level of 5%. White paint samples, as a group, were the most difficult to differentiate, with a power of only 86.56%, followed by 95.83% for black paint samples. Parametric analysis of the data set produced lower discrimination power (85.17%) with 3.33% Type I errors and is not recommended on both theoretical and practical grounds. The nonparametric testing method is applicable across many analytical comparisons, the specific application described here being the pairwise comparison of automotive paint samples.
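The pairwise permutation test used for discrimination can be sketched generically: compare the observed difference in group means against its distribution under random relabeling of the pooled observations. The intensities below are synthetic, not LIBS data.

```python
import numpy as np

def permutation_pvalue(a, b, n_perm=5000, seed=0):
    """Two-sided Monte Carlo permutation test for a difference in group means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([a, b])
    observed = abs(np.mean(a) - np.mean(b))
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)                    # random relabeling
        diff = abs(np.mean(perm[:len(a)]) - np.mean(perm[len(a):]))
        if diff >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)                     # add-one Monte Carlo p-value

# Two clearly separated 'samples': the test should reject at the 5% level.
p = permutation_pvalue(np.array([5.1, 5.3, 4.9, 5.2, 5.0]),
                       np.array([6.0, 6.2, 5.9, 6.1, 6.3]))
```

Because the null distribution is built from the data themselves, no distributional assumption about the intensities is needed, which is the appeal of the method in this forensic setting.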

  14. Nonparametric Transfer Function Models

    Science.gov (United States)

    Liu, Jun M.; Chen, Rong; Yao, Qiwei

    2009-01-01

    In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584

  15. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  16. Genomic outlier profile analysis: mixture models, null hypotheses, and nonparametric estimation.

    Science.gov (United States)

    Ghosh, Debashis; Chinnaiyan, Arul M

    2009-01-01

    In most analyses of large-scale genomic data sets, differential expression analysis is typically assessed by testing for differences in the mean of the distributions between 2 groups. A recent finding by Tomlins and others (2005) is of a different type of pattern of differential expression in which a fraction of samples in one group have overexpression relative to samples in the other group. In this work, we describe a general mixture model framework for the assessment of this type of expression, called outlier profile analysis. We start by considering the single-gene situation and establishing results on identifiability. We propose 2 nonparametric estimation procedures that have natural links to familiar multiple testing procedures. We then develop multivariate extensions of this methodology to handle genome-wide measurements. The proposed methodologies are compared using simulation studies as well as data from a prostate cancer gene expression study.

  17. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Takamizawa, Hisashi, E-mail: takamizawa.hisashi@jaea.go.jp; Itoh, Hiroto, E-mail: ito.hiroto@jaea.go.jp; Nishiyama, Yutaka, E-mail: nishiyama.yutaka93@jaea.go.jp

    2016-10-15

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.

  18. Performances of non-parametric statistics in sensitivity analysis and parameter ranking

    International Nuclear Information System (INIS)

    Saltelli, A.

    1987-01-01

Twelve parametric and non-parametric sensitivity analysis techniques are compared in the case of non-linear model responses. The test models used are taken from the long-term risk analysis for the disposal of high-level radioactive waste in a geological formation. They describe the transport of radionuclides through a set of engineered and natural barriers from the repository to the biosphere and to man. The output data from these models are the dose rates affecting the maximum exposed individual of a critical group at a given point in time. All the techniques are applied to the output from the same Monte Carlo simulations, where a modified version of the Latin Hypercube method is used for the sample selection. Hypothesis testing is systematically applied to quantify the degree of confidence in the results given by the various sensitivity estimators. The estimators are ranked according to their robustness and stability on the basis of two test cases. The conclusion is that no estimator can be considered the best from all points of view, and the use of more than one estimator in sensitivity analysis is recommended.

  19. Single molecule force spectroscopy at high data acquisition: A Bayesian nonparametric analysis

    Science.gov (United States)

    Sgouralis, Ioannis; Whitmore, Miles; Lapidus, Lisa; Comstock, Matthew J.; Pressé, Steve

    2018-03-01

    Bayesian nonparametrics (BNPs) are poised to have a deep impact in the analysis of single molecule data as they provide posterior probabilities over entire models consistent with the supplied data, not just model parameters of one preferred model. Thus they provide an elegant and rigorous solution to the difficult problem encountered when selecting an appropriate candidate model. Nevertheless, BNPs' flexibility to learn models and their associated parameters from experimental data is a double-edged sword. Most importantly, BNPs are prone to increasing the complexity of the estimated models due to artifactual features present in time traces. Thus, because of experimental challenges unique to single molecule methods, naive application of available BNP tools is not possible. Here we consider traces with time correlations and, as a specific example, we deal with force spectroscopy traces collected at high acquisition rates. While high acquisition rates are required in order to capture dwells in short-lived molecular states, in this setup, a slow response of the optical trap instrumentation (i.e., trapped beads, ambient fluid, and tethering handles) distorts the molecular signals introducing time correlations into the data that may be misinterpreted as true states by naive BNPs. Our adaptation of BNP tools explicitly takes into consideration these response dynamics, in addition to drift and noise, and makes unsupervised time series analysis of correlated single molecule force spectroscopy measurements possible, even at acquisition rates similar to or below the trap's response times.

  20. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
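The flipping trick that adapts K-M to left-censored concentrations is simple: subtract each value from a constant larger than the maximum so nondetects become right-censored, then run the standard product-limit estimator. A bare-bones sketch (not the authors' S-language routines); the data and the flip constant are illustrative.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates at each uncensored time (simplified ties)."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    s = 1.0
    curve = []
    for t, event in pairs:
        if event:
            s *= (n_at_risk - 1) / n_at_risk
            curve.append((t, s))
        n_at_risk -= 1
    return curve

# Left-censored concentrations as (value, detected); '<1' becomes (1.0, False).
data = [(0.5, True), (1.0, False), (1.2, True), (1.0, False), (2.0, True)]
flip = 10.0                                   # any constant above the maximum value
flipped = [(flip - x, det) for x, det in data]
curve = kaplan_meier([t for t, _ in flipped], [e for _, e in flipped])
# The CDF of the original data is read off as F(x) = S(flip - x).
```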

  1. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. ... The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test

  2. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    Science.gov (United States)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results support the fact that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.

  3. A Bayesian approach to the analysis of quantal bioassay studies using nonparametric mixture models.

    Science.gov (United States)

    Fronczyk, Kassandra; Kottas, Athanasios

    2014-03-01

    We develop a Bayesian nonparametric mixture modeling framework for quantal bioassay settings. The approach is built upon modeling dose-dependent response distributions. We adopt a structured nonparametric prior mixture model, which induces a monotonicity restriction for the dose-response curve. Particular emphasis is placed on the key risk assessment goal of calibration for the dose level that corresponds to a specified response. The proposed methodology yields flexible inference for the dose-response relationship as well as for other inferential objectives, as illustrated with two data sets from the literature. © 2013, The International Biometric Society.

  4. Does Private Tutoring Work? The Effectiveness of Private Tutoring: A Nonparametric Bounds Analysis

    Science.gov (United States)

    Hof, Stefanie

    2014-01-01

    Private tutoring has become popular throughout the world. However, evidence for the effect of private tutoring on students' academic outcome is inconclusive; therefore, this paper presents an alternative framework: a nonparametric bounds method. The present examination uses, for the first time, a large representative data-set in a European setting…

  5. Non-Parametric Kinetic (NPK) Analysis of Thermal Oxidation of Carbon Aerogels

    Directory of Open Access Journals (Sweden)

    Azadeh Seifi

    2017-05-01

    Full Text Available In recent years, much attention has been paid to aerogel materials (especially carbon aerogels) due to their potential uses in energy-related applications, such as thermal energy storage and thermal protection systems. These open-cell carbon-based porous materials (carbon aerogels) can strongly react with oxygen at relatively low temperatures (~400°C). Therefore, it is necessary to evaluate the thermal performance of carbon aerogels in view of their energy-related applications at high temperatures and under thermal oxidation conditions. The objective of this paper is to study theoretically and experimentally the oxidation reaction kinetics of carbon aerogel using the non-parametric kinetic (NPK) approach as a powerful method. For this purpose, a non-isothermal thermogravimetric analysis, at three different heating rates, was performed on three samples, each with its specific pore structure, density and specific surface area. The most significant feature of this method, in comparison with the model-free isoconversional methods, is its ability to separate the functionality of the reaction rate with the degree of conversion and temperature by the direct use of thermogravimetric data. Using this method, it was observed that the Nomen-Sempere model could provide the best fit to the data, while the temperature dependence of the rate constant was best explained by a Vogel-Fulcher relationship, where the reference temperature was the onset temperature of oxidation. Moreover, it was found from the results of this work that the assumption of the Arrhenius relation for the temperature dependence of the rate constant led to over-estimation of the apparent activation energy (up to 160 kJ/mol) that was considerably different from the values (up to 3.5 kJ/mol) predicted by the Vogel-Fulcher relationship in isoconversional methods.
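
    The two temperature dependences contrasted in this abstract can be compared directly. A minimal sketch (the parameter values A, B, Ea and T0 below are hypothetical, not taken from the paper): the Vogel-Fulcher form implies a local apparent activation energy E_app(T) = -R d(ln k)/d(1/T) = R·B·T²/(T - T0)², which inflates sharply as T approaches the reference temperature T0, so forcing an Arrhenius fit near T0 over-estimates the activation energy.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius(T, A, Ea):
    """Arrhenius rate constant: k = A * exp(-Ea / (R T))."""
    return A * math.exp(-Ea / (R * T))

def vogel_fulcher(T, A, B, T0):
    """Vogel-Fulcher rate constant: k = A * exp(-B / (T - T0)), T > T0."""
    return A * math.exp(-B / (T - T0))

def vf_apparent_activation_energy(T, B, T0):
    """Local apparent activation energy of the Vogel-Fulcher form:
    E_app(T) = -R * d(ln k)/d(1/T) = R * B * T^2 / (T - T0)^2."""
    return R * B * T ** 2 / (T - T0) ** 2
```

    With these expressions one can check that the apparent activation energy of a Vogel-Fulcher process grows as the temperature approaches T0, mirroring the over-estimation reported in the abstract.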

  6. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical survival right-censored data of about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on one hand, and to compare two basic surgery techniques in the context of risk of mortality on the other. Colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic operating techniques are used for the colectomy: either traditional (open) or minimally invasive (laparoscopic). The basic question arising at the colectomy operation is which type of operation to choose to guarantee longer overall survival time. Two non-parametric approaches have been used to quantify the probability of mortality with uncertainties. In fact, the complement of this probability, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from both the Kaplan–Meier estimator of the survival function in connection with Greenwood's formula and the Nelson–Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of a predictive survival function instead of the population survival function. The traditional log-rank test on one hand and the nonparametric predictive comparison of two groups of lifetime data on the other have been compared to evaluate the risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion: the minimally invasive operating technique guarantees the patient a significantly longer survival time in comparison with the traditional operating technique.
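
    The Kaplan–Meier estimate with Greenwood's variance formula, the core of the first approach, can be sketched as follows (an illustrative snippet on a hypothetical risk table, not the study's code):

```python
import math

def km_with_greenwood(risk_table, z=1.96):
    """risk_table: (n_at_risk, n_events) at successive distinct event times,
    assuming n_events < n_at_risk at every step.

    Returns a list of (S(t), lower, upper): the Kaplan-Meier survival
    estimate with a pointwise normal-approximation confidence interval from
    Greenwood's formula  Var[S(t)] = S(t)^2 * sum d_i / (n_i * (n_i - d_i)).
    """
    surv, gw_sum, out = 1.0, 0.0, []
    for n, d in risk_table:
        surv *= 1.0 - d / n
        gw_sum += d / (n * (n - d))
        se = surv * math.sqrt(gw_sum)
        out.append((surv, max(0.0, surv - z * se), min(1.0, surv + z * se)))
    return out
```

    Comparing the resulting confidence bands for the open and laparoscopic groups gives the classical (non-NPI) picture of the uncertainty around each survival curve.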

  7. A non-parametric Data Envelopment Analysis approach for improving energy efficiency of grape production

    International Nuclear Information System (INIS)

    Khoshroo, Alireza; Mulwa, Richard; Emrouznejad, Ali; Arabi, Behrouz

    2013-01-01

    Grape is one of the world's largest fruit crops, with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production, which depends heavily on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive use of energy and optimizing energy consumption in grape production is the main focus of this paper. In this study we use a two-stage methodology to find the association of energy efficiency and performance explained by farmers' specific characteristics. In the first stage, a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water-for-irrigation energies. In the second step, farm-specific variables such as farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence the efficiency of grape farming. The results of the first stage show substantial inefficiency among the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers was in the use of chemicals, diesel fuel and water for irrigation. Efficient farmers used considerably fewer chemicals, such as insecticides, herbicides and fungicides, than inefficient ones. The results revealed that the more educated farmers are more energy efficient in comparison with their less educated counterparts. - Highlights: • The focus of this paper is to identify excessive use of energy and optimize energy consumption in grape production. • We measure the efficiency as a function of labor/machinery/chemicals/farmyard manure/diesel-fuel/electricity/water. • Data were obtained from 41 grape
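
    Under constant returns to scale, the DEA efficiency score has a closed form in the trivial single-input, single-output case, which conveys the idea without an LP solver (a multi-input model like the paper's requires solving one linear programme per farm; the numbers below are hypothetical):

```python
def dea_crs_efficiency(inputs, outputs):
    """Constant-returns-to-scale DEA efficiency for one input and one output:
    each unit's output/input ratio divided by the best observed ratio.
    Scores of 1.0 identify units on the efficient frontier."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

    In the general multi-input setting used here (labor, machinery, chemicals, FYM, diesel, electricity, water), the same frontier idea is expressed as a linear programme solved once per farm.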

  8. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    Abstract This paper applies non-parametric and parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as output. The data cover 307 (out of 553) ...

  9. Empirical direction in design and analysis

    CERN Document Server

    Anderson, Norman H

    2001-01-01

    The goal of Norman H. Anderson's new book is to help students develop skills of scientific inference. To accomplish this he organized the book around the "Experimental Pyramid", six levels that represent a hierarchy of considerations in empirical investigation: conceptual framework, phenomena, behavior, measurement, design, and statistical inference. To facilitate conceptual and empirical understanding, Anderson de-emphasizes computational formulas and null hypothesis testing. Other features include: *emphasis on visual inspection as a basic skill in experimental analysis to help student…

  10. Data analysis and approximate models model choice, location-scale, analysis of variance, nonparametric regression and image analysis

    CERN Document Server

    Davies, Patrick Laurie

    2014-01-01

    Introduction Introduction Approximate Models Notation Two Modes of Statistical Analysis Towards One Mode of Analysis Approximation, Randomness, Chaos, Determinism Approximation A Concept of Approximation Approximation Approximating a Data Set by a Model Approximation Regions Functionals and Equivariance Regularization and Optimality Metrics and Discrepancies Strong and Weak Topologies On Being (almost) Honest Simulations and Tables Degree of Approximation and p-values Scales Stability of Analysis The Choice of En(α, P) Independence Procedures, Approximation and Vagueness Discrete Models The Empirical Density Metrics and Discrepancies The Total Variation Metric The Kullback-Leibler and Chi-Squared Discrepancies The Po(λ) Model The b(k, p) and nb(k, p) Models The Flying Bomb Data The Student Study Times Data Outliers Outliers, Data Analysis and Models Breakdown Points and Equivariance Identifying Outliers and Breakdown Outliers in Multivariate Data Outliers in Linear Regression Outliers in Structured Data The Location...

  11. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
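
    The simplest unweighted version of the standardised-distance idea, for a homogeneous Poisson reference rate, can be sketched as follows (hypothetical numbers; the paper's tests additionally allow weights, stratification and robust variance estimation):

```python
import math

def one_sample_event_test(n_events, follow_up_times, reference_rate):
    """Compare the observed number of recurrent events with the number
    expected under a homogeneous Poisson reference rate.

    Returns (z, p): the standardised distance z = (O - E)/sqrt(E) and a
    two-sided normal-approximation p-value."""
    expected = reference_rate * sum(follow_up_times)
    z = (n_events - expected) / math.sqrt(expected)
    p = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return z, p
```

    In the safety setting described above, a significantly positive z would indicate more severe infections than expected under the standard-population rate.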

  12. The Italian Footwear Industry: an Empirical Analysis

    OpenAIRE

    Pirolo, Luca; Giustiniano, Luca; Nenni, Maria Elena

    2013-01-01

    This paper aims to provide readers with a deep empirical analysis of the Italian footwear industry, investigating the evolution of its structure (trends in sales and production, number of firms and employees, main markets, etc.) and identifying the main drivers of competitiveness, in order to explain the strategies implemented by local actors.

  13. The problem analysis for empirical studies

    NARCIS (Netherlands)

    Groenland, E.A.G.

    2014-01-01

    This article proposes a systematic methodology for the development of a problem analysis for cross-sectional, empirical research. This methodology is referred to as the 'Annabel approach'. It is suitable both for academic studies and applied (business) studies. In addition it can be used for both

  14. NParCov3: A SAS/IML Macro for Nonparametric Randomization-Based Analysis of Covariance

    Directory of Open Access Journals (Sweden)

    Richard C. Zink

    2012-07-01

    Full Text Available Analysis of covariance serves two important purposes in a randomized clinical trial. First, there is a reduction of variance for the treatment effect, which provides more powerful statistical tests and more precise confidence intervals. Second, it provides estimates of the treatment effect which are adjusted for random imbalances of covariates between the treatment groups. The nonparametric analysis of covariance method of Koch, Tangen, Jung, and Amara (1998) defines a very general methodology using weighted least-squares to generate covariate-adjusted treatment effects with minimal assumptions. This methodology is general in its applicability to a variety of outcomes, whether continuous, binary, ordinal, incidence density or time-to-event. Further, its use has been illustrated in many clinical trial settings, such as multi-center, dose-response and non-inferiority trials. NParCov3 is a SAS/IML macro written to conduct the nonparametric randomization-based covariance analyses of Koch et al. (1998). The software can analyze a variety of outcomes and can account for stratification. Data from multiple clinical trials will be used for illustration.

  15. Industrial energy efficiency with CO2 emissions in China: A nonparametric analysis

    International Nuclear Information System (INIS)

    Wu, F.; Fan, L.W.; Zhou, P.; Zhou, D.Q.

    2012-01-01

    Global awareness on energy security and climate change has created much interest in assessing economy-wide energy efficiency performance. A number of previous studies have contributed to evaluating energy efficiency performance using different analytical techniques, among which data envelopment analysis (DEA) has recently received increasing attention. Most DEA-related energy efficiency studies do not consider undesirable outputs such as CO2 emissions in their modeling framework, which may lead to biased energy efficiency values. Within a joint production framework of desirable and undesirable outputs, in this paper we construct both static and dynamic energy efficiency performance indexes for measuring industrial energy efficiency performance by using several environmental DEA models with CO2 emissions. The dynamic energy efficiency performance indexes have further been decomposed into two contributing components. We finally apply the indexes proposed to assess the industrial energy efficiency performance of different provinces in China over time. Our empirical study shows that the energy efficiency improvement in China's industrial sector was mainly driven by technological improvement. - Highlights: ► China's industrial energy efficiency is evaluated by DEA models with CO2 emissions. ► China's industrial energy efficiency improved by 5.6% annually since 1997. ► Industrial energy efficiency improvement in China was mainly driven by technological improvement.

  16. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  17. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected
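
    One generic building block behind several of the listed priors is the stick-breaking construction of Dirichlet process weights, which can be sketched in a few lines (a generic illustration, not code from the reviewed package):

```python
import random

def stick_breaking_weights(alpha, n_atoms, seed=0):
    """Truncated stick-breaking construction of Dirichlet process weights:
    draw v_k ~ Beta(1, alpha) and set w_k = v_k * prod_{j<k} (1 - v_j).
    The weights are non-negative and sum to (just under) one."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)
        remaining *= 1.0 - v
    return weights
```

    Smaller concentration parameters alpha put most of the mass on a few atoms (few mixture components); larger values spread it over many.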

  18. Empirical analysis of uranium spot prices

    International Nuclear Information System (INIS)

    Morman, M.R.

    1988-01-01

    The objective is to empirically test a market model of the uranium industry that incorporates the notion that, if the resource is viewed as an asset by economic agents, then its own rate of return, along with the own rate of return of a competing asset, would be a major factor in formulating the price of the resource. The model tested is based on a market model of supply and demand. The supply model incorporates the notion that the decision criterion used by uranium mine owners is to select the extraction rate that maximizes the net present value of their extraction receipts. The demand model uses a concept that allows for explicit recognition of the prospect of arbitrage between a natural-resource market and the market for other capital goods. The empirical approach used for estimation was a recursive or causal model. The empirical results were consistent with the theoretical models. The coefficients of the demand and supply equations had the appropriate signs. Tests for causality were conducted to validate the use of the causal model. The results obtained were favorable. The implications of the findings for future studies of exhaustible resources are: (1) in some cases causal models are the appropriate specification for empirical analysis; (2) supply models should incorporate a measure to capture depletion effects.

  19. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
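
    The covariate-free empirical building block behind such models is the Mann-Whitney estimate of the area under the ROC curve; a minimal sketch with hypothetical test scores (the paper's contribution is the regression modelling of covariate effects on top of this):

```python
def empirical_auc(controls, cases):
    """Mann-Whitney estimate of the area under the empirical ROC curve:
    P(case score > control score), with ties counted as 1/2."""
    total = 0.0
    for y in cases:
        for x in controls:
            total += 1.0 if y > x else (0.5 if y == x else 0.0)
    return total / (len(cases) * len(controls))
```

    An AUC of 0.5 corresponds to a non-informative test, 1.0 to perfect separation of cases from controls.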

  20. Bayesian Nonparametric Hidden Markov Models with application to the analysis of copy-number-variation in mammalian genomes.

    Science.gov (United States)

    Yau, C; Papaspiliopoulos, O; Roberts, G O; Holmes, C

    2011-01-01

    We consider the development of Bayesian Nonparametric methods for product partition models such as Hidden Markov Models and change point models. Our approach uses a Mixture of Dirichlet Process (MDP) model for the unknown sampling distribution (likelihood) for the observations arising in each state and a computationally efficient data augmentation scheme to aid inference. The method uses novel MCMC methodology which combines recent retrospective sampling methods with the use of slice sampler variables. The methodology is computationally efficient, both in terms of MCMC mixing properties, and robustness to the length of the time series being investigated. Moreover, the method is easy to implement requiring little or no user-interaction. We apply our methodology to the analysis of genomic copy number variation.

  1. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-Miller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences Basic Principles of Research Planning for Research Types of Research Designs Sampling Procedures Validity and Reliability of Measurement Instruments Steps of the Research Process Introduction to Nonparametric Statistics Data Analysis Overview of Nonparametric Statistics and Parametric Statistics Overview of Parametric Statistics Overview of Nonparametric Statistics Importance of Nonparametric Methods Measurement Instruments Analysis of Data to Determine Association and Agreement Pearson Chi-Square Test of Association and Independence Contingency

  2. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural … currencies. Results show the nonparametric model generally dominates the others when evaluated in-sample. However, the semiparametric model is best for out-of-sample analysis.

  3. Unemployment and Mental Disorders - An Empirical Analysis

    DEFF Research Database (Denmark)

    Agerbo, Esben; Eriksson, Tor Viking; Mortensen, Preben Bo

    1998-01-01

    The purpose of this paper is also to analyze the importance of unemployment and other social factors as risk factors for impaired mental health. It departs from previous studies in that we make use of information about first admissions to a psychiatric hospital or ward as our measure of mental… from the Psychiatric case register. Secondly, we estimate conditional logistic regression models for case-control data on first admissions to a psychiatric hospital. The explanatory variables in the empirical analysis include age, gender, education, marital status, income, wealth, and unemployment (and

  4. Decision support using nonparametric statistics

    CERN Document Server

    Beatty, Warren

    2018-01-01

    This concise volume covers the nonparametric statistics topics that are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.

  5. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  6. An Empirical Analysis of the Budget Deficit

    Directory of Open Access Journals (Sweden)

    Ioan Talpos

    2007-11-01

    Full Text Available Economic policies and, particularly, fiscal policies are not designed and implemented in an "empty space": the structural characteristics of the economic systems, the institutional architecture of societies, the cultural paradigm and the power relations between different social groups define the borders of these policies. This paper tries to deal with these borders, to describe their nature and the implications of their existence for the quality and impact of fiscal policies, at a theoretical level as well as an empirical one. The main results of the proposed analysis support the idea that the mentioned variables matter both for the social mandate entrusted by the society to the state, and thus for the role and functions of the state, and for economic growth as supported by the resources collected and distributed by the public authorities.

  7. Hybrid elementary flux analysis/nonparametric modeling: application for bioprocess control

    Directory of Open Access Journals (Sweden)

    Alves Paula M

    2007-01-01

    Full Text Available Abstract Background The progress in the "-omic" sciences has allowed a deeper knowledge on many biological systems with industrial interest. This knowledge is still rarely used for advanced bioprocess monitoring and control at the bioreactor level. In this work, a bioprocess control method is presented, which is designed on the basis of the metabolic network of the organism under consideration. The bioprocess dynamics are formulated using hybrid rigorous/data-driven systems and its inherent structure is defined by the metabolism's elementary modes. Results The metabolic network of the system under study is decomposed into elementary modes (EMs), which are the simplest paths able to operate coherently in steady-state. A reduced reaction mechanism in the form of simplified reactions connecting substrates with end-products is obtained. A dynamical hybrid system integrating material balance equations, EM reaction stoichiometry and kinetics was formulated. EM kinetics were defined as the product of two terms: a mechanistic/empirical known term and an unknown term that must be identified from data, in a process optimisation perspective. This approach allows the quantification of fluxes carried by individual elementary modes, which is of great help to identify dominant pathways as a function of environmental conditions. The methodology was employed to analyse experimental data of recombinant Baby Hamster Kidney (BHK-21A) cultures producing a recombinant fusion glycoprotein. The identified EM kinetics demonstrated typical glucose and glutamine metabolic responses during cell growth and IgG1-IL2 synthesis. Finally, an online optimisation study was conducted in which the optimal feeding strategies of glucose and glutamine were calculated after re-estimation of model parameters at each sampling time. An improvement in the final product concentration was obtained as a result of this online optimisation. Conclusion The main contribution of this work is a

  8. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process. We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.
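As an illustrative sketch only (not the authors' actual statistic or its asymptotic calibration), the empirical copula can be computed from rank-based pseudo-observations, and exchangeability (symmetry) probed by comparing C_n(u, v) with C_n(v, u) over a grid. The bivariate sample below is hypothetical:

```python
import random

random.seed(1)
n = 400
# Hypothetical exchangeable sample: X and Y share a common factor, so their
# copula is symmetric and the statistic below should stay small.
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [zi + random.gauss(0, 0.5) for zi in z]

def pseudo_obs(v):
    """Rank-transform to (0, 1): margin-free pseudo-observations."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = rank / (len(v) + 1)
    return r

u, w = pseudo_obs(x), pseudo_obs(y)

def emp_copula(a, b):
    """Empirical copula C_n(a, b): fraction of points with U <= a and W <= b."""
    return sum(1 for ui, wi in zip(u, w) if ui <= a and wi <= b) / n

# Crude symmetry check: sup over a grid of |C_n(a, b) - C_n(b, a)|
grid = [i / 10 for i in range(1, 10)]
stat = max(abs(emp_copula(a, b) - emp_copula(b, a)) for a in grid for b in grid)
```

A formal test would calibrate `stat` against the null distribution of the empirical copula process (e.g. by a multiplier bootstrap), which is the part the paper supplies.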

  9. Implementation and evaluation of nonparametric regression procedures for sensitivity analysis of computationally demanding models

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Swiler, Laura P.; Helton, Jon C.; Sallaberry, Cedric J.

    2009-01-01

    The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this presentation, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte-Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables and/or they ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporate this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
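To make the idea concrete, here is a minimal sketch of estimating a first-order sensitivity index with a bootstrap confidence interval. A cheap analytic toy model stands in for the expensive simulator, and a crude binning estimator stands in for a fitted meta-model; all values below are assumptions for illustration:

```python
import random

random.seed(7)
n = 2000
x1 = [random.random() for _ in range(n)]
x2 = [random.random() for _ in range(n)]
# Toy stand-in for an expensive computer model: y depends strongly on x1, weakly on x2.
y = [4.0 * a + 0.5 * b + random.gauss(0, 0.1) for a, b in zip(x1, x2)]

def first_order_index(x, y, bins=20):
    """Crude S_i = Var(E[Y|X_i]) / Var(Y), estimated by binning on x."""
    n = len(y)
    ybar = sum(y) / n
    var_y = sum((v - ybar) ** 2 for v in y) / n
    groups = [[] for _ in range(bins)]
    for xi, yi in zip(x, y):
        groups[min(int(xi * bins), bins - 1)].append(yi)
    var_cond = sum(len(g) * (sum(g) / len(g) - ybar) ** 2 for g in groups if g) / n
    return var_cond / var_y

def bootstrap_ci(x, y, b=200, alpha=0.05):
    """Percentile bootstrap CI for the sensitivity index, resampling (x, y) pairs."""
    n = len(y)
    stats = []
    for _ in range(b):
        idx = [random.randrange(n) for _ in range(n)]
        stats.append(first_order_index([x[i] for i in idx], [y[i] for i in idx]))
    stats.sort()
    return stats[int(b * alpha / 2)], stats[int(b * (1 - alpha / 2)) - 1]

s1 = first_order_index(x1, y)   # dominant input, close to 1
s2 = first_order_index(x2, y)   # weak input, close to 0
lo1, hi1 = bootstrap_ci(x1, y)
```

The paper's contribution is to do this with proper nonparametric regression surrogates and to account for the surrogate's estimation error; the binning above is only the simplest member of that family.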

  10. Non-parametric analysis of technical efficiency: factors affecting efficiency of West Java rice farms

    Czech Academy of Sciences Publication Activity Database

    Brázdik, František

    -, č. 286 (2006), s. 1-45 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : rice farms * data envelopment analysis Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp286.pdf

  11. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Measuring the efficiency of dental departments in medical centers: a nonparametric analysis approach.

    Science.gov (United States)

    Wang, Su-Chen; Tsai, Chi-Cheng; Huang, Shun-Te; Hong, Yu-Jue

    2002-12-01

    Data envelopment analysis (DEA), a cross-sectional study design based on secondary data analysis, was used to evaluate the relative operational efficiency of 16 dental departments in medical centers in Taiwan in 1999. The results indicated that 68.7% of all dental departments in medical centers had poor performance in terms of overall efficiency and scale efficiency. All relatively efficient dental departments were in private medical centers. Half of these dental departments were unable to fully utilize available medical resources. 75.0% of public medical centers did not take full advantage of medical resources at their disposal. In the returns to scale, 56.3% of dental departments in medical centers exhibited increasing returns to scale, due to the insufficient scale influencing overall hospital operational efficiency. Public medical centers accounted for 77.8% of the institutions affected. The scale of dental departments in private medical centers was more appropriate than those in public medical centers. In the sensitivity analysis, the numbers of residents, interns, and published papers were used to assess teaching and research. Greater emphasis on teaching and research in medical centers has a large effect on the relative inefficiency of hospital operation. Dental departments in private medical centers had a higher mean overall efficiency score than those in public medical centers, and the overall efficiency of dental departments in non-university hospitals was greater than those in university hospitals. There was no information to evaluate the long-term efficiency of each dental department in all hospitals. A different combination of input and output variables, using common multipliers for efficiency value measurements in DEA, may help establish different pioneering dental departments in hospitals.
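For readers unfamiliar with DEA, the following minimal sketch implements the basic input-oriented CCR model as one linear program per decision-making unit, assuming NumPy and SciPy are available. The clinic-style data are entirely hypothetical; the study's actual inputs, outputs, and model variant are not reproduced here:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA. X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).
    Returns the radial efficiency theta for each DMU (1.0 = efficient)."""
    n, m = X.shape
    _, s = Y.shape
    thetas = []
    for o in range(n):
        # Variables: [theta, lambda_1 .. lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ji - theta * x_oi <= 0 for each input i
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: -sum_j lambda_j * y_jr <= -y_or for each output r
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1))
        thetas.append(res.x[0])
    return np.array(thetas)

# Hypothetical department data: inputs = [staff, chairs], output = [visits]
X = np.array([[5.0, 4.0], [8.0, 6.0], [6.0, 5.0], [10.0, 10.0]])
Y = np.array([[100.0], [120.0], [120.0], [150.0]])
eff = dea_ccr_input(X, Y)
```

A score of 1.0 marks the efficient frontier; scores below 1.0 give the proportional input contraction that a comparable composite unit could achieve.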

  13. Non-parametric trend analysis of the aridity index for three large arid and semi-arid basins in Iran

    Science.gov (United States)

    Ahani, Hossien; Kherad, Mehrzad; Kousari, Mohammad Reza; van Roosmalen, Lieke; Aryanfar, Ramin; Hosseini, Seyyed Mashaallah

    2013-05-01

    Currently, an important scientific challenge that researchers are facing is to gain a better understanding of climate change at the regional scale, which can be especially challenging in an area with low and highly variable precipitation amounts such as Iran. Trend analysis of the medium-term change using ground station observations of meteorological variables can enhance our knowledge of the dominant processes in an area and contribute to the analysis of future climate projections. Generally, studies focus on the long-term variability of temperature and precipitation and to a lesser extent on other important parameters such as moisture indices. In this study the recent 50-year trends (1955-2005) of precipitation (P), potential evapotranspiration (PET), and aridity index (AI) at the monthly time scale were studied over 14 synoptic stations in three large Iranian basins using the Mann-Kendall non-parametric test. Additionally, an analysis of the monthly, seasonal and annual trend of each parameter was performed. Results showed no significant trends in the monthly time series. However, PET showed significant, mostly decreasing trends, for the seasonal values, which resulted in a significant negative trend in annual PET at five stations. Significant negative trends in seasonal P values were only found at a number of stations in spring and summer and no station showed significant negative trends in annual P. Due to the varied positive and negative trends in annual P and to a lesser extent PET, almost as many stations with negative as positive trends in annual AI were found, indicating that both drying and wetting trends occurred in Iran. Overall, the northern part of the study area showed an increasing trend in annual AI which meant that the region became wetter, while the south showed decreasing trends in AI.
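The Mann-Kendall test used in this study is easy to state: the statistic S counts concordant minus discordant pairs in the series and is compared against its variance under the no-trend null. A minimal no-ties sketch (the short series below is made up, not the Iranian station data):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no ties): returns S, Z and two-sided p-value."""
    n = len(x)
    # S = number of concordant pairs minus number of discordant pairs
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # null variance without ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)         # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal approximation
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, z, p

rising = [1, 2, 3, 5, 4, 6, 8, 7, 9, 11, 10, 12]   # hypothetical mostly-rising series
s, z, p = mann_kendall(rising)
```

With ties or serial correlation the variance term needs correction, which is where applied implementations differ.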

  14. Adaptive Kernel Canonical Correlation Analysis Algorithms for Nonparametric Identification of Wiener and Hammerstein Systems

    Directory of Open Access Journals (Sweden)

    Ignacio Santamaría

    2008-04-01

    Full Text Available This paper treats the identification of nonlinear systems that consist of a cascade of a linear channel and a nonlinearity, such as the well-known Wiener and Hammerstein systems. In particular, we follow a supervised identification approach that simultaneously identifies both parts of the nonlinear system. Given the correct restrictions on the identification problem, we show how kernel canonical correlation analysis (KCCA) emerges as the logical solution to this problem. We then extend the proposed identification algorithm to an adaptive version allowing it to deal with time-varying systems. In order to avoid overfitting problems, we discuss and compare three possible regularization techniques for both the batch and the adaptive versions of the proposed algorithm. Simulations are included to demonstrate the effectiveness of the presented algorithm.

  15. A Parcellation Based Nonparametric Algorithm for Independent Component Analysis with Application to fMRI Data

    Directory of Open Access Journals (Sweden)

    Shanshan eLi

    2016-01-01

    Full Text Available Independent Component Analysis (ICA) is a widely used technique for separating signals that have been mixed together. In this manuscript, we propose a novel ICA algorithm using density estimation and maximum likelihood, where the densities of the signals are estimated via p-spline based histogram smoothing and the mixing matrix is simultaneously estimated using an optimization algorithm. The algorithm is exceedingly simple, easy to implement and blind to the underlying distributions of the source signals. To relax the identically distributed assumption in the density function, a modified algorithm is proposed to allow for different density functions on different regions. The performance of the proposed algorithm is evaluated in different simulation settings. For illustration, the algorithm is applied to a research investigation with a large collection of resting state fMRI datasets. The results show that the algorithm successfully recovers the established brain networks.

  16. A critique of non-parametric efficiency analysis in energy economics studies

    International Nuclear Information System (INIS)

    Chen, Chien-Ming

    2013-01-01

    The paper reexamines non-additive environmental efficiency models with weakly-disposable undesirable outputs that have appeared in the literature of energy economics. These efficiency models are used in numerous studies published in this journal and other energy-related outlets. Recent studies, however, have found key limitations of the weak-disposability assumption in its application to environmental efficiency analysis. It is found that efficiency scores obtained from non-additive efficiency models can be non-monotonic in pollution quantities under the weak-disposability assumption — which is against common intuition and the principle of environmental economics. In this paper, I present a taxonomy of efficiency models found in the energy economics literature, illustrate the above limitations, and discuss implications of monotonicity from a practical viewpoint. Finally, I review the formulations for a variable returns-to-scale technology with weakly-disposable undesirable outputs, which have been misused in a number of papers in the energy economics literature. An application to evaluating the energy efficiencies of 23 European Union states is presented to illustrate the problem. - Highlights: • Review different environmental efficiency models used in energy economics studies • Highlight limitations of these environmental efficiency models • These limitations have not been recognized in the existing energy economics literature. • Data from 23 European Union states are used to illustrate the methodological consequences

  17. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
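The core idea of assessing the significance of correlated timing residuals nonparametrically can be sketched with a simple permutation test. The two short series below are invented, and the paper's actual statistic and its handling of epoch matching are considerably more involved:

```python
import random

random.seed(0)

def perm_corr_pvalue(a, b, n_perm=2000):
    """Two-sided permutation p-value for the Pearson correlation of a and b."""
    n = len(a)
    def corr(u, v):
        mu, mv = sum(u) / n, sum(v) / n
        num = sum((x - mu) * (y - mv) for x, y in zip(u, v))
        den = (sum((x - mu) ** 2 for x in u) * sum((y - mv) ** 2 for y in v)) ** 0.5
        return num / den
    observed = corr(a, b)
    b_shuf = list(b)
    hits = 0
    for _ in range(n_perm):
        random.shuffle(b_shuf)                       # break any true pairing
        if abs(corr(a, b_shuf)) >= abs(observed):
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)       # add-one p-value estimate

# Hypothetical, strongly anti-correlated TTV residuals (minutes) for two candidates
ttv1 = [0.1, -0.4, 0.7, -0.2, 0.5, -0.6, 0.3, -0.1, 0.6, -0.5]
ttv2 = [-0.2, 0.5, -0.6, 0.3, -0.4, 0.7, -0.2, 0.2, -0.5, 0.4]
r, p = perm_corr_pvalue(ttv1, ttv2)
```

Anti-correlated TTVs are exactly the signature expected for two planets exchanging orbital energy, which is why a small permutation p-value supports membership in the same system.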

  18. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    International Nuclear Information System (INIS)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David; Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Buchhave, Lars A.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  19. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C. [Astronomy Department, University of Florida, 211 Bryant Space Sciences Center, Gainesville, FL 32611 (United States); Fabrycky, Daniel C. [UCO/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States); Steffen, Jason H. [Fermilab Center for Particle Astrophysics, P.O. Box 500, MS 127, Batavia, IL 60510 (United States); Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Welsh, William F. [Astronomy Department, San Diego State University, San Diego, CA 92182-1221 (United States); Allen, Christopher [Orbital Sciences Corporation/NASA Ames Research Center, Moffett Field, CA 94035 (United States); Batalha, Natalie M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192 (United States); Buchhave, Lars A., E-mail: eford@astro.ufl.edu [Niels Bohr Institute, Copenhagen University, DK-2100 Copenhagen (Denmark); Collaboration: Kepler Science Team; and others

    2012-05-10

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  20. Proposing a framework for airline service quality evaluation using Type-2 Fuzzy TOPSIS and non-parametric analysis

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper focuses on evaluating airline service quality from the perspective of passengers' views. Much research on airline service quality evaluation has been performed worldwide, but little has yet been conducted in Iran. In this study, a framework for measuring airline service quality in Iran is proposed. After reviewing airline service quality criteria, the SSQAI model was selected because of its comprehensiveness in covering airline service quality dimensions. The SSQAI questionnaire items were redesigned to adapt to Iranian airlines' requirements and the environmental circumstances of Iran's economic and cultural context. This study applies fuzzy decision-making theory, considering the possible fuzzy subjective judgment of the evaluators during airline service quality evaluation. Fuzzy TOPSIS was applied to rank the airlines' service quality performances. Three major Iranian airlines with the largest passenger transfer volumes in domestic and foreign flights were chosen for evaluation in this research. Results demonstrated that Mahan airline achieved the best service quality performance rank in gaining passengers' satisfaction through the delivery of high-quality services to its passengers, among the three major Iranian airlines. IranAir and Aseman airlines placed second and third, respectively, according to passengers' evaluations. Statistical analysis was used to analyze passenger responses. Due to the non-normality of the data, non-parametric tests were applied. To compare airline ranks on every criterion separately, the Friedman test was performed. Variance analysis and the Tukey test were applied to study the influence of increasing age and educational level of passengers on their degree of satisfaction with airline service quality. Results showed that age has no significant relation to passenger satisfaction with airlines; however, increasing educational level demonstrated a negative impact on
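As background, the crisp (non-fuzzy) TOPSIS procedure that the paper's Type-2 fuzzy variant generalises can be sketched in a few lines: normalise the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. The airline ratings, weights, and criteria below are entirely hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Classic (crisp) TOPSIS. matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if larger is better. Returns closeness scores in [0, 1]."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each criterion column, then apply the weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal points per criterion
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j]
            else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical airline ratings: [comfort, punctuality, complaint rate];
# complaint rate is a cost criterion (smaller is better).
ratings = [[8.0, 7.0, 2.0], [6.0, 8.0, 4.0], [5.0, 5.0, 5.0]]
scores = topsis(ratings, weights=[0.4, 0.4, 0.2], benefit=[True, True, False])
```

The fuzzy extension replaces each crisp rating with a (Type-2) fuzzy number and the distances with fuzzy distance measures, but the ranking logic is the same.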

  1. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interests, since the quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on behaviors of Internet users. Six large-scale systems are studied in our experiments, including the movie-watching in Netflix and MovieLens, the transaction in eBay, the bookmark-collecting in Delicious, and the posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strongly positive correlation between user's activity and the total number of user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method could to some extent eliminate the different statistics among users caused by the different activities, yet the effectiveness depends on the data sets.
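A common first step in such analyses is fitting a tail exponent to the heavy-tailed inter-event time distribution. A minimal maximum-likelihood sketch on synthetic Pareto data (not the paper's datasets; the cutoff and exponent are assumptions for illustration):

```python
import random
import math

random.seed(42)
x_min, alpha_true, n = 1.0, 2.5, 5000
# Draw Pareto-tailed samples by inverse transform:
# survival function S(x) = (x / x_min)^-(alpha - 1) for pdf p(x) ~ x^-alpha
data = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
        for _ in range(n)]

# Continuous power-law MLE: alpha_hat = 1 + n / sum ln(x / x_min)
alpha_hat = 1.0 + n / sum(math.log(x / x_min) for x in data)
sigma = (alpha_hat - 1.0) / math.sqrt(n)   # approximate standard error
```

In practice `x_min` itself must be estimated (e.g. by minimising a Kolmogorov-Smirnov distance over candidate cutoffs), and competing hypotheses such as log-normal tails should be compared before claiming a power law.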

  2. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice the analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. On Cooper's Nonparametric Test.

    Science.gov (United States)

    Schmeidler, James

    1978-01-01

    The basic assumption of Cooper's nonparametric test for trend (EJ 125 069) is questioned. It is contended that the proper assumption alters the distribution of the statistic and reduces its usefulness. (JKS)

  4. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  5. An empirical analysis of Diaspora bonds

    OpenAIRE

    AKKOYUNLU, Şule; STERN, Max

    2018-01-01

    Abstract. This study is the first to investigate theoretically and empirically the determinants of Diaspora Bonds for eight developing countries (Bangladesh, Ethiopia, Ghana, India, Lebanon, Pakistan, the Philippines, and Sri Lanka) and one developed country, Israel, for the period 1951 to 2008. Empirical results are consistent with the predictions of the theoretical model. The most robust variables are the closeness indicator and the sovereign rating, both on the demand-side. The spread is ...

  6. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be referred with caution, given the complexity of human longevity.

  7. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Full Text Available Abstract Background With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with normal functioning allograft. Results The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. Conclusion We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been
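The pathway-enrichment step described above (Fisher's exact test on the genes called differentially expressed) reduces to a one-sided hypergeometric tail probability. A stdlib-only sketch follows; apart from the 309 identified genes mentioned in the abstract, every count below (gene universe, pathway size, overlap) is a hypothetical value for illustration:

```python
from math import comb

def hypergeom_enrich_p(N, K, n, k):
    """One-sided Fisher's exact / hypergeometric tail: probability of seeing
    k or more pathway genes among n selected genes, given K pathway genes
    in a universe of N genes."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical counts: universe of 20000 genes, 100 in the pathway,
# 309 genes selected (as in the study), 8 of them in the pathway.
p = hypergeom_enrich_p(20000, 100, 309, 8)
```

The expected overlap by chance here is about 1.5 genes, so an overlap of 8 yields a small p-value; in practice the p-values are then corrected for testing many pathways.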

  8. Introduction to nonparametric statistics for the biological sciences using R

    CERN Document Server

    MacFarland, Thomas W

    2016-01-01

    This book contains a rich set of tools for nonparametric analyses, and the purpose of this supplemental text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences: to introduce when nonparametric approaches to data analysis are appropriate; to introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test; and to introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set. The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses a...

  9. Robust variable selection method for nonparametric differential equation models with application to nonlinear dynamic gene regulatory network analysis.

    Science.gov (United States)

    Lu, Tao

    2016-01-01

    A gene regulatory network (GRN) describes the interactions between genes and looks for models that capture gene expression behavior. These models have many applications; for instance, by characterizing the gene expression mechanisms that cause certain disorders, it would be possible to target those genes to block the progress of the disease. Many biological processes are driven by nonlinear dynamic GRNs. In this article, we propose a nonparametric ordinary differential equation (ODE) model for nonlinear dynamic GRNs. Specifically, we address the following questions simultaneously: (i) extract information from noisy time course gene expression data; (ii) model the nonlinear ODE through a nonparametric smoothing function; (iii) identify the important regulatory gene(s) through a group smoothly clipped absolute deviation (SCAD) approach; (iv) test the robustness of the model against possible shortening of experimental duration. We illustrate the usefulness of the model and associated statistical methods through simulation and real application examples.

  10. Insurability of Cyber Risk: An Empirical Analysis

    OpenAIRE

    Biener, Christian; Eling, Martin; Wirfs, Jan Hendrik

    2015-01-01

    This paper discusses the adequacy of insurance for managing cyber risk. To this end, we extract 994 cases of cyber losses from an operational risk database and analyse their statistical properties. Based on the empirical results and recent literature, we investigate the insurability of cyber risk by systematically reviewing the set of criteria introduced by Berliner (1982). Our findings emphasise the distinct characteristics of cyber risks compared with other operational risks and bring to li...

  11. Compassion: An Evolutionary Analysis and Empirical Review

    OpenAIRE

    Goetz, Jennifer L.; Keltner, Dacher; Simon-Thomas, Emiliana

    2010-01-01

    What is compassion? And how did it evolve? In this review, we integrate three evolutionary arguments that converge on the hypothesis that compassion evolved as a distinct affective experience whose primary function is to facilitate cooperation and protection of the weak and those who suffer. Our empirical review reveals compassion to have distinct appraisal processes attuned to undeserved suffering, distinct signaling behavior related to caregiving patterns of touch, posture, and vocalization...

  12. Theory of nonparametric tests

    CERN Document Server

    Dickhaus, Thorsten

    2018-01-01

    This textbook provides a self-contained presentation of the main concepts and methods of nonparametric statistical testing, with a particular focus on the theoretical foundations of goodness-of-fit tests, rank tests, resampling tests, and projection tests. The substitution principle is employed as a unified approach to the nonparametric test problems discussed. In addition to mathematical theory, it also includes numerous examples and computer implementations. The book is intended for advanced undergraduate, graduate, and postdoc students as well as young researchers. Readers should be familiar with the basic concepts of mathematical statistics typically covered in introductory statistics courses.

  13. CATDAT : A Program for Parametric and Nonparametric Categorical Data Analysis : User's Manual Version 1.0, 1998-1999 Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-12-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical responses. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes four analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.

  14. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  15. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.
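    The Dirichlet process mixture models reviewed above avoid fixing a finite number of classes. One standard way to see how a DP generates countably many mixture weights is Sethuraman's stick-breaking construction; the truncated numpy sketch below is illustrative and the concentration value is arbitrary.

    ```python
    import numpy as np

    def stick_breaking_weights(alpha, n_atoms, rng):
        """Truncated stick-breaking construction of Dirichlet process weights:
        v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k} (1 - v_j)."""
        v = rng.beta(1.0, alpha, size=n_atoms)
        remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
        return v * remaining

    rng = np.random.default_rng(0)
    w = stick_breaking_weights(alpha=2.0, n_atoms=200, rng=rng)
    # Weights are positive and, up to truncation, sum to one; a small
    # alpha concentrates mass on a few mixture components.
    ```

    In a DP mixture, each weight w_k is paired with an atom drawn from the base measure, so the number of effectively occupied classes adapts to the data rather than being fixed in advance.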

  16. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    capture the behavior of observed phenomena. Higher-order polynomial and finite-dimensional spline basis models allow for more complicated responses as the...flexibility as these are nonparametric (not constrained to any particular functional form). These should be useful in identifying nonstandard behavior via... deviance ∆ = −2 log(Lreduced/Lfull) is defined in terms of the likelihood function L. For normal error, Lfull = 1, and based on Eq. A-2, we have log

  17. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

    structures evolve on a similar timescale to individual-level transmission, we investigated the process of transmission through a model population comprising social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.

  18. Government debt in Greece: An empirical analysis

    Directory of Open Access Journals (Sweden)

    Gisele Mah

    2014-06-01

    Full Text Available Greek government debt has been increasing above the percentage stated in the growth and stability path, from 112.9% in 2008 to 175.6% in 2013. This paper investigates the determinants of general government debt in Greece by means of a Vector Error Correction Model framework, Variance Decomposition and Generalized Impulse Response Function Analysis. The analysis showed a significant negative relationship between general government debt and the government deficit, and between general government debt and inflation. Shocks to the government deficit and to inflation will cause general government debt to increase. The government deficit should be increased, since gross capital formation is included in its calculation, which could be invested in income-generating projects. The current account balance should be reduced by improving the net trade balance.

  19. Microscopic saw mark analysis: an empirical approach.

    Science.gov (United States)

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The presented study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating the potential for variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighed the variables based on discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.

  20. THE LISBON STRATEGY: AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Silvestri Marcello

    2010-07-01

    Full Text Available This paper investigates European economic integration within the framework of the 2000 Lisbon Council, with the aim of studying the dynamics affecting the social and economic life of European countries. This descriptive investigation focuses on certain significant variables of the new theories highlighting the importance of technological innovation and human capital. To this end, the multivariate statistical technique of Principal Component Analysis has been applied in order to classify countries with regard to the investigated phenomenon.
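    A Principal Component Analysis step of this kind can be sketched via an SVD of the standardized country-by-indicator matrix. The data below are synthetic and the indicator names merely hypothetical examples, not the study's variables.

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Principal component scores via SVD of the column-standardized data,
        plus the share of variance explained by each retained component."""
        Z = (X - X.mean(axis=0)) / X.std(axis=0)
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        explained = s ** 2 / np.sum(s ** 2)
        return Z @ Vt[:n_components].T, explained[:n_components]

    # Synthetic matrix: rows = countries, columns = indicators
    # (e.g. R&D intensity, tertiary education rate, patent counts).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(27, 5))
    scores, explained = pca_scores(X)
    ```

    Plotting the first two score columns gives the usual country map on which clusters (groups of countries behaving similarly on the indicators) can be read off.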

  1. Some connections for manuals of empirical logic to functional analysis

    International Nuclear Information System (INIS)

    Cook, T.A.

    1981-01-01

    In this informal presentation, the theory of manuals of operations is connected with some familiar concepts in functional analysis; namely, base normed and order unit normed spaces. The purpose of this discussion is to present several general open problems which display the interplay of empirical logic with functional analysis. These are mathematical problems with direct physical interpretation. (orig./HSI)

  2. Empirical analysis of industrial operations in Montenegro

    Directory of Open Access Journals (Sweden)

    Galić Jelena

    2012-12-01

    Full Text Available Since the start of the transition process, industrial production in Montenegro has faced serious problems, and its share in GDP is constantly decreasing. The global financial crisis has to a large extent negatively influenced industry. Analysis of financial indicators showed that industry suffered significant losses, undercapitalisation and liquidity problems. Looking at individual industry sectors, the situation is more favourable in the production of electricity, gas and water than in extractive industry and mining. The paper proposes economic policy measures to improve the situation in industry.

  3. Nonparametric trend estimation in the presence of fractal noise: application to fMRI time-series analysis.

    Science.gov (United States)

    Afshinpour, Babak; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2008-06-30

    Unknown low frequency fluctuations called "trend" are observed in noisy time-series measured for different applications. In some disciplines, they carry primary information while in other fields such as functional magnetic resonance imaging (fMRI) they carry nuisance effects. In all cases, however, it is necessary to estimate them accurately. In this paper, a method for estimating trend in the presence of fractal noise is proposed and applied to fMRI time-series. To this end, a partly linear model (PLM) is fitted to each time-series. The parametric and nonparametric parts of PLM are considered as contributions of hemodynamic response and trend, respectively. Using the whitening property of wavelet transform, the unknown components of the model are estimated in the wavelet domain. The results of the proposed method are compared to those of other parametric trend-removal approaches such as spline and polynomial models. It is shown that the proposed method improves activation detection and decreases variance of the estimated parameters relative to the other methods.
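    The idea of estimating trend in the wavelet domain can be illustrated crudely: keep only coarse-scale content and discard the fine scales where (fractal) noise dominates. The Haar-averaging sketch below is a simplified stand-in, not the partly linear model estimator of the paper, and the test signal is synthetic.

    ```python
    import numpy as np

    def haar_trend(y, levels=4):
        """Crude wavelet-domain trend estimate: repeatedly Haar-average the
        series (keeping approximation coefficients only), then upsample back.
        Fine-scale detail coefficients are implicitly discarded."""
        n = len(y)
        approx = np.asarray(y, float)
        for _ in range(levels):
            if len(approx) % 2:               # pad to even length
                approx = np.append(approx, approx[-1])
            approx = 0.5 * (approx[0::2] + approx[1::2])
        return np.repeat(approx, 2 ** levels)[:n]

    t = np.arange(256)
    y = 0.05 * t + np.sin(2 * np.pi * t / 16)   # slow drift + fast oscillation
    trend = haar_trend(y, levels=4)
    ```

    The recovered staircase follows the slow drift while the fast oscillation averages out; a full wavelet estimator would additionally threshold detail coefficients rather than simply dropping them.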

  4. A Unified Discussion on the Concept of Score Functions Used in the Context of Nonparametric Linkage Analysis

    Directory of Open Access Journals (Sweden)

    Lars Ängquist

    2008-01-01

    Full Text Available In this article we discuss nonparametric linkage (NPL) score functions within a broad and quite general framework. The main focus of the paper is the structure, derivation principles and interpretations of the score function entity itself. We define and discuss several families of one-locus score function definitions, i.e. the implicit, explicit and optimal ones. Some generalizations and comments on the two-locus, unconditional and conditional, cases are included as well. Although this article mainly aims at serving as an overview, where the concept of score functions is put into a covering context, we generalize the noncentrality parameter (NCP) optimal score functions in Ängquist et al. (2007) to facilitate, through weighting, the incorporation of several plausible distinct genetic models. Since the genetic model itself is most often to some extent unknown, this facilitates weaker prior assumptions with respect to plausible true disease models without losing the property of NCP-optimality. Moreover, we discuss general assumptions and properties of score functions in the above sense. For instance, the concept of identical-by-descent (IBD) sharing structures and score function equivalence are discussed in some detail.

  5. Parametric and Nonparametric EEG Analysis for the Evaluation of EEG Activity in Young Children with Controlled Epilepsy

    Directory of Open Access Journals (Sweden)

    Vangelis Sakkalis

    2008-01-01

    Full Text Available There is important evidence of differences in the EEG frequency spectrum of control subjects as compared to epileptic subjects. In particular, the study of children presents difficulties due to the early stages of brain development and the various forms of epilepsy indications. In this study, we consider children who developed epileptic crises in the past but without any other clinical, psychological, or visible neurophysiological findings. The aim of the paper is to develop reliable techniques for testing whether such controlled epilepsy induces related spectral differences in the EEG. Spectral features extracted by using nonparametric, signal representation techniques (Fourier and wavelet transform) and a parametric, signal modeling technique (ARMA) are compared, and their effect on the classification of the two groups is analyzed. The subjects performed two different tasks: a control (rest) task and a relatively difficult math task. The results show that spectral features extracted by modeling the EEG signals recorded from individual channels by an ARMA model give a higher discrimination between the two subject groups for the control task, where classification scores of up to 100% were obtained with a linear discriminant classifier.
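    Parametric spectral features of the kind described can be sketched with a pure-AR special case: fit AR coefficients by the Yule-Walker equations and evaluate the model's power spectral density. This is an illustrative simplification of the ARMA pipeline, on a synthetic "rhythm plus noise" signal with arbitrary parameters.

    ```python
    import numpy as np

    def ar_psd(x, order, freqs):
        """Fit AR(p) via Yule-Walker, then evaluate the AR spectrum
        sigma^2 / |1 - sum_k a_k exp(-i 2 pi f k)|^2 at frequencies `freqs`
        (in cycles per sample)."""
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)]) / n
        R = r[np.abs(np.subtract.outer(np.arange(order), np.arange(order)))]
        a = np.linalg.solve(R, r[1:order + 1])        # AR coefficients
        sigma2 = r[0] - a @ r[1:order + 1]            # innovation variance
        e = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
        return sigma2 / np.abs(1.0 - e @ a) ** 2

    # Synthetic signal: 10 Hz sine in noise, sampled at 128 Hz.
    rng = np.random.default_rng(2)
    fs = 128
    t = np.arange(1024) / fs
    x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
    freqs = np.linspace(0.5, 30, 60) / fs
    psd = ar_psd(x, order=8, freqs=freqs)
    ```

    Band powers read off such a spectrum (e.g. over classical EEG bands) are the kind of parametric features that would then be fed to a discriminant classifier.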

  6. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residual analysis techniques. (author)

  7. An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles

    1999-01-01

    Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)

  8. An Empirical Analysis of the Relationship between Minimum Wage ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Relationship between Minimum Wage, Investment and Economic Growth in Ghana. ... In addition, the ratio of public investment to tax revenue must increase as minimum wage increases since such complementary changes are more likely to lead to economic growth. Keywords: minimum wage ...

  9. Energy-saving and emission-abatement potential of Chinese coal-fired power enterprise: A non-parametric analysis

    International Nuclear Information System (INIS)

    Wei, Chu; Löschel, Andreas; Liu, Bing

    2015-01-01

    In the context of soaring demand for electricity, mitigating and controlling greenhouse gas emissions is a great challenge for China's power sector. Increasing attention has been placed on the evaluation of energy efficiency and CO2 abatement potential in the power sector. However, studies at the micro-level are relatively rare due to serious data limitations. This study uses the 2004 and 2008 Census data of Zhejiang province to construct a non-parametric frontier in order to assess the abatement space of energy and associated CO2 emission from China's coal-fired power enterprises. A Weighted Russell Directional Distance Function (WRDDF) is applied to construct an energy-saving potential index and a CO2 emission-abatement potential index. Both indicators depict the inefficiency level in terms of energy utilization and CO2 emissions of electric power plants. Our results show a substantial variation of energy-saving potential and CO2 abatement potential among enterprises. We find that large power enterprises are less efficient in 2004, but become more efficient than smaller enterprises in 2008. State-owned enterprises (SOE) are not significantly different in 2008 from 2004, but perform better than their non-SOE counterparts in 2008. This change in performance for large enterprises and SOE might be driven by the “top-1000 Enterprise Energy Conservation Action” that was implemented in 2006. - Highlights: • Energy-saving potential and CO2 abatement-potential for Chinese power enterprise are evaluated. • The potential to curb energy and emission shows great variation and dynamic changes. • Large enterprise is less efficient than small enterprise in 2004, but more efficient in 2008. • The state-owned enterprise performs better than non-state-owned enterprise in 2008
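    The frontier idea behind such an energy-saving potential index can be sketched with a much simpler linear program than the paper's WRDDF: how far can one plant contract its energy input while staying inside the constant-returns hull of all observed plants? The three plants below are hypothetical single-input, single-output examples.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def energy_saving_potential(energy, output, k):
        """Share beta by which plant k could cut energy use while keeping
        its output, relative to a CRS frontier of all plants (simplified,
        single-input single-output directional distance)."""
        e = np.asarray(energy, float)
        y = np.asarray(output, float)
        n = len(e)
        c = np.zeros(n + 1)
        c[0] = -1.0                                   # maximize beta
        A_ub = np.zeros((2, n + 1))
        A_ub[0, 0] = e[k]                             # sum l_j e_j <= (1-b) e_k
        A_ub[0, 1:] = e
        A_ub[1, 1:] = -y                              # sum l_j y_j >= y_k
        b_ub = np.array([e[k], -y[k]])
        bounds = [(0.0, 1.0)] + [(0.0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        return res.x[0]

    # Hypothetical plants: (energy input, electricity output).
    energy = [100.0, 120.0, 90.0]
    output = [50.0, 50.0, 60.0]
    beta = energy_saving_potential(energy, output, k=1)
    ```

    Plant 1 produces 50 units with 120 units of energy, while the frontier plant manages 60 with 90, so the LP finds it could in principle shed 37.5% of its energy use; the WRDDF generalizes this to multiple inputs, outputs and undesirable outputs with Russell-type weights.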

  10. The 12-item World Health Organization Disability Assessment Schedule II (WHO-DAS II: a nonparametric item response analysis

    Directory of Open Access Journals (Sweden)

    Fernandez Ana

    2010-05-01

    Full Text Available Abstract Background Previous studies have analyzed the psychometric properties of the World Health Organization Disability Assessment Schedule II (WHO-DAS II) using classical omnibus measures of scale quality. These analyses are sample dependent and do not model item responses as a function of the underlying trait level. The main objective of this study was to examine the effectiveness of the WHO-DAS II items and their options in discriminating between changes in the underlying disability level by means of item response analyses. We also explored differential item functioning (DIF) in men and women. Methods The participants were 3615 adult general practice patients from 17 regions of Spain, with a first diagnosed major depressive episode. The 12-item WHO-DAS II was administered by the general practitioners during the consultation. We used a non-parametric item response method (Kernel-Smoothing), implemented in the TestGraf software, to examine the effectiveness of each item (item characteristic curves) and their options (option characteristic curves) in discriminating between changes in the underlying disability level. We examined composite DIF to know whether women had a higher probability than men of endorsing each item. Results Item response analyses indicated that the twelve items forming the WHO-DAS II perform very well. All items were determined to provide good discrimination across varying standardized levels of the trait. The items also had option characteristic curves that showed good discrimination, given that each increasing option became more likely than the previous as a function of increasing trait level. No gender-related DIF was found on any of the items. Conclusions All WHO-DAS II items were very good at assessing overall disability. Our results supported the appropriateness of the weights assigned to response option categories and showed an absence of gender differences in item functioning.
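    A kernel-smoothed item characteristic curve of the TestGraf kind can be approximated with a Nadaraya-Watson estimator: smooth the binary item responses against the (standardized) trait level. The sketch below uses a Gaussian kernel on a simulated logistic item; the bandwidth and item parameters are arbitrary assumptions for illustration.

    ```python
    import numpy as np

    def kernel_icc(theta, responses, grid, bandwidth=0.3):
        """Nadaraya-Watson estimate of an item characteristic curve:
        smoothed probability of endorsing the item as a function of the
        standardized trait level theta."""
        theta = np.asarray(theta, float)
        responses = np.asarray(responses, float)
        w = np.exp(-0.5 * ((grid[:, None] - theta[None, :]) / bandwidth) ** 2)
        return (w @ responses) / w.sum(axis=1)

    # Simulated check: responses from a logistic item with difficulty 0
    # and slope 1.5; the smoothed curve should rise with the trait level.
    rng = np.random.default_rng(3)
    theta = rng.normal(size=2000)
    p_true = 1.0 / (1.0 + np.exp(-1.5 * theta))
    responses = rng.random(2000) < p_true
    grid = np.linspace(-2, 2, 9)
    icc = kernel_icc(theta, responses, grid)
    ```

    Option characteristic curves work the same way, smoothing the indicator of each response option separately and checking that higher options dominate at higher trait levels.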

  11. Nonparametric combinatorial sequence models.

    Science.gov (United States)

    Wauthier, Fabian L; Jordan, Michael I; Jojic, Nebojsa

    2011-11-01

    This work considers biological sequences that exhibit combinatorial structures in their composition: groups of positions of the aligned sequences are "linked" and covary as one unit across sequences. If multiple such groups exist, complex interactions can emerge between them. Sequences of this kind arise frequently in biology but methodologies for analyzing them are still being developed. This article presents a nonparametric prior on sequences which allows combinatorial structures to emerge and which induces a posterior distribution over factorized sequence representations. We carry out experiments on three biological sequence families which indicate that combinatorial structures are indeed present and that combinatorial sequence models can more succinctly describe them than simpler mixture models. We conclude with an application to MHC binding prediction which highlights the utility of the posterior distribution over sequence representations induced by the prior. By integrating out the posterior, our method compares favorably to leading binding predictors.

  12. A Structural Labor Supply Model with Nonparametric Preferences

    NARCIS (Netherlands)

    van Soest, A.H.O.; Das, J.W.M.; Gong, X.

    2000-01-01

    Nonparametric techniques are usually seen as a statistical device for data description and exploration, not as a tool for estimating models with a richer economic structure, which are often required for policy analysis. This paper presents an example where nonparametric flexibility can be attained

  13. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  14. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved, and real applications are illustrated using examples. Theory and exercises are provided. The incorrect use of many tests in most statistical software packages is highlighted and discussed.

  15. A Bayesian nonparametric estimation of distributions and quantiles

    International Nuclear Information System (INIS)

    Poern, K.

    1988-11-01

    The report describes a Bayesian, nonparametric method for the estimation of a distribution function and its quantiles. The method, presupposing random sampling, is nonparametric, so the user has to specify a prior distribution on a space of distributions (and not on a parameter space). In the current application, where the method is used to estimate the uncertainty of a parametric calculational model, the Dirichlet prior distribution is to a large extent determined by the first batch of Monte Carlo realizations. In this case the result of the estimation technique is very similar to the conventional empirical distribution function. The resulting posterior distribution is also Dirichlet, and thus facilitates the determination of probability (confidence) intervals at any given point in the space of interest. Another advantage is that the posterior distribution of a specified quantile can also be derived and utilized to determine a probability interval for that quantile. The method was devised for use in the PROPER code package for uncertainty and sensitivity analysis. (orig.)
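    A minimal sketch in the same spirit (not the PROPER implementation): when the Dirichlet posterior concentrates on the observed points, the posterior uncertainty of a quantile can be explored by redrawing Dirichlet weights over the sample and reading off the weighted quantile each time, i.e. the Bayesian bootstrap. The sample size, quantile and credibility level below are arbitrary.

    ```python
    import numpy as np

    def quantile_credible_interval(x, q, n_draws=4000, level=0.90, seed=0):
        """Posterior interval for the q-quantile under a (noninformative
        limit of a) Dirichlet prior: draw Dirichlet(1,...,1) weights over
        the observed points and read off the weighted quantile each time."""
        rng = np.random.default_rng(seed)
        xs = np.sort(np.asarray(x, float))
        n = len(xs)
        draws = np.empty(n_draws)
        for i in range(n_draws):
            w = rng.dirichlet(np.ones(n))
            cdf = np.cumsum(w)
            draws[i] = xs[np.searchsorted(cdf, q)]
        lo, hi = np.quantile(draws, [(1 - level) / 2, (1 + level) / 2])
        return lo, hi

    rng = np.random.default_rng(4)
    x = rng.normal(size=500)                 # stand-in for Monte Carlo output
    lo, hi = quantile_credible_interval(x, q=0.95)
    ```

    With an informative Dirichlet prior one would simply add the prior weight to the sampled points' weights; the posterior remains Dirichlet, as the report notes.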

  16. Nonparametric e-Mixture Estimation.

    Science.gov (United States)

    Takano, Ken; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2016-12-01

    This study considers the common situation in data analysis when there are few observations of the distribution of interest or the target distribution, while abundant observations are available from auxiliary distributions. In this situation, it is natural to compensate for the lack of data from the target distribution by using data sets from these auxiliary distributions; in other words, approximating the target distribution in a subspace spanned by a set of auxiliary distributions. Mixture modeling is one of the simplest ways to integrate information from the target and auxiliary distributions in order to express the target distribution as accurately as possible. There are two typical mixtures in the context of information geometry: the m- and e-mixtures. The m-mixture is applied in a variety of research fields because of the presence of the well-known expectation-maximization algorithm for parameter estimation, whereas the e-mixture is rarely used because of its difficulty of estimation, particularly for nonparametric models. The e-mixture, however, is a well-tempered distribution that satisfies the principle of maximum entropy. To model a target distribution with scarce observations accurately, this letter proposes a novel framework for nonparametric modeling of the e-mixture and a geometrically inspired estimation algorithm. As numerical examples of the proposed framework, a transfer learning setup is considered. The experimental results show that this framework works well for three types of synthetic data sets, as well as an EEG real-world data set.

  17. Victim countries of transnational terrorism: an empirical characteristics analysis.

    Science.gov (United States)

    Elbakidze, Levan; Jin, Yanhong

    2012-12-01

    This study empirically investigates the association between country-level socioeconomic characteristics and risk of being victimized in transnational terrorism events. We find that a country's annual financial contribution to the U.N. general operating budget has a positive association with the frequency of being victimized in transnational terrorism events. In addition, per capita GDP, political freedom, and openness to trade are nonlinearly related to the frequency of being victimized in transnational terrorism events. © 2012 Society for Risk Analysis.

  18. Islamic banks and profitability: an empirical analysis of Indonesian banking

    OpenAIRE

    Jordan, Sarah

    2013-01-01

    This paper provides an empirical analysis of the factors that determined the profitability of Indonesian banks between the years 2006-2012. In particular, it investigates whether there are any significant differences in terms of profitability between Islamic banks and commercial banks. The results, obtained by applying the system-GMM estimator to a panel of 54 banks, indicate that the high bank profitability during these years was determined mainly by the size of the banks, the market share...

  19. Explaining Innovation. An Empirical Analysis of Industry Data from Norway

    Directory of Open Access Journals (Sweden)

    Torbjørn Lorentzen

    2016-01-01

    Full Text Available The objective of the paper is to analyse why some firms innovate while others do not. The paper combines different theories of innovation by relating innovation to internal, firm-specific assets and external, regional factors. Hypotheses are derived from theories and tested empirically using logistic regression. The empirical analysis indicates that internal funding of R&D and the size of the firm are the most important firm-specific attributes for successful innovation. External, regional factors are also important. The analysis shows that firms located in large urban regions have significantly higher innovation rates than firms located in the periphery, and firms involved in regional networking are more likely to innovate than firms not involved in networking. The analysis contributes to a theoretical and empirical understanding of the factors that influence innovation and the role innovation plays in the market economy. Innovation policy should be targeted at developing a tax system and building infrastructure that give firms incentives to invest, to allocate internal resources to R&D activities, and to collaborate with others in innovation. From an economic policy perspective, consideration should be given to allocating more public resources to rural areas in order to compensate for the asymmetric distribution of resources between the centre and the periphery. The paper contributes to the scientific literature on innovation by combining the firm-oriented perspective, with weight on firm-specific internal resources, and a system perspective, which focuses on external resources and networking as the most important determinants of innovation in firms.

  20. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) between June 11–16 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  1. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes in the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non-parametric … and non-supervised approach, based on the Fisher-Jenks optimal classification algorithm, is used to identify multi-scale meteorological droughts on the basis of empirical cumulative distributions of 1, 3, 6, and 12-monthly precipitation totals. As input data for the classifier, we use the gridded GPCC … for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009…
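
Fisher-Jenks optimal classification, which the study uses to delimit drought classes, partitions a sorted one-dimensional sample into k classes so as to minimise the within-class sum of squared deviations. The sketch below is a generic dynamic-programming implementation; the function name `fisher_jenks_breaks` and the toy data are illustrative, whereas the paper applies the algorithm to empirical cumulative precipitation distributions.

```python
import numpy as np

def fisher_jenks_breaks(values, k):
    """Optimal 1-D classification into k classes minimising within-class SSE
    (Fisher-Jenks), via dynamic programming over the sorted sample."""
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    csum = np.concatenate([[0.0], np.cumsum(x)])
    csq = np.concatenate([[0.0], np.cumsum(x ** 2)])

    def sse(i, j):
        # within-class sum of squares of x[i:j], from prefix sums
        s, q, m = csum[j] - csum[i], csq[j] - csq[i], j - i
        return q - s * s / m

    INF = float("inf")
    cost = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    cost[0][0] = 0.0
    for m in range(1, k + 1):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                c = cost[m - 1][i] + sse(i, j)
                if c < cost[m][j]:
                    cost[m][j], cut[m][j] = c, i
    # trace the optimal cut points back from the full sample
    bounds, j = [], n
    for m in range(k, 0, -1):
        i = cut[m][j]
        bounds.append((i, j))
        j = i
    return [x[i:j] for i, j in reversed(bounds)]

classes = fisher_jenks_breaks([12, 1, 11, 2, 10, 3], 2)  # two classes: [1, 2, 3] and [10, 11, 12]
```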

  2. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries that have higher health care expenditure per capita tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
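
The DEA side of such an efficiency comparison can be illustrated with the basic input-oriented CCR model, solved as one linear program per decision-making unit (DMU). This is a minimal sketch of textbook DEA, not the paper's two-step specification; the function name `dea_ccr_input` and the toy data are assumptions, and SciPy's `linprog` is used as the LP solver.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA efficiency scores, one LP per DMU:
    minimise theta s.t. sum_j lam_j x_ij <= theta * x_io,
                        sum_j lam_j y_rj >= y_ro,  lam >= 0."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, n_in = X.shape
    n_out = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.zeros(n + 1)                 # variables: [theta, lam_1..lam_n]
        c[0] = 1.0                          # objective: minimise theta
        A = np.zeros((n_in + n_out, n + 1))
        b = np.zeros(n_in + n_out)
        A[:n_in, 0] = -X[o]                 # input constraints
        A[:n_in, 1:] = X.T
        A[n_in:, 1:] = -Y.T                 # output constraints (flipped to <=)
        b[n_in:] = -Y[o]
        res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
        scores.append(float(res.x[0]))
    return np.array(scores)

# Toy data: 3 hospital sectors, one input (cost), one output (treated cases).
scores = dea_ccr_input([[2.0], [4.0], [8.0]], [[2.0], [2.0], [2.0]])
```

A score of 1 marks a unit on the efficient frontier; lower scores give the proportional input reduction needed to reach it.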

  3. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo; Genton, Marc G.

    2013-01-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric

  4. An isoeffect approach to the study of combined effects of mixed radiations--the nonparametric analysis of in vivo data

    International Nuclear Information System (INIS)

    Lam, G.K.

    1989-01-01

    The combined effects of mixed radiations can be examined using a system of simple isoeffect relations which are derived from a recent analysis of in vitro results obtained for a variety of radiation mixtures. Similar isoeffect analysis methods have been used for over two decades in studies of the combined action of toxic agents such as drugs and antibiotics. Because of the isoeffect approach, the method is particularly useful for the analysis of ordinal data for which conventional models that are based on parametric dose-effect relations may not be suitable. This is illustrated by applying the method to the analysis of a set of recently published in vivo data using the mouse foot skin reaction system for mixtures of neutrons and X rays. The good agreement between this method and the ordinal data also helps to provide further experimental support for the existence of a class of radiobiological data for which the simple isoeffect relations are valid

  5. EMPIRICAL ANALYSIS OF REMITTANCE INFLOW: THE CASE OF NEPAL

    Directory of Open Access Journals (Sweden)

    Karan Singh Thagunna

    2013-01-01

    Full Text Available This paper analyzes nine years of remittance inflow and macroeconomic data for Nepal and studies the effect of remittance on each of those macroeconomic variables. We use the unit root test, least squares regression analysis, and the Granger causality test. The empirical results suggest that remittance has a stronger causal effect on consumption and import patterns and a weaker effect on investment. Furthermore, drawing on the available literature, the paper discusses the importance of channeling remittance funds into productive capital, mainly public infrastructure, in comparison with the South Korean case study.
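
A Granger causality check of the kind reported above reduces to comparing restricted and unrestricted lag regressions with an F-statistic. The sketch below is a minimal pure-NumPy version on simulated data; the function name `granger_f` is illustrative, and a full analysis would also compute p-values and select lag lengths.

```python
import numpy as np

def granger_f(y, x, p):
    """F-statistic for the null 'x does not Granger-cause y', using p lags."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = len(y)
    Y = y[p:]
    lag_y = np.column_stack([y[p - i:n - i] for i in range(1, p + 1)])
    lag_x = np.column_stack([x[p - i:n - i] for i in range(1, p + 1)])
    const = np.ones((n - p, 1))
    Xr = np.hstack([const, lag_y])          # restricted: own lags only
    Xu = np.hstack([const, lag_y, lag_x])   # unrestricted: plus lags of x

    def ssr(M):
        beta, *_ = np.linalg.lstsq(M, Y, rcond=None)
        return float(np.sum((Y - M @ beta) ** 2))

    ssr_r, ssr_u = ssr(Xr), ssr(Xu)
    df = len(Y) - Xu.shape[1]
    return ((ssr_r - ssr_u) / p) / (ssr_u / df)

# Simulated example: y is driven by lagged x, so F(x -> y) should be large.
rng = np.random.default_rng(0)
n = 300
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
f_xy = granger_f(y, x, 1)   # test: x Granger-causes y
f_yx = granger_f(x, y, 1)   # test: y Granger-causes x
```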

  6. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest to and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers towards innovative strategies for the diffusion of product information, such as smart labels read via mobile phones. The empirical analysis was organised as focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  7. The impact of ICT on educational performance and its efficiency in selected EU and OECD countries: a non-parametric analysis

    OpenAIRE

    Aristovnik, Aleksander

    2012-01-01

    The purpose of the paper is to review some previous researches examining ICT efficiency and the impact of ICT on educational output/outcome as well as different conceptual and methodological issues related to performance measurement. Moreover, a definition, measurements and the empirical application of a model measuring the efficiency of ICT use and its impact at national levels will be considered. For this purpose, the Data Envelopment Analysis (DEA) technique is presented and then applied t...

  8. Noise filtering and nonparametric analysis of microarray data underscores discriminating markers of oral, prostate, lung, ovarian and breast cancer

    Directory of Open Access Journals (Sweden)

    Dermody James J

    2004-11-01

    Full Text Available Abstract Background A major goal of cancer research is to identify discrete biomarkers that specifically characterize a given malignancy. These markers are useful in diagnosis, may identify potential targets for drug development, and can aid in evaluating treatment efficacy and predicting patient outcome. Microarray technology has enabled marker discovery from human cells by permitting measurement of steady-state mRNA levels derived from thousands of genes. However, many challenging and unresolved issues regarding the acquisition and analysis of microarray data remain, such as accounting for both experimental and biological noise, transcripts whose expression profiles are not normally distributed, guidelines for statistical assessment of false positive/negative rates, and comparing data derived from different research groups. This study addresses these issues using Affymetrix HG-U95A and HG-U133 GeneChip data derived from different research groups. Results We present here a simple nonparametric approach coupled with noise filtering to identify sets of genes differentially expressed between the normal and cancer states in oral, breast, lung, prostate and ovarian tumors. An important feature of this study is the ability to integrate data from different laboratories, improving the analytical power of the individual results. One of the most interesting findings is the downregulation of genes involved in tissue differentiation. Conclusions This study presents the development and application of a noise model that suppresses noise, limits false positives in the results, and allows integration of results from individual studies derived from different research groups.

  9. UN ANÁLISIS NO PARAMÉTRICO DE ÍTEMS DE LA PRUEBA DEL BENDER/A NONPARAMETRIC ITEM ANALYSIS OF THE BENDER GESTALT TEST MODIFIED

    Directory of Open Access Journals (Sweden)

    César Merino Soto

    2009-05-01

    Full Text Available This study presents a psychometric analysis of a new scoring system for the Bender Gestalt Test modified for children, the Qualitative Scoring System (Brannigan & Brunner, 2002), in a sample of 244 children entering first grade at four public schools in Lima. The approach applied is nonparametric item analysis using the Testgraf program (Ramsay, 1991). The results indicate appropriate levels of internal consistency, unidimensionality, and good discriminative power of the scoring categories of the Qualitative Scoring System. No demographic differences were found with respect to gender or age. The findings are discussed in the context of the potential use of the Qualitative Scoring System and of nonparametric item analysis in psychometric research.

  10. HOMOGENEOUS UGRIZ PHOTOMETRY FOR ACS VIRGO CLUSTER SURVEY GALAXIES: A NON-PARAMETRIC ANALYSIS FROM SDSS IMAGING

    International Nuclear Information System (INIS)

    Chen, Chin-Wei; Cote, Patrick; Ferrarese, Laura; West, Andrew A.; Peng, Eric W.

    2010-01-01

    We present photometric and structural parameters for 100 ACS Virgo Cluster Survey (ACSVCS) galaxies based on homogeneous, multi-wavelength (ugriz), wide-field SDSS (DR5) imaging. These early-type galaxies, which trace out the red sequence in the Virgo Cluster, span a factor of nearly ∼10³ in g-band luminosity. We describe an automated pipeline that generates background-subtracted mosaic images, masks field sources and measures mean shapes, total magnitudes, effective radii, and effective surface brightnesses using a model-independent approach. A parametric analysis of the surface brightness profiles is also carried out to obtain Sérsic-based structural parameters and mean galaxy colors. We compare the galaxy parameters to those in the literature, including those from the ACSVCS, finding good agreement in most cases, although the sizes of the brightest, and most extended, galaxies are found to be most uncertain and model dependent. Our photometry provides an external measurement of the random errors on total magnitudes from the widely used Virgo Cluster Catalog, which we estimate to be σ(B_T) ∼ 0.13 mag for the brightest galaxies, rising to ∼0.3 mag for galaxies at the faint end of our sample (B_T ∼ 16). The distribution of axial ratios of low-mass ("dwarf") galaxies bears a strong resemblance to the one observed for the higher-mass ("giant") galaxies. The global structural parameters for the full galaxy sample (profile shape, effective radius, and mean surface brightness) are found to vary smoothly and systematically as a function of luminosity, with unmistakable evidence for changes in structural homology along the red sequence. As noted in previous studies, the ugriz galaxy colors show a nonlinear but smooth variation over a ∼7 mag range in absolute magnitude, with an enhanced scatter for the faintest systems that is likely the signature of their more diverse star formation histories.

  11. Energy efficiency determinants: An empirical analysis of Spanish innovative firms

    International Nuclear Information System (INIS)

    Costa-Campi, María Teresa; García-Quevedo, José; Segarra, Agustí

    2015-01-01

    This paper examines the extent to which innovative Spanish firms pursue improvements in energy efficiency (EE) as an objective of innovation. The increase in energy consumption and its impact on greenhouse gas emissions justifies the greater attention being paid to energy efficiency and especially to industrial EE. The ability of manufacturing companies to innovate and improve their EE has a substantial influence on attaining objectives regarding climate change mitigation. Despite the effort to design more efficient energy policies, the EE determinants in manufacturing firms have been little studied in the empirical literature. From an exhaustive sample of Spanish manufacturing firms and using a logit model, we examine the energy efficiency determinants for those firms that have innovated. To carry out the econometric analysis, we use panel data from the Community Innovation Survey for the period 2008–2011. Our empirical results underline the role of size among the characteristics of firms that facilitate energy efficiency innovation. Regarding company behaviour, firms that consider the reduction of environmental impacts to be an important objective of innovation and that have introduced organisational innovations are more likely to innovate with the objective of increasing energy efficiency.
    Highlights:
    • Drivers of innovation in energy efficiency at firm level are examined.
    • Tangible investments have a greater influence on energy efficiency than R&D.
    • Environmental and energy efficiency innovation objectives are complementary.
    • Organisational innovation favors energy efficiency innovation.
    • Public policies should be implemented to improve firms’ energy efficiency.

  12. Environmental pressure group strength and air pollution. An empirical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Seth; Neumayer, Eric [Department of Geography and Environment and Center for Environmental Policy and Governance (CEPG), London School of Economics and Political Science, Houghton Street, London WC2A 2AE (United Kingdom)

    2005-12-01

    There is an established theoretical and empirical case-study literature arguing that environmental pressure groups have a real impact on pollution levels. Our original contribution to this literature is to provide the first systematic quantitative test of the strength of environmental non-governmental organizations (ENGOs) on air pollution levels. We find that ENGO strength exerts a statistically significant impact on sulfur dioxide, smoke and heavy particulates concentration levels in a cross-country time-series regression analysis. This result holds true both for ordinary least squares and random-effects estimation. It is robust to controlling for the potential endogeneity of ENGO strength with the help of instrumental variables. The effect is also substantively important. Strengthening ENGOs represents an important strategy by which aid donors, foundations, international organizations and other stakeholders can try to achieve lower pollution levels around the world.

  13. Environmental pressure group strength and air pollution. An empirical analysis

    International Nuclear Information System (INIS)

    Binder, Seth; Neumayer, Eric

    2005-01-01

    There is an established theoretical and empirical case-study literature arguing that environmental pressure groups have a real impact on pollution levels. Our original contribution to this literature is to provide the first systematic quantitative test of the strength of environmental non-governmental organizations (ENGOs) on air pollution levels. We find that ENGO strength exerts a statistically significant impact on sulfur dioxide, smoke and heavy particulates concentration levels in a cross-country time-series regression analysis. This result holds true both for ordinary least squares and random-effects estimation. It is robust to controlling for the potential endogeneity of ENGO strength with the help of instrumental variables. The effect is also substantively important. Strengthening ENGOs represents an important strategy by which aid donors, foundations, international organizations and other stakeholders can try to achieve lower pollution levels around the world

  14. Refined discrete and empirical horizontal gradients in VLBI analysis

    Science.gov (United States)

    Landskron, Daniel; Böhm, Johannes

    2018-02-01

    Missing or incorrect consideration of azimuthal asymmetry of troposphere delays is a considerable error source in space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). So-called horizontal troposphere gradients are generally utilized for modeling such azimuthal variations and are particularly required for observations at low elevation angles. Apart from estimating the gradients within the data analysis, which has become common practice in space geodetic techniques, there is also the possibility to determine the gradients beforehand from data sources other than the actual observations. Using ray-tracing through Numerical Weather Models (NWMs), we determined discrete gradient values referred to as GRAD for VLBI observations, based on the standard gradient model by Chen and Herring (J Geophys Res 102(B9):20489-20502, 1997. https://doi.org/10.1029/97JB01739) and also for new, higher-order gradient models. These gradients are produced on the same data basis as the Vienna Mapping Functions 3 (VMF3) (Landskron and Böhm in J Geod, 2017. https://doi.org/10.1007/s00190-017-1066-2), so they can also be regarded as the VMF3 gradients, as they are fully consistent with each other. From VLBI analyses with the Vienna VLBI and Satellite Software (VieVS), it becomes evident that baseline length repeatabilities (BLRs) are improved on average by 5% when using the a priori gradients GRAD instead of estimating the gradients. The reason for this improvement is that gradient estimation yields poor results for VLBI sessions with a small number of observations, while the GRAD a priori gradients are unaffected by this. We also developed a new empirical gradient model applicable for any time and location on Earth, which is included in the Global Pressure and Temperature 3 (GPT3) model. Although being able to describe only the systematic component of azimuthal asymmetry and no short-term variations at all, even these

  15. A nonparametric mixture model for cure rate estimation.

    Science.gov (United States)

    Peng, Y; Dear, K B

    2000-03-01

    Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.
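
A common nonparametric benchmark for the cure fraction in such mixture models is the right-hand plateau of the Kaplan-Meier survival curve. The sketch below computes only that crude estimator, not the EM-based procedure of the paper; the function name `km_cure_fraction` is illustrative, and it assumes distinct event times and adequate follow-up for the plateau to be meaningful.

```python
import numpy as np

def km_cure_fraction(times, events):
    """Kaplan-Meier survival estimate at the last observed time.

    events: 1 = event observed, 0 = censored. The value of the curve at its
    right-hand plateau is a crude nonparametric estimate of the cured
    proportion. Assumes distinct event times.
    """
    order = np.argsort(times)
    d = np.asarray(events)[order]
    n = len(d)
    s = 1.0
    for i in range(n):
        if d[i] == 1:
            s *= 1.0 - 1.0 / (n - i)   # (n - i) subjects still at risk
    return s

# Two events early on, then only censored follow-up: the curve plateaus at 0.6.
cure = km_cure_fraction([1, 2, 3, 4, 5], [1, 1, 0, 0, 0])
```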

  17. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun

    2017-01-01

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100

  18. A sensitivity analysis of centrifugal compressors' empirical models

    International Nuclear Information System (INIS)

    Yoon, Sung Ho; Baek, Je Hyun

    2001-01-01

    The mean-line method using empirical models is the most practical method of predicting off-design performance. To gain insight into the empirical models, the influence of the models on the performance prediction results is investigated. We found that, in the two-zone model, the secondary flow mass fraction has a considerable effect at high mass flow rates on the performance prediction curves. In the TEIS model, the first element changes the slope of the performance curves as well as the stable operating range. The second element makes the performance curves move up and down as it increases or decreases. It is also discovered that the slip factor affects the pressure ratio but has little effect on efficiency. Finally, this study reveals that the skin friction coefficient has a significant effect on both the pressure ratio curve and the efficiency curve. These results show the limitations of the present empirical models, and more reasonable empirical models are needed

  19. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Camara Vincent A. R.

    1998-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changing of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results. It is shown that empirical Bayes reliability functions are in general sensitive to the choice of the loss function, and that the squared error loss does not always yield the best empirical Bayes reliability estimate.

  20. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Vincent A. R. Camara

    1999-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changing of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results.

  1. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    International Nuclear Information System (INIS)

    Jeffrey Joe; Larry G. Blackwood

    2006-01-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished by analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of the safety culture concept the NRC has developed, but it is not synonymous with it. SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal, both directions of the relationship need exploration, and two series of analyses were therefore performed. First, because human performance data could be indicative of safety culture, regression analyses were performed using human performance data to predict SCWE. Second, because safety culture likely contributes to human performance issues at a plant, a second set of regressions was performed using allegations to predict HFIS results

  2. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization.

    Science.gov (United States)

    Yan, Danping; Lu, Yongzhong; Zhou, Min; Chen, Shiping; Levy, David

    2017-01-01

    Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, their chaotic ergodic orbits can be used to reach the global optimum of a given cost function, or a good approximation to it, with high probability. During the past decade, such methods have received increasing attention from the academic community and industry worldwide. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm that hybridizes chaotic logistic dynamics and a hierarchical inertia weight. The hierarchical inertia weight coefficients are determined according to the present fitness values of the local best positions, so as to adaptively expand the particles' search space. Moreover, the chaotic logistic map is used both to replace the two random parameters affecting the convergence behavior and in the chaotic local search for the global best position, so that the particles can more easily avoid premature convergence while traversing the whole search space. The convergence of chaotic PID controlling PSO is then analyzed in depth. Empirical simulation results demonstrate that, compared with several other chaotic PSO algorithms such as chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and solution quality. Additionally, the parameter estimation of a nonlinear dynamic system further demonstrates its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO.
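
The core ingredient, replacing the two uniform random coefficients of PSO with draws from the logistic chaotic map z ← 4z(1−z), can be sketched as follows. This is a simplified illustration with a fixed inertia weight and no hierarchical weights, PID control, or chaotic local search, so it is not the authors' full algorithm; the function name `chaotic_pso` and all parameter values are assumptions for the example.

```python
import numpy as np

def chaotic_pso(f, dim, n_particles=20, iters=200, z0=0.7):
    """PSO in which the two stochastic coefficients r1, r2 are drawn from a
    logistic chaotic map z <- 4 z (1 - z) instead of a uniform RNG."""
    rng = np.random.default_rng(0)                  # only for the initial swarm
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    z = z0
    w, c1, c2 = 0.7, 1.5, 1.5                       # illustrative fixed weights
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z); r1 = z             # chaotic draw 1
        z = 4.0 * z * (1.0 - z); r2 = z             # chaotic draw 2
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        better = vals < pbest_val
        pbest[better] = pos[better]
        pbest_val[better] = vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Minimise the 2-D sphere function as a smoke test.
best, best_val = chaotic_pso(lambda x: float(np.sum(x ** 2)), dim=2)
```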

  3. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
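
The "leave-out-one-cycle" idea can be sketched directly: fold the series at each candidate period, predict every cycle from the mean of the remaining cycles, and keep the period with the smallest prediction error. The function below is an illustrative reconstruction of that scheme, not the authors' code; the name `cv_period` and the toy series are assumptions.

```python
import numpy as np

def cv_period(y, candidates):
    """Leave-out-one-cycle CV for integer period estimation: fold the series
    at each candidate period, predict each cycle from the mean of the other
    cycles, and keep the period with the smallest squared prediction error."""
    y = np.asarray(y, dtype=float)
    scores = {}
    for p in candidates:
        m = len(y) // p                           # number of complete cycles
        if m < 2:
            continue
        cyc = y[: m * p].reshape(m, p)            # one cycle per row
        loo = (cyc.sum(axis=0) - cyc) / (m - 1)   # leave-one-cycle-out means
        scores[p] = float(np.mean((cyc - loo) ** 2))
    return min(scores, key=scores.get), scores

# Noise-free toy series with true period 4.
y = np.tile([0.0, 5.0, 0.0, -5.0], 12)
best, scores = cv_period(y, range(2, 8))
```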

  4. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we … review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably…

  5. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on the introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere

  6. Empirical Analysis of Closed-Loop Duopoly Advertising Strategies

    OpenAIRE

    Gary M. Erickson

    1992-01-01

    Closed-loop (perfect) equilibria in a Lanchester duopoly differential game of advertising competition are used as the basis for empirical investigation. Two systems of simultaneous nonlinear equations are formed, one from a general Lanchester model and one from a constrained model. Two empirical applications are conducted. In one involving Coca-Cola and Pepsi-Cola, a formal statistical testing procedure is used to detect whether closed-loop equilibrium advertising strategies are used by the c...

  7. Differences in Dynamic Brand Competition Across Markets: An Empirical Analysis

    OpenAIRE

    Jean-Pierre Dubé; Puneet Manchanda

    2005-01-01

    We investigate differences in the dynamics of marketing decisions across geographic markets empirically. We begin with a linear-quadratic game involving forward-looking firms competing on prices and advertising. Based on the corresponding Markov perfect equilibrium, we propose estimable econometric equations for demand and marketing policy. Our model allows us to measure empirically the strategic response of competitors along with economic measures such as firm profitability. We use a rich da...

  8. Nonparametric tests for equality of psychometric functions.

    Science.gov (United States)

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2017-12-07

    Many empirical studies measure psychometric functions (curves describing how observers' performance varies with stimulus magnitude) because these functions capture the effects of experimental conditions. To assess these effects, parametric curves are often fitted to the data and comparisons are carried out by testing for equality of mean parameter estimates across conditions. This approach is parametric and, thus, vulnerable to violations of the implied assumptions. Furthermore, testing for equality of means of parameters may be misleading: Psychometric functions may vary meaningfully across conditions on an observer-by-observer basis with no effect on the mean values of the estimated parameters. Alternative approaches to assess equality of psychometric functions per se are thus needed. This paper compares three nonparametric tests that are applicable in all situations of interest: The existing generalized Mantel-Haenszel test, a generalization of the Berry-Mielke test that was developed here, and a split variant of the generalized Mantel-Haenszel test also developed here. Their statistical properties (accuracy and power) are studied via simulation and the results show that all tests are indistinguishable as to accuracy but they differ non-uniformly as to power. Empirical use of the tests is illustrated via analyses of published data sets and practical recommendations are given. The computer code in MATLAB and R to conduct these tests is available as Electronic Supplemental Material.

  9. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus).

    Directory of Open Access Journals (Sweden)

    Steven M Carr

    -stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed.
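
    The two-sample Kolmogorov-Smirnov comparison applied to the cumulative pairwise-difference curves can be sketched with SciPy; the arrays below are hypothetical stand-ins for two populations' pairwise-difference values, not the seal mitogenome data.

```python
import numpy as np
from scipy.stats import ks_2samp

# Hypothetical pairwise-difference values for two populations; the K-S
# statistic is the largest vertical gap between their empirical CDFs.
pop_a = np.arange(100)        # smaller differences ("younger" structure)
pop_b = np.arange(50, 150)    # shifted toward larger differences

result = ks_2samp(pop_a, pop_b)
# result.statistic is the maximum CDF gap; result.pvalue tests equality.
```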

  10. Nonparametric inference of network structure and dynamics

    Science.gov (United States)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and how to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high-dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion that does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among

  11. Competition in the German pharmacy market: an empirical analysis.

    Science.gov (United States)

    Heinsohn, Jörg G; Flessa, Steffen

    2013-10-10

    Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, the main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there has thus far been a lack of empirical proof of a stronger orientation of German public pharmacies towards competition. This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy markets. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German States and East German States) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. This study provides evidence that the German healthcare reforms aimed at increasing the competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an offensive and competitive stance are quite

  12. An empirical analysis of the hydropower portfolio in Pakistan

    International Nuclear Information System (INIS)

    Siddiqi, Afreen; Wescoat, James L.; Humair, Salal; Afridi, Khurram

    2012-01-01

    The Indus Basin of Pakistan with 800 hydropower project sites and a feasible hydropower potential of 60 GW, 89% of which is undeveloped, is a complex system poised for large-scale changes in the future. Motivated by the need to understand future impacts of hydropower alternatives, this study conducted a multi-dimensional, empirical analysis of the full hydropower portfolio. The results show that the full portfolio spans multiple scales of capacity from mega (>1000 MW) to micro (<0.1 MW) projects with a skewed spatial distribution within the provinces, as well as among rivers and canals. Of the total feasible potential, 76% lies in two (out of six) administrative regions and 68% lies in two major rivers (out of more than 125 total channels). Once projects currently under implementation are commissioned, there would be a five-fold increase from a current installed capacity of 6720 MW to 36759 MW. It is recommended that the implementation and design decisions should carefully include spatial distribution and environmental considerations upfront. Furthermore, uncertainties in actual energy generation, and broader hydrological risks due to expected climate change effects should be included in the current planning of these systems that are to provide service over several decades into the future. - Highlights: ► Pakistan has a hydropower potential of 60 GW distributed across 800 projects. ► Under-development projects will realize 36.7 GW of this potential by 2030. ► Project locations are skewed towards some sub-basins and provinces. ► Project sizes are very diverse and have quite limited private sector ownership. ► Gaps in data prevent proper risk assessment for Pakistan's hydropower development.

  13. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization

    Science.gov (United States)

    Yan, Danping; Lu, Yongzhong; Zhou, Min; Chen, Shiping; Levy, David

    2017-01-01

    Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, they may tactfully use the chaotic ergodic orbits to achieve the global optimum or their better approximation to given cost functions with high probability. During the past decade, they have increasingly received much attention from the academic community and industry throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm by the hybridization of chaotic logistic dynamics and hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the present fitness values of the local best positions so as to adaptively expand the particles’ search space. Moreover, the chaotic logistic map is not only used in the substitution of the two random parameters affecting the convergence behavior, but also used in the chaotic local search for the global best position so as to easily avoid the particles’ premature behaviors via the whole search space. Thereafter, the convergent analysis of chaotic PID controlling PSO is investigated in depth. Empirical simulation results demonstrate that compared with several other chaotic PSO algorithms like chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving the optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system also further clarifies its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO. PMID:28472050

  14. Modeling gallic acid production rate by empirical and statistical analysis

    Directory of Open Access Journals (Sweden)

    Bratati Kar

    2000-01-01

    Full Text Available For predicting the rate of the enzymatic reaction, empirical correlations based on experimental results obtained under various operating conditions have been developed. The models represent both the activation and the deactivation conditions of enzymatic hydrolysis, and the results have been analyzed by analysis of variance (ANOVA). The tannase activity was found to be maximum at an incubation time of 5 min, reaction temperature of 40ºC, pH 4.0, initial enzyme concentration of 0.12 v/v, initial substrate concentration of 0.42 mg/ml, and ionic strength of 0.2 M; under these optimal conditions, the maximum rate of gallic acid production was 33.49 μmoles/ml/min.

  15. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization.

    Directory of Open Access Journals (Sweden)

    Danping Yan

    Full Text Available Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, they may tactfully use the chaotic ergodic orbits to achieve the global optimum or their better approximation to given cost functions with high probability. During the past decade, they have increasingly received much attention from the academic community and industry throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm by the hybridization of chaotic logistic dynamics and hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the present fitness values of the local best positions so as to adaptively expand the particles' search space. Moreover, the chaotic logistic map is not only used in the substitution of the two random parameters affecting the convergence behavior, but also used in the chaotic local search for the global best position so as to easily avoid the particles' premature behaviors via the whole search space. Thereafter, the convergent analysis of chaotic PID controlling PSO is investigated in depth. Empirical simulation results demonstrate that compared with several other chaotic PSO algorithms like chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving the optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system also further clarifies its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO.
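
    The core ingredient, the chaotic logistic map substituted for the two uniform random coefficients in the PSO velocity update, is compact enough to sketch directly. This is a minimal illustration of the map itself under assumed parameters, not the paper's full algorithm.

```python
import numpy as np

def logistic_map(x0, n, r=4.0):
    """Chaotic logistic sequence x_{k+1} = r * x_k * (1 - x_k).

    With r = 4 and x0 in (0, 1) away from periodic points such as 0.5,
    the orbit is chaotic and stays inside [0, 1], so its values can stand
    in for uniform random coefficients in the PSO velocity update.
    """
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1.0 - x)
        xs[k] = x
    return xs

seq = logistic_map(0.31, 1000)  # ergodic orbit filling (0, 1)
```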

  16. An empirical analysis of cigarette demand in Argentina.

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2015-01-01

    To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to -0.31, indicating that a 10% increase in real income led to an increase in cigarette consumption of 4.3% and a 10% increase in the price produced a fall of 3.1% in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was -0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy. Published by the BMJ Publishing Group Limited.
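
    The elasticity figures translate directly into percentage effects: in a log-log demand model, the percentage change in consumption is approximately the elasticity times the percentage change in the driver. A worked check of the numbers quoted above:

```python
# Long-run elasticities reported in the abstract.
income_elasticity = 0.43
price_elasticity = -0.31

def pct_change_in_consumption(elasticity, pct_change_driver):
    """Log-log approximation: %dQ ~ elasticity * %d(driver)."""
    return elasticity * pct_change_driver

income_effect = pct_change_in_consumption(income_elasticity, 10.0)  # +4.3%
price_effect = pct_change_in_consumption(price_elasticity, 10.0)    # -3.1%
```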

  17. Non-parametric smoothing of experimental data

    International Nuclear Information System (INIS)

    Kuketayev, A.T.; Pen'kov, F.M.

    2007-01-01

    Full text: Rapid processing of experimental data samples in nuclear physics often requires differentiation in order to find extrema. Therefore, even at the preliminary stage of data analysis, a range of noise reduction methods is used to smooth experimental data. There are many non-parametric smoothing techniques: interval averages, moving averages, exponential smoothing, etc. Nevertheless, it is more common to use a priori information about the behavior of the experimental curve in order to construct smoothing schemes based on least squares techniques. The advantage of the latter methodology is that the area under the curve can be preserved, which is equivalent to conservation of the total count rate. Its disadvantage is the frequent absence of such a priori information. For example, very often the sums of peaks unresolved by the detector are replaced with a single peak during data processing, introducing uncontrolled errors into the determination of the physical quantities; avoiding this requires highly experienced personnel. We propose a set of non-parametric techniques that allows the use of any additional information on the nature of the experimental dependence. The method is based on the construction of a functional that includes both the experimental data and the a priori information. The minimum of this functional is reached on a non-parametric smoothed curve. Euler (Lagrange) differential equations are constructed for these curves; their solutions are then obtained analytically or numerically. The proposed approach allows for automated processing of nuclear physics data, eliminating the need for highly skilled laboratory personnel. It also makes it possible to obtain smoothed curves within a given confidence interval, e.g. according to the χ² distribution. This approach is applicable when constructing smooth solutions of ill-posed problems, in particular when solving
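
    One of the simplest non-parametric smoothers mentioned, the centered moving average, can be written in a few lines. This is an illustrative sketch of that baseline technique, not the authors' functional-minimization method.

```python
import numpy as np

def moving_average(y, window):
    """Centered moving-average smoother via convolution.

    'same' mode keeps the output length equal to the input; the edges are
    effectively zero-padded, so the first and last window//2 points are
    biased and should be treated with care.
    """
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

y = np.ones(100)                  # a flat "spectrum" as a sanity check
smoothed = moving_average(y, 5)   # interior points are unchanged
```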

  18. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable to be applied to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied on the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators to identify differences in standing posture between groups.
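
    The circular trace of an analytic IMF and its rotation frequency can be reproduced on a synthetic signal with SciPy's Hilbert transform. Here a toy 5 Hz oscillation stands in for a COP intrinsic mode function; the sampling rate and frequency are assumptions for illustration.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic IMF-like oscillation: 1 s of a 5 Hz cosine at 1 kHz sampling.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
imf = np.cos(2 * np.pi * 5.0 * t)

# Analytic signal traces a circle in the complex plane; its phase advance
# per sample gives the instantaneous (rotation) frequency.
analytic = hilbert(imf)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
mean_freq = inst_freq[50:-50].mean()   # trim transform edge effects
```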

  19. Application of nonparametric statistic method for DNBR limit calculation

    International Nuclear Information System (INIS)

    Dong Bo; Kuang Bo; Zhu Xuenong

    2013-01-01

    Background: The nonparametric statistical method is a kind of statistical inference method that does not depend on a particular distribution; it calculates tolerance limits at a given probability level and confidence through sampling methods. The DNBR margin is an important parameter of NPP design, which represents the safety level of the NPP. Purpose and Methods: This paper uses the nonparametric statistical method, based on the Wilks formula, together with the VIPER-01 subchannel analysis code to calculate the DNBR design limit (DL) of a 300 MW NPP (Nuclear Power Plant) during a complete loss-of-flow accident, and compares it with the DNBR DL obtained by means of ITDP to determine the available DNBR margin. Results: The results indicate that this method gains 2.96% more DNBR margin than that obtained by the ITDP methodology. Conclusions: Because of the reduced conservatism in the analysis process, the nonparametric statistical method can provide a greater DNBR margin, and this increase benefits the upgrading of the core refueling scheme. (authors)
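
    The sample-size side of the Wilks approach is compact enough to show directly: for a one-sided limit based on the largest of n i.i.d. runs (first-order statistics), the smallest n with 1 − γⁿ ≥ β yields the familiar 59 runs for a 95%/95% tolerance limit. A generic sketch of the formula, not the VIPER-01 workflow:

```python
import math

def wilks_first_order_n(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n i.i.d. samples bounds the
    `coverage` quantile with the given `confidence` (one-sided, first
    order): solve 1 - coverage**n >= confidence for integer n."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

n_9595 = wilks_first_order_n()        # classic 59 runs for 95/95
```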

  20. Ownership dynamics with large shareholders : An empirical analysis

    NARCIS (Netherlands)

    Donelli, M.; Urzua Infante, F.; Larrain, B.

    2013-01-01

    We study the empirical determinants of corporate ownership dynamics in a market where large shareholders are prevalent. We use a unique, hand-collected 20-year dataset on the ownership structure of Chilean companies. Controllers’ blockholdings are on average high -as in continental Europe, for

  1. WHAT FACTORS INFLUENCE QUALITY SERVICE IMPROVEMENT IN MONTENEGRO: EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Djurdjica Perovic

    2013-03-01

    Full Text Available In this paper, using an Ordinary Least Squares (OLS) regression, we investigate whether intangible elements influence tourists' perception of service quality. Our empirical results, based on a tourist survey in Montenegro, indicate that intangible elements of the tourism product have a positive impact on tourists' overall perception of service quality in Montenegro.

  2. Opposing the nuclear threat: The convergence of moral analysis and empirical data

    International Nuclear Information System (INIS)

    Hehir, J.B.

    1986-01-01

    This paper examines the concept of nuclear winter from the perspective of religious and moral values. The objective is to identify points of intersection between the empirical arguments about nuclear winter and ethical perspectives on nuclear war. The analysis moves through three steps: (1) the context of the nuclear debate; (2) the ethical and empirical contributions to the nuclear debate; and (3) implications for policy drawn from the ethical-empirical data

  3. Separating environmental efficiency into production and abatement efficiency. A nonparametric model with application to U.S. power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hampf, Benjamin

    2011-08-15

    In this paper we present a new approach to evaluate the environmental efficiency of decision making units. We propose a model that describes a two-stage process consisting of a production and an end-of-pipe abatement stage with the environmental efficiency being determined by the efficiency of both stages. Taking the dependencies between the two stages into account, we show how nonparametric methods can be used to measure environmental efficiency and to decompose it into production and abatement efficiency. For an empirical illustration we apply our model to an analysis of U.S. power plants.

  4. Nonparametric instrumental regression with non-convex constraints

    International Nuclear Information System (INIS)

    Grasmair, M; Scherzer, O; Vanhems, A

    2013-01-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition. (paper)
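
    In finite dimensions, Tikhonov regularization of a least-squares problem is ridge regression, which conveys the idea in closed form. This is a sketch of the regularization scheme only; the paper applies it to a nonparametric operator equation with shape constraints, and the data below are synthetic.

```python
import numpy as np

def tikhonov(X, y, lam):
    """Tikhonov-regularized least squares (ridge regression):
    minimizes ||X b - y||^2 + lam * ||b||^2, solved in closed form."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# With lam -> 0 on a well-conditioned noiseless problem this reduces to OLS.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true
beta_hat = tikhonov(X, y, lam=1e-8)   # recovers beta_true
```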

  5. Nonparametric instrumental regression with non-convex constraints

    Science.gov (United States)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  6. Economic Growth and the Environment. An empirical analysis

    Energy Technology Data Exchange (ETDEWEB)

    De Bruyn, S.M.

    1999-12-21

    A number of economists have claimed that economic growth benefits environmental quality, as it raises political support and financial means for environmental policy measures. Since the early 1990s this view has increasingly been supported by empirical evidence that has challenged the traditional belief, held by environmentalists, that economic growth degrades the environment. This study investigates the relationship between economic growth and environmental quality and elaborates on the question of whether economic growth can be combined with a reduced demand for natural resources. Various hypotheses on this relationship are described and empirically tested for a number of indicators of environmental pressure. The outcome of the tests advocates the use of alternative models for estimation that alter conclusions about the relationship between economic growth and the environment and give insight into the driving forces of emission reduction in developed economies. refs.

  7. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, E.

    2008-09-15

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, where for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted U-shaped relationship between income and pollution, but still most countries are on the increasing path of the curve and hence the future development of the curve is uncertain. For SOx it was found that emissions follow an U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.
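
    The inverted-U test behind the EKC reduces to checking the sign of the quadratic term and locating the turning point of a regression of pollution on (log) income. A synthetic sketch with illustrative data, not the paper's OECD panel:

```python
import numpy as np

# Synthetic inverted-U data: pollution peaks at log income = 10.
log_income = np.linspace(8.0, 12.0, 50)
pollution = -1.0 * (log_income - 10.0) ** 2 + 5.0

# Fit pollution = b2*x^2 + b1*x + b0; an EKC requires b2 < 0, with the
# income turning point at -b1 / (2*b2).
b2, b1, b0 = np.polyfit(log_income, pollution, deg=2)
turning_point = -b1 / (2.0 * b2)
```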

  8. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    International Nuclear Information System (INIS)

    Georgiev, E.

    2008-09-01

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, where for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted U-shaped relationship between income and pollution, but still most countries are on the increasing path of the curve and hence the future development of the curve is uncertain. For SOx it was found that emissions follow an U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  9. Does gender equality promote social trust? An empirical analysis

    OpenAIRE

    Seo-Young Cho

    2015-01-01

    Fairness can be an important factor that promotes social trust among people. In this paper, I investigate empirically whether fairness between men and women increases social trust. Using the data of the World Value Survey from 91 countries, I find that gender discriminatory values negatively affect the trust level of both men and women, while actual conditions on gender equality, measured by labor and educational attainments and political participation, are not a significant determinant of so...

  10. Tax morale : theory and empirical analysis of tax compliance

    OpenAIRE

    Torgler, Benno

    2003-01-01

    Tax morale is puzzling in our society. Observations show that tax compliance cannot be satisfactorily explained by the level of enforcement. Other factors may well be relevant. This paper contains a short survey of important theoretical and empirical findings in the tax morale literature, focussing on personal income tax morale. The following three key topics are discussed: moral sentiments, fairness and the relationship between taxpayer and government. The survey stresses the ...

  11. Empirical fractal geometry analysis of some speculative financial bubbles

    Science.gov (United States)

    Redelico, Francisco O.; Proto, Araceli N.

    2012-11-01

    Empirical evidence of a multifractal signature during the growth of a financial bubble leading to a crash is presented. The April 2000 crash in the NASDAQ composite index and a time series from the discrete Chakrabarti-Stinchcombe model for earthquakes are analyzed using a geometric approach and some common patterns are identified. These patterns relate the geometry of the rising period of a financial bubble to the non-concave entropy problem.

  12. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp
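
    The unifying object, a contingency table, supports the familiar chi-squared machinery directly. A minimal example with hypothetical counts whose rows are proportional, so independence holds exactly and the statistic is zero:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: two groups; columns: two outcome categories (hypothetical counts).
# The second row is three times the first, so row and column factors are
# independent and the chi-squared statistic is exactly zero.
table = np.array([[10, 20],
                  [30, 60]])

chi2_stat, p, dof, expected = chi2_contingency(table)
```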

  13. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ansuategi, A. [Ekonomi Analisiaren Oinarriak I Saila, Ekonomi Zientzien Fakultatea, Lehendakari Agirre Etorbidea, 83, 48015 Bilbao (Spain)

    2003-10-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe.

  14. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    International Nuclear Information System (INIS)

    Ansuategi, A.

    2003-01-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe

  15. Development of an empirical model of turbine efficiency using the Taylor expansion and regression analysis

    International Nuclear Information System (INIS)

    Fang, Xiande; Xu, Yu

    2011-01-01

    The empirical model of turbine efficiency is necessary for control- and/or diagnosis-oriented simulation and useful for the simulation and analysis of the dynamic performance of turbine equipment and systems, such as air cycle refrigeration systems, power plants, turbine engines, and turbochargers. Existing empirical models of turbine efficiency are insufficient because no suitable form is available for air cycle refrigeration turbines. This work performs a critical review of empirical models (called mean value models in some literature) of turbine efficiency and develops an empirical model in the desired form for air cycle refrigeration, the dominant cooling approach in aircraft environmental control systems. The Taylor series and regression analysis are used to build the model, with the Taylor series used to expand functions of the polytropic exponent and the regression analysis used to finalize the model. Measured data from a turbocharger turbine and two air cycle refrigeration turbines are used for the regression analysis. The proposed model is compact and able to represent the turbine efficiency map. Its predictions agree with the measured data very well, with corrected coefficient of determination R_c² ≥ 0.96 and mean absolute percentage deviation = 1.19% for the three turbines. -- Highlights: → Performed a critical review of empirical models of turbine efficiency. → Developed an empirical model in the desired form for air cycle refrigeration, using the Taylor expansion and regression analysis. → Verified the method for developing the empirical model. → Verified the model.
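The two goodness-of-fit metrics quoted above are easy to illustrate. The sketch below fits a low-order polynomial efficiency model by least-squares regression and scores it with the coefficient of determination and the mean absolute percentage deviation; the curve, regressor, and data are synthetic, not the paper's turbine measurements.

```python
import numpy as np

# Hypothetical illustration: fit a polynomial efficiency model by
# least-squares regression, then score it with R^2 and the mean
# absolute percentage deviation (MAPD). All data are invented.
rng = np.random.default_rng(0)
u = np.linspace(0.2, 1.2, 40)                  # assumed regressor (velocity ratio)
eta_true = 0.9 - 1.8 * (u - 0.7) ** 2          # peaked efficiency curve
eta_meas = eta_true + rng.normal(0.0, 0.005, u.size)

coeffs = np.polyfit(u, eta_meas, deg=2)        # regression step
eta_fit = np.polyval(coeffs, u)

ss_res = np.sum((eta_meas - eta_fit) ** 2)
ss_tot = np.sum((eta_meas - eta_meas.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                     # coefficient of determination
mapd = 100.0 * np.mean(np.abs((eta_fit - eta_meas) / eta_meas))
print(f"R^2 = {r2:.3f}, MAPD = {mapd:.2f}%")
```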

  16. Parametric vs. Nonparametric Regression Modelling within Clinical Decision Support

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2017-01-01

    Vol. 5, No. 1 (2017), pp. 21-27 ISSN 1805-8698 R&D Projects: GA ČR GA17-01251S Institutional support: RVO:67985807 Keywords : decision support systems * decision rules * statistical analysis * nonparametric regression Subject RIV: IN - Informatics, Computer Science OBOR OECD: Statistics and probability

  17. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  18. EMPIRICAL ANALYSIS OF THE ROLE OF THE FIRMS’ VALUE DRIVERS

    Directory of Open Access Journals (Sweden)

    Anita KISS

    2015-12-01

    Full Text Available This paper focuses on value creation and value drivers. One objective of this paper is to present the concept of maximizing shareholder value. The main goal is to categorize the most important value drivers and their role in firm value. This study proceeds as follows. The first section presents the value chain and its primary and support activities. The second part describes the theoretical background of maximizing shareholder value. The third part illustrates the key value drivers. The fourth, empirical section of the study analyses a database covering 1553 firms from 18 European countries and 10 sectors over the period 2004-2011. Finally, the fifth section offers concluding remarks. Based on the literature reviewed and the empirical research conducted, it can be concluded that EBIT, reinvestment, invested capital, return on invested capital, net margin and the sales growth rate all have a positive effect on firm value, while the tax rate and the market value of return on assets (MROA) have a negative one.

  19. On the use of permutation in and the performance of a class of nonparametric methods to detect differential gene expression.

    Science.gov (United States)

    Pan, Wei

    2003-07-22

    Recently a class of nonparametric statistical methods, including the empirical Bayes (EB) method, the significance analysis of microarrays (SAM) method and the mixture model method (MMM), have been proposed to detect differential gene expression for replicated microarray experiments conducted under two conditions. All the methods depend on constructing a test statistic Z and a so-called null statistic z. The null statistic z is used to provide a reference distribution for Z so that statistical inference can be accomplished. A common way of constructing z is to apply Z to randomly permuted data. Here we point out that the distribution of z may not approximate the null distribution of Z well, leading to possibly overly conservative inference. This observation may apply to other permutation-based nonparametric methods. We propose a new method of constructing a null statistic that aims to estimate the null distribution of a test statistic directly. Using simulated data and real data, we assess and compare the performance of the existing method and our new method when applied in EB, SAM and MMM. Some interesting findings on the operating characteristics of EB, SAM and MMM are also reported. Finally, by combining the ideas of SAM and MMM, we outline a simple nonparametric method based on the direct use of a test statistic and a null statistic.
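The permutation construction this abstract criticizes can be sketched in a few lines. Below, a hedged toy example (simulated expression matrix and a two-sample t-like statistic, not the authors' code) builds the test statistic Z from the observed condition labels and the null statistic z from permuted labels, then computes permutation p-values:

```python
import numpy as np

# Toy permutation-null construction: Z from observed labels, z from
# permuted labels. Gene counts, sample sizes, and effect sizes are
# invented for illustration only.
rng = np.random.default_rng(1)
n_genes, n_per_cond = 500, 6
x = rng.normal(0, 1, (n_genes, n_per_cond))   # condition 1
y = rng.normal(0, 1, (n_genes, n_per_cond))   # condition 2
y[:25] += 2.0                                  # 25 truly differential genes

def z_stat(a, b):
    # two-sample t-like statistic, one value per gene
    se = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1] +
                 b.var(axis=1, ddof=1) / b.shape[1])
    return (a.mean(axis=1) - b.mean(axis=1)) / se

Z = z_stat(x, y)                               # observed test statistic
data = np.hstack([x, y])
null = []
for _ in range(20):                            # a few label permutations
    perm = rng.permutation(2 * n_per_cond)
    null.append(z_stat(data[:, perm[:n_per_cond]], data[:, perm[n_per_cond:]]))
null = np.concatenate(null)                    # pooled null statistic z

# empirical two-sided p-values against the permutation null
p = np.array([(np.abs(null) >= abs(z)).mean() for z in Z])
print("genes with p < 0.05:", int((p < 0.05).sum()))
```

Note that the truly differential genes also enter the permuted data, so the null distribution is slightly widened, which is exactly the conservativeness the abstract points out.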

  20. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  1. An empirical analysis of the corporate call decision

    International Nuclear Information System (INIS)

    Carlson, M.D.

    1998-01-01

    An economic study of the behaviour of financial managers of utility companies was presented. The study examined whether an option-pricing-based model of the call decision explains callable preferred share prices and call decisions better than other models. In this study, the Rust (1987) empirical technique was extended to use information from preferred share prices in addition to the call decisions. Reasonable estimates of the transaction costs associated with a call were obtained from data on shares of the Pacific Gas and Electric Company (PGE). It was concluded that the managers of PGE clearly take into account the value of the option to delay the call when making their call decisions.

  2. Empirical Analysis of Using Erasure Coding in Outsourcing Data Storage With Provable Security

    Science.gov (United States)

    2016-06-01

    Naval Postgraduate School thesis, Monterey, California, June 2016. As computing and communication technologies become powerful and advanced, people are exchanging a huge amount of data, and they are demanding more storage…

  3. Population, internal migration, and economic growth: an empirical analysis.

    Science.gov (United States)

    Moreland, R S

    1982-01-01

    The role of population growth in the development process has received increasing attention during the last 15 years, as manifested in the literature in 3 broad categories. In the 1st category, the effects of rapid population growth on the growth of income have been studied with the use of simulation models, which sometimes include endogenous population growth. The 2nd category of the literature is concerned with theoretical and empirical studies of the economic determinants of various demographic rates--most usually fertility. Internal migration and dualism is the 3rd population-development category to receive attention. An attempt is made to synthesize developments in these 3 categories by estimating, from a consistent set of data, a 2-sector economic-demographic model in which the major demographic rates are endogenous. Because the interactions between economic and demographic variables are nonlinear and complex, the indirect effects of changes in a particular variable may depend upon the balance of numerical coefficients. For this reason it was felt that the model should be empirically grounded. A brief overview of the model is provided, and the model is compared to some similar existing models. Estimation of the model's 9 behavior equations is discussed, followed by a "base run" simulation of a developing country "stereotype" and a report of a number of policy experiments. The relatively new field of economic determinants of demographic variables was drawn upon in estimating equations to endogenize demographic phenomena that are frequently left exogenous in simulation models. The fertility and labor force participation rate functions are fairly standard, but a step beyond the existing literature was taken in the life expectancy and intersectoral migration equations. On the economic side, sectoral savings functions were estimated, and it was found that the marginal propensity to save is lower in agriculture than in nonagriculture. Testing to see the

  4. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes a temporal extension to traditional empirical orthogonal function (EOF) analysis, which provides a non-temporal analysis of one variable over time. The temporal extension proves its strength in separating the signals at different periods in an analysis of relevant oceanographic properties related to one of the largest El Niño events ever recorded.

  5. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-09-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the method proposed in this study. A large number of studies have used DEA as a benchmarking tool to measure service quality. These models do not propose a coherent performance evaluation construct and consequently fail to deliver improvement guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  6. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for the univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, HMM is used to classify data in crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to rub out the delay between regime switching and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power laws and GARCH models. The empirical results show that the regime switching model increases predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of power laws model while remaining practical to implement the VaR measurement.
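The "number of violations" criterion used to benchmark the models above can be illustrated without any regime switching machinery. The hedged sketch below backtests a one-day 99% VaR forecast on simulated heavy-tailed returns and compares the violation count with its binomial expectation, in the spirit of Kupiec's proportion-of-failures test; the return process, sample split, and confidence level are all invented.

```python
import numpy as np
from scipy import stats

# Toy VaR violation backtest: count out-of-sample days whose loss
# exceeds the forecast VaR and test the count against Binomial(n, 1%).
# Simulated heavy-tailed returns, not NYSE Euronext data.
rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=1000) * 0.01   # daily returns
alpha = 0.99
var_forecast = -np.quantile(returns[:500], 1 - alpha)  # in-sample 99% VaR

losses = -returns[500:]                             # out-of-sample losses
violations = int((losses > var_forecast).sum())
n = losses.size
p = stats.binomtest(violations, n, 1 - alpha).pvalue
print(f"violations: {violations} (expected {n * (1 - alpha):.0f}), p = {p:.2f}")
```

A small p-value here would indicate that the VaR model under- or over-states tail risk; the tail-loss tests mentioned in the abstract additionally weigh the size of each exceedance.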

  7. The Twin Deficits Hypothesis: An Empirical Analysis for Tanzania

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-09-01

    Full Text Available This paper examines the relationship between current account and government budget deficits in Tanzania. The paper tests the validity of the twin deficits hypothesis, using annual time series data for the 1966-2015 period. The paper is thought to be significant because the concept of the twin deficit hypothesis is fraught with controversy. Some studies support the hypothesis that there is a positive relationship between current account deficits and fiscal deficits in the economy while others do not. In this paper, the empirical tests fail to reject the twin deficits hypothesis, indicating that rising budget deficits put more strain on the current account deficits in Tanzania. Specifically, the Vector Error Correction Model results support the conventional theory of a positive relationship between fiscal and external balances, with a relatively high speed of adjustment toward the equilibrium position. This evidence is consistent with a small open economy. To address the problem that may result from this kind of relationship, appropriate policy measures for reducing budget deficits should be adopted, such as reduction in non-development expenditure, enhancement of domestic revenue collection, and active measures against corruption and tax evasion. The government should also target export-oriented firms and encourage an import substitution industry by creating favorable business environments.

  8. Environmentalism and elitism: a conceptual and empirical analysis

    Science.gov (United States)

    Morrison, Denton E.; Dunlap, Riley E.

    1986-09-01

    The frequent charge that environmentalism is “elitist” is examined conceptually and empirically. First, the concept of elitism is analyzed by distinguishing between three types of accusations made against the environmental movement: (a) compositional elitism suggests that environmentalists are drawn from privileged socioeconomic strata, (b) ideological elitism suggests that environmental reforms are a subterfuge for distributing benefits to environmentalists and/or costs to others, and (c) impact elitism suggests that environmental reforms, whether intentionally or not, do in fact have regressive social impacts. The evidence bearing on each of the three types of elitism is examined in some detail, and the following conclusions are drawn: Compositional elitism is an exaggeration, for although environmentalists are typically above average in socioeconomic status (as are most sociopolitical activists), few belong to the upper class. Ideological elitism may hold in some instances, but environmentalists have shown increasing sensitivity to equity concerns and there is little evidence of consistent pursuit of self-interest. Impact elitism is the most important issue, and also the most difficult to assess. It appears that there has been a general tendency for environmental reforms to have regressive impacts. However, it is increasingly recognized that problems such as workplace pollution and toxic waste contamination disproportionately affect the lower socioeconomic strata, and thus reforms aimed at such problems will likely have more progressive impacts.

  9. Production functions for climate policy modeling. An empirical analysis

    International Nuclear Information System (INIS)

    Van der Werf, Edwin

    2008-01-01

    Quantitative models for climate policy modeling differ in the production structure used and in the sizes of the elasticities of substitution. The empirical foundation for both is generally lacking. This paper estimates the parameters of 2-level CES production functions with capital, labour and energy as inputs, and is the first to systematically compare all nesting structures. Using industry-level data from 12 OECD countries, we find that the nesting structure where capital and labour are combined first, fits the data best, but for most countries and industries we cannot reject that all three inputs can be put into one single nest. These two nesting structures are used by most climate models. However, while several climate policy models use a Cobb-Douglas function for (part of the) production function, we reject elasticities equal to one, in favour of considerably smaller values. Finally we find evidence for factor-specific technological change. With lower elasticities and with factor-specific technological change, some climate policy models may find a bigger effect of endogenous technological change on mitigating the costs of climate policy. (author)

  10. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....

  11. Empirical Comparison of Publication Bias Tests in Meta-Analysis.

    Science.gov (United States)

    Lin, Lifeng; Chu, Haitao; Murad, Mohammad Hassan; Hong, Chuan; Qu, Zhiyong; Cole, Stephen R; Chen, Yong

    2018-04-16

    Decision makers rely on meta-analytic estimates to trade off benefits and harms. Publication bias impairs the validity and generalizability of such estimates. The performance of various statistical tests for publication bias has been largely compared using simulation studies and has not been systematically evaluated in empirical data. This study compares seven commonly used publication bias tests (i.e., Begg's rank test, trim-and-fill, Egger's, Tang's, Macaskill's, Deeks', and Peters' regression tests) based on 28,655 meta-analyses available in the Cochrane Library. Egger's regression test detected publication bias more frequently than other tests (15.7% in meta-analyses of binary outcomes and 13.5% in meta-analyses of non-binary outcomes). The proportion of statistically significant publication bias tests was greater for larger meta-analyses, especially for Begg's rank test and the trim-and-fill method. The agreement among Tang's, Macaskill's, Deeks', and Peters' regression tests for binary outcomes was moderately strong (most κ's were around 0.6). Tang's and Deeks' tests had fairly similar performance (κ > 0.9). The agreement among Begg's rank test, the trim-and-fill method, and Egger's regression test was weak or moderate (κ < 0.5). Given the relatively low agreement between many publication bias tests, meta-analysts should not rely on a single test and may apply multiple tests with various assumptions. Non-statistical approaches to evaluating publication bias (e.g., searching clinical trials registries, records of drug approving agencies, and scientific conference proceedings) remain essential.
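Several of the tests compared above are regression-based. The sketch below shows one common formulation of Egger's regression test: the standardized effect (y_i/se_i) is regressed on precision (1/se_i), and the intercept is tested against zero as the asymmetry statistic. The 40 study effects, their standard errors, and the bias mechanism are all simulated, not Cochrane data.

```python
import numpy as np
from scipy import stats

# Egger's regression test, one common formulation (assumed, not the
# paper's exact implementation). Simulated meta-analysis data.
rng = np.random.default_rng(2)
se = rng.uniform(0.1, 0.5, 40)                # study standard errors
y = 0.3 + rng.normal(0.0, se)                 # common effect, no bias
y_biased = y + 2.0 * se                       # small studies inflated

def egger(effects, ses):
    x, z = 1.0 / ses, effects / ses           # precision vs standardized effect
    slope, intercept, _, _, _ = stats.linregress(x, z)
    n = len(effects)
    resid = z - (intercept + slope * x)
    s2 = np.sum(resid**2) / (n - 2)           # residual variance
    se_int = np.sqrt(s2 * (1.0 / n + x.mean() ** 2 / np.sum((x - x.mean()) ** 2)))
    t = intercept / se_int                    # t-test on the intercept
    return intercept, 2.0 * stats.t.sf(abs(t), n - 2)

b0_u, p_u = egger(y, se)
b0_b, p_b = egger(y_biased, se)
print(f"no bias:   intercept = {b0_u:.2f}, p = {p_u:.3f}")
print(f"with bias: intercept = {b0_b:.2f}, p = {p_b:.3f}")
```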

  12. Empirical Green's function analysis: Taking the next step

    Science.gov (United States)

    Hough, S.E.

    1997-01-01

    An extension of the empirical Green's function (EGF) method is presented that involves determination of source parameters using standard EGF deconvolution, followed by inversion for a common attenuation parameter for a set of colocated events. Recordings of three or more colocated events can thus be used to constrain a single path-attenuation estimate. I apply this method to recordings from the 1995-1996 Ridgecrest, California, earthquake sequence; I analyze four clusters consisting of 13 total events with magnitudes between 2.6 and 4.9. I first obtain corner frequencies, which are used to infer Brune stress drop estimates. I obtain stress drop values of 0.3-53 MPa (with all but one between 0.3 and 11 MPa), with no resolved increase of stress drop with moment. With the corner frequencies constrained, the inferred attenuation parameters are very consistent; they imply an average shear wave quality factor of approximately 20-25 for alluvial sediments within the Indian Wells Valley. Although the resultant spectral fitting (using corner frequency and κ) is good, the residuals are consistent among the clusters analyzed. Their spectral shape is similar to the theoretical one-dimensional response of a layered low-velocity structure in the valley (an absolute site response cannot be determined by this method, because of an ambiguity between absolute response and source spectral amplitudes). I show that even this subtle site response can significantly bias estimates of corner frequency and κ, if it is ignored in an inversion for only source and path effects. The multiple-EGF method presented in this paper is analogous to a joint inversion for source, path, and site effects; the use of colocated sets of earthquakes appears to offer significant advantages in improving resolution of all three estimates, especially if data are from a single site or sites with similar site response.

  13. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility patterns. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish-time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of the subset of finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer interval timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition from a log-normal to a Gaussian distribution, and when the competition strengthens again in the last course of the middle stage, a transition from Gaussian back to log-normal follows at the last stage. This study may enrich research on human mobility patterns and attract attention to the velocity features of human mobility.
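The distributional check described above can be sketched with standard tools: fit log-normal and normal candidates to a velocity sample and compare them with the Kolmogorov-Smirnov distance. The velocities below are simulated (mean pace of roughly 3 m/s is an assumption), not actual marathon records.

```python
import numpy as np
from scipy import stats

# Compare log-normal vs normal fits to simulated runner velocities
# using the Kolmogorov-Smirnov statistic (smaller = better fit).
rng = np.random.default_rng(3)
v = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=2000)  # m/s, synthetic

shape, loc, scale = stats.lognorm.fit(v, floc=0)           # log-normal fit
ks_logn = stats.kstest(v, "lognorm", args=(shape, loc, scale)).statistic
mu, sd = stats.norm.fit(v)                                 # normal fit
ks_norm = stats.kstest(v, "norm", args=(mu, sd)).statistic
print(f"KS distance, log-normal: {ks_logn:.3f}; normal: {ks_norm:.3f}")
```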

  14. 188 An Empirical Investigation of Value-Chain Analysis and ...

    African Journals Online (AJOL)

    Analysis has a positive but insignificant impact on Competitive Advantage of a manufacturing firm in ... chain analysis is a means of increasing customer satisfaction and managing costs more ... The linkages express the relationships between the ... margins, return on assets, benchmarking, and capital budgeting. When a.

  15. Empirical Risk Analysis of Severe Reactor Accidents in Nuclear Power Plants after Fukushima

    OpenAIRE

    Kaiser, Jan Christian

    2012-01-01

    Many countries are reexamining the risks connected with nuclear power generation after the Fukushima accidents. To provide updated information for the corresponding discussion, a simple empirical approach is applied for risk quantification of severe reactor accidents with International Nuclear and Radiological Event Scale (INES) level ≥5. The analysis is based on worldwide data of commercial nuclear facilities. An empirical hazard of 21 (95% confidence interval (CI) 4; 62) severe accidents am...

  16. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: using an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models...

  17. Nonparametric estimation of the stationary M/G/1 workload distribution function

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted

    2005-01-01

    In this paper it is demonstrated how a nonparametric estimator of the stationary workload distribution function of the M/G/1 queue can be obtained by systematically sampling the workload process. Weak convergence results and bootstrap methods for empirical distribution functions for stationary associ...

  18. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability (IS) and intradaily variability (IV), used to describe the rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) across the time intervals. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using the IVm variable, whereas IV60 detected none. Rhythmic synchronization of activity and rest was significantly higher in the young than in adults with Parkinson's when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep-wake cycle fragmentation and synchronization.
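The two variables discussed in this record have standard nonparametric definitions, sketched below on a simulated hourly activity series (14 days of a 24-hour sinusoidal rhythm plus noise; all parameters invented). The paper's ISm/IVm variants additionally average these values over several binning intervals, which is not reproduced here.

```python
import numpy as np

# Standard actigraphy variables on simulated hourly activity counts:
# IS (interdaily stability, 0..1) and IV (intradaily variability, ~0..2).
rng = np.random.default_rng(4)
days, per_day = 14, 24
t = np.arange(days * per_day)
activity = 50 + 40 * np.sin(2 * np.pi * t / per_day) + rng.normal(0, 10, t.size)

def interdaily_stability(x, p):
    # variance of the average 24-h profile relative to total variance
    profile = x.reshape(-1, p).mean(axis=0)
    return (x.size * np.sum((profile - x.mean()) ** 2)) / (
        p * np.sum((x - x.mean()) ** 2))

def intradaily_variability(x):
    # mean squared hour-to-hour difference relative to total variance
    return (x.size * np.sum(np.diff(x) ** 2)) / (
        (x.size - 1) * np.sum((x - x.mean()) ** 2))

IS = interdaily_stability(activity, per_day)
IV = intradaily_variability(activity)
print(f"IS = {IS:.2f} (higher = more synchronized)")
print(f"IV = {IV:.2f} (higher = more fragmented)")
```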

  19. Empirical analysis of scaling and fractal characteristics of outpatients

    International Nuclear Information System (INIS)

    Zhang, Li-Jiang; Liu, Zi-Xian; Guo, Jin-Li

    2014-01-01

    The paper uses power-law frequency distributions, power spectrum analysis, detrended fluctuation analysis, and surrogate data testing to evaluate outpatient registration data of two hospitals in China and to investigate the human dynamics of systems that use the “first come, first served” protocol. The research results reveal that outpatient behavior follows scaling laws. The results also suggest that the time series of inter-arrival times exhibits 1/f noise and has positive long-range correlation. Our research may contribute to operational optimization and resource allocation in hospitals based on FCFS admission protocols.
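Of the four methods listed, detrended fluctuation analysis (DFA) is the one that quantifies long-range correlation directly. The hedged sketch below applies first-order DFA to simulated white noise rather than the hospitals' registration data; the scaling exponent alpha is the slope of log F(n) versus log n, with alpha ≈ 0.5 for uncorrelated noise and alpha > 0.5 indicating positive long-range correlation.

```python
import numpy as np

# First-order DFA on a synthetic series (window sizes are assumptions).
rng = np.random.default_rng(5)
x = rng.normal(0, 1, 4000)                      # white-noise surrogate

def dfa(series, scales):
    y = np.cumsum(series - series.mean())       # integrated profile
    F = []
    for n in scales:
        segs = y[: (len(y) // n) * n].reshape(-1, n)
        t = np.arange(n)
        ms = []
        for seg in segs:                         # linearly detrend each window
            coef = np.polyfit(t, seg, 1)
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))           # fluctuation at scale n
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

scales = np.array([16, 32, 64, 128, 256])
alpha = dfa(x, scales)
print(f"DFA exponent alpha = {alpha:.2f}")       # near 0.5 for white noise
```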

  20. Empirical Analysis of Pneumatic Tire Friction on Ice

    OpenAIRE

    Holley, Troy Nigel

    2010-01-01

    Pneumatic tire friction on ice is an under-researched area of tire mechanics. This study covers the design and analysis of a series of pneumatic tire tests on a flat, level ice road surface. The terramechanics rig of the Advanced Vehicle Dynamics Lab (AVDL) is a single-wheel test rig that allows for experimental analysis of the forces and moments on a tire, directly providing the data for the drawbar pull of said tire and thus supporting the calculation of friction from these data. This...

  1. Formal analysis of empirical traces in incident management

    International Nuclear Information System (INIS)

    Hoogendoorn, Mark; Jonker, Catholijn M.; Maanen, Peter-Paul van; Sharpanskykh, Alexei

    2008-01-01

    Within the field of incident management split second decisions have to be made, usually on the basis of incomplete and partially incorrect information. As a result of these conditions, errors occur in such decision processes. In order to avoid repetition of such errors, historic cases, disaster plans, and training logs need to be thoroughly analysed. This paper presents a formal approach for such an analysis that pays special attention to spatial and temporal aspects, to information exchange, and to organisational structure. The formal nature of the approach enables automation of analysis, which is illustrated by case studies of two disasters

  2. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    OpenAIRE

    Jamal J Almenayes

    2015-01-01

    This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale (Young, 1996) and modified it to fit social media. Factor analysis of items from a sample of 1326 participants revealed three addiction factors. These factors were later regressed on a scale of religiosity, which contained a single factor based on factor analysis. Result...

  3. Empirical analysis of scaling and fractal characteristics of outpatients

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Li-Jiang, E-mail: zljjiang@gmail.com [College of Management and Economics, Tianjin University, Tianjin 300072 (China); Management Institute, Xinxiang Medical University, Xinxiang 453003, Henan (China); Liu, Zi-Xian, E-mail: liuzixian@tju.edu.cn [College of Management and Economics, Tianjin University, Tianjin 300072 (China); Guo, Jin-Li, E-mail: phd5816@163.com [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2014-01-31

The paper uses power-law frequency distribution, power spectrum analysis, detrended fluctuation analysis, and surrogate data testing to evaluate outpatient registration data from two hospitals in China and to investigate the human dynamics of systems that use "first come, first served" (FCFS) protocols. The research results reveal that outpatient behavior follows scaling laws. The results also suggest that the time series of inter-arrival times exhibit 1/f noise and have positive long-range correlation. Our research may contribute to operational optimization and resource allocation in hospitals based on FCFS admission protocols.
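The power-law frequency fit this record describes can be illustrated with an ordinary least-squares slope on a log-log frequency plot. A minimal sketch; the data below are synthetic, generated to follow an exponent of −2, not the hospitals' registration records:

```python
import math
from collections import Counter

def loglog_slope(samples):
    """Estimate a scaling exponent by ordinary least squares on
    log(frequency) versus log(value)."""
    counts = Counter(samples)
    values = sorted(v for v in counts if v > 0)
    xs = [math.log(v) for v in values]
    ys = [math.log(counts[v]) for v in values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Synthetic inter-arrival times: the frequency of value v is made
# proportional to v**-2, so the fitted slope should be close to -2.
data = []
for v in range(1, 51):
    data.extend([v] * round(10000 * v ** -2.0))
print(round(loglog_slope(data), 2))
```

A full replication would also need the detrended fluctuation and surrogate-data steps; this shows only the frequency-distribution part.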

  4. Nonparametric functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yang, Jie; Wu, Rongling; Casella, George

    2009-03-01

    Functional mapping is a useful tool for mapping quantitative trait loci (QTL) that control dynamic traits. It incorporates mathematical aspects of biological processes into the mixture model-based likelihood setting for QTL mapping, thus increasing the power of QTL detection and the precision of parameter estimation. However, in many situations there is no obvious functional form and, in such cases, this strategy will not be optimal. Here we propose to use nonparametric function estimation, typically implemented with B-splines, to estimate the underlying functional form of phenotypic trajectories, and then construct a nonparametric test to find evidence of existing QTL. Using the representation of a nonparametric regression as a mixed model, the final test statistic is a likelihood ratio test. We consider two types of genetic maps: dense maps and general maps, and the power of nonparametric functional mapping is investigated through simulation studies and demonstrated by examples.

  5. Essays on nonparametric econometrics of stochastic volatility

    NARCIS (Netherlands)

    Zu, Y.

    2012-01-01

    Volatility is a concept that describes the variation of financial returns. Measuring and modelling volatility dynamics is an important aspect of financial econometrics. This thesis is concerned with nonparametric approaches to volatility measurement and volatility model validation.

  6. Nonparametric methods for volatility density estimation

    NARCIS (Netherlands)

    Es, van Bert; Spreij, P.J.C.; Zanten, van J.H.

    2009-01-01

    Stochastic volatility modelling of financial processes has become increasingly popular. The proposed models usually contain a stationary volatility process. We will motivate and review several nonparametric methods for estimation of the density of the volatility process. Both models based on

  7. Physical Violence between Siblings: A Theoretical and Empirical Analysis

    Science.gov (United States)

    Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.

    2005-01-01

    This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…

  8. Unemployment and Labor Market Institutions : An Empirical Analysis

    NARCIS (Netherlands)

    Belot, M.V.K.; van Ours, J.C.

    2001-01-01

The development of the unemployment rate differs substantially between OECD countries. In this paper we investigate to what extent these differences are related to labor market institutions. In our analysis we use data on eighteen OECD countries over the period 1960-1994 and show that the way in which

  9. An Empirical Study of Precise Interprocedural Array Analysis

    Directory of Open Access Journals (Sweden)

    Michael Hind

    1994-01-01

In this article we examine the role played by the interprocedural analysis of array accesses in the automatic parallelization of Fortran programs. We use the PTRAN system to provide measurements of several benchmarks to compare different methods of representing interprocedurally accessed arrays. We examine issues concerning the effectiveness of automatic parallelization using these methods and the efficiency of a precise summarization method.

  10. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    Directory of Open Access Journals (Sweden)

    Jamal J Almenayes

    2015-10-01

This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). Factor analysis of items generated by a sample of 1326 participants revealed three addiction factors. These factors were later regressed on a scale of religiosity. This scale contained a single factor based on factor analysis. Results indicated that social media addiction had three factors: "Social Consequences", "Time Displacement" and "Compulsive Feelings". Religiosity, on the other hand, contained a single factor. Both of these results were arrived at using factor analysis of their respective scales. The relationship between religiosity and social media addiction was then examined using linear regression. The results indicated that only two of the addiction factors were significantly related to religiosity. Future research should address the operationalization of the concept of religiosity to account for multiple dimensions.

  11. Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan

    Science.gov (United States)

    Papadimitriou, Antigoni; Blanco Ramírez, Gerardo

    2015-01-01

    This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…

  12. Tourism Competitiveness Index – An Empirical Analysis Romania vs. Bulgaria

    Directory of Open Access Journals (Sweden)

    Mihai CROITORU

    2011-09-01

In the conditions of the current economic downturn, many specialists consider tourism one of the sectors with the greatest potential to provide worldwide economic growth and development. A growing tourism sector can contribute effectively to employment, increase national income, and make a decisive mark on the balance of payments. Thus, tourism can be an important driving force for growth and prosperity, especially in emerging economies, and a key element in reducing poverty and regional disparities. Despite its contribution to economic growth, the development of the tourism sector can be undermined by a series of economic and legislative barriers that affect the competitiveness of this sector. In this context, the World Economic Forum proposes, via the Tourism Competitiveness Index (TCI), in addition to a methodology for identifying the key factors that contribute to increasing tourism competitiveness, tools for the analysis and evaluation of these factors. Against this background, this paper aims to analyze the underlying determinants of the TCI from the perspective of two directly competing states, Romania and Bulgaria, in order to highlight the effects of communication on the competitiveness of the tourism sector. The purpose of this analysis is to provide some answers, especially in terms of communication strategies, which may explain the completely different performances of the two national economies in the tourism sector.

  13. IFRS and Stock Returns: An Empirical Analysis in Brazil

    Directory of Open Access Journals (Sweden)

    Rodrigo F. Malaquias

    2016-09-01

In recent years, the convergence of accounting standards has been an issue that motivated new studies in the accounting field. It is expected that convergence provides users, especially external users of accounting information, with reports that are comparable among different economies. Considering this scenario, this article compares the effect of accounting numbers on the stock market before and after the accounting convergence in Brazil. The sample involved Brazilian companies listed at BM&FBOVESPA that had American Depositary Receipts (levels II and III) at the New York Stock Exchange (NYSE). For data analysis, descriptive statistics and graphic analysis were employed in order to analyze the behavior of stock returns around the publication dates. The main results indicate that the stock market reacts to the accounting reports. Therefore, the accounting numbers contain relevant information for the decision making of investors in the stock market. Moreover, after the accounting convergence, the stock returns of the companies seem to present lower volatility.

  14. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

Based on a connection mechanism in which nodes automatically select upstream and downstream agents, a simulation model of the dynamic evolution of a consumer-driven automobile supply chain is established by integrating agent-based modeling (ABM) and discrete modeling on a GIS-based map. First, the model's rationality is demonstrated by analyzing the consistency of sales and of changes in various agent parameters between the simulation model and a real automobile supply chain. Second, using complex network theory, the hierarchical structure of the model and the relationships among networks at different levels are analyzed to calculate characteristic parameters such as mean distance, mean clustering coefficient, and degree distribution. This verifies that the model is a typical scale-free, small-world network. Finally, the dynamics of the model are analyzed from the perspective of complex adaptive systems (CAS). The chaotic state of the simulation system is verified, which suggests that the system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex automobile supply chain networks but also microscopically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and practical experience for the supply chain analysis of auto companies.
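Network metrics such as the mean clustering coefficient mentioned in this record can be computed directly from an adjacency list. A minimal sketch on an invented toy supplier network (node names are illustrative, not taken from the paper's model):

```python
# Toy undirected supplier network as an adjacency list (invented edges).
graph = {
    "oem":    {"tier1a", "tier1b", "dealer"},
    "tier1a": {"oem", "tier1b", "tier2"},
    "tier1b": {"oem", "tier1a"},
    "tier2":  {"tier1a"},
    "dealer": {"oem"},
}

def mean_clustering(graph):
    """Average local clustering coefficient: for each node, the share
    of its neighbour pairs that are themselves connected."""
    total = 0.0
    for node, nbrs in graph.items():
        k = len(nbrs)
        if k < 2:
            continue  # clustering is taken as 0 for degree < 2
        links = sum(1 for u in nbrs for v in nbrs
                    if u < v and v in graph[u])
        total += 2 * links / (k * (k - 1))
    return total / len(graph)

print(round(mean_clustering(graph), 3))
```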

  15. Empirical analysis of the effects of cyber security incidents.

    Science.gov (United States)

    Davis, Ginger; Garcia, Alfredo; Zhang, Weide

    2009-09-01

    We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.

  16. Modeling for Determinants of Human Trafficking: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Seo-Young Cho

    2015-02-01

This study aims to identify robust push and pull factors of human trafficking. I test for the robustness of 70 push and 63 pull factors suggested in the literature. In doing so, I employ an extreme bound analysis, running more than two million regressions with all possible combinations of variables for up to 153 countries during the period of 1995–2010. My results show that crime prevalence robustly explains human trafficking both in destination and origin countries. Income level also has a robust impact, suggesting that the cause of human trafficking shares that of economic migration. Law enforcement matters more in origin countries than destination countries. Interestingly, a very low level of gender equality may have constraining effects on human trafficking outflow, possibly because gender discrimination limits female mobility that is necessary for the occurrence of human trafficking.

  17. Generalisability of an online randomised controlled trial: an empirical analysis.

    Science.gov (United States)

    Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li

    2018-02-01

Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video to promote HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. NCT02248558.
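Inverse probability of sampling weights (IPSW), as used in this record, reweight trial participants so that the trial's covariate mix matches the target population. A minimal sketch with invented strata and outcomes, not the study's data:

```python
# Each trial entry is (stratum, binary outcome e.g. tested for HIV);
# the target list gives the population's stratum composition.
trial  = [("young", 1), ("young", 1), ("young", 0), ("old", 0)]
target = ["young", "old", "old", "old"]

def ipsw_mean(trial, target):
    """Outcome mean reweighted by P(stratum in target) / P(stratum in trial)."""
    strata = {s for s, _ in trial}
    n_trial, n_target = len(trial), len(target)
    weight = {}
    for s in strata:
        p_trial = sum(1 for t, _ in trial if t == s) / n_trial
        p_target = target.count(s) / n_target
        weight[s] = p_target / p_trial   # inverse of the sampling ratio
    num = sum(weight[s] * y for s, y in trial)
    den = sum(weight[s] for s, _ in trial)
    return num / den

print(round(ipsw_mean(trial, target), 4))
```

Here the weighted testing rate is 1/6 ≈ 0.167, versus a naive trial mean of 0.5, because the trial over-represents the high-testing "young" stratum relative to the target population.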

  18. Analysis of trends in aviation maintenance risk: An empirical approach

    International Nuclear Information System (INIS)

    Marais, Karen B.; Robichaud, Matthew R.

    2012-01-01

    Safety is paramount in the airline industry. A significant amount of effort has been devoted to reducing mechanical failures and pilot errors. Recently, more attention has been devoted to the contribution of maintenance to accidents and incidents. This study investigates and quantifies the contribution of maintenance, both in terms of frequency and severity, to passenger airline risk by analyzing three different sources of data from 1999 to 2008: 769 NTSB accident reports, 3242 FAA incident reports, and 7478 FAA records of fines and other legal actions taken against airlines and associated organizations. We analyze several safety related metrics and develop an aviation maintenance risk scorecard that collects these metrics to synthesize a comprehensive track record of maintenance contribution to airline accidents and incidents. We found for example that maintenance-related accidents are approximately 6.5 times more likely to be fatal than accidents in general, and that when fatalities do occur, maintenance accidents result in approximately 3.6 times more fatalities on average. Our analysis of accident trends indicates that this contribution to accident risk has remained fairly constant over the past decade. Our analysis of incidents and FAA fines and legal actions also revealed similar trends. We found that at least 10% of incidents involving mechanical failures such as ruptured hydraulic lines can be attributed to maintenance, suggesting that there may be issues surrounding both the design of and compliance with maintenance plans. Similarly 36% of FAA fines and legal actions involve inadequate maintenance, with recent years showing a decline to about 20%, which may be a reflection of improved maintenance practices. Our results can aid industry and government in focusing resources to continue improving aviation safety.

  19. Single versus mixture Weibull distributions for nonparametric satellite reliability

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

Long recognized as a critical design attribute for space systems, satellite reliability has not yet received the proper attention, as limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on a mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture of Weibull distributions is significantly more accurate in capturing all the trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
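The single-Weibull MLE baseline referenced in this record can be sketched with the standard fixed-point iteration for the shape parameter (the mixture fit is more involved and is not shown). The failure times below are synthetic, not the satellite data:

```python
import math
import random

def weibull_mle(x, iters=200):
    """Maximum-likelihood fit of a single Weibull(shape k, scale lam)
    via the standard fixed-point iteration for the shape:
    1/k = sum(x^k * ln x) / sum(x^k) - mean(ln x)."""
    n = len(x)
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / n
    k = 1.0
    for _ in range(iters):
        xk = [v ** k for v in x]
        k = 1.0 / (sum(p * l for p, l in zip(xk, logs)) / sum(xk) - mean_log)
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, lam

# Synthetic failure times drawn from Weibull(k=1.5, lam=2.0) by
# inverse-transform sampling; the fit should recover roughly these values.
random.seed(0)
data = [2.0 * (-math.log(random.random())) ** (1.0 / 1.5) for _ in range(2000)]
k_hat, lam_hat = weibull_mle(data)
print(round(k_hat, 2), round(lam_hat, 2))
```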

  20. The Effect of Shocks: An Empirical Analysis of Ethiopia

    Directory of Open Access Journals (Sweden)

    Yilebes Addisu Damtie

    2015-07-01

Besides striving to increase production and development, it is also necessary to reduce the losses created by shocks. The people of Ethiopia are exposed to the impact of both natural and man-made shocks. Policy makers, governmental and non-governmental organizations therefore need to identify the important shocks and their effects and use them as an input. This study was conducted to identify food insecurity shocks and to estimate their effects, based on the conceptual framework developed in Ethiopia, Amhara National Regional State, Libo Kemkem District. Descriptive statistical analysis, multiple regression, binary logistic regression, chi-squared and independent-sample t-tests were used as data analysis techniques. The results showed eight shocks affecting households: weather variability, weed infestation, plant insect and pest infestation, soil fertility problems, animal disease and epidemics, human disease and epidemics, price fluctuations and conflict. Weather variability, plant insect and pest infestation, weed infestation, and animal disease and epidemics created mean losses of 3,821.38, 886.06, 508.04 and 1,418.32 Birr, respectively. In addition, human disease and epidemics, price fluctuations and conflict affected 68.11%, 88.11% and 14.59% of households, respectively. Among the sample households, 28.1% were not able to meet their food needs throughout the year while 71.9% could. The multiple regression models revealed that weed existence (β = −0.142, p < 0.05), plant insect and pest infestation (β = −0.279, p < 0.01) and soil fertility problems (β = −0.321, p < 0.01) had significant effects on income. Assets were significantly affected by plant insect and pest infestation (β = −0.229, p < 0.01), human disease and epidemics (β = 0.145, p < 0.05), and soil fertility problems (β = −0.317, p < 0.01), while food production was affected by soil fertility problems (β = −0.314, p < 0.01). Binary logistic

  1. An empirical study of tourist preferences using conjoint analysis

    Directory of Open Access Journals (Sweden)

    Tripathi, S.N.

    2010-01-01

Tourism and hospitality have become key global economic activities as expectations with regard to our use of leisure time have evolved, attributing greater meaning to our free time. While the growth in tourism has been impressive, India's share in total global tourism arrivals and earnings is quite insignificant. It is an accepted fact that India has tremendous potential for the development of tourism. This anomaly and the various underlying factors responsible for it are the focus of our study. The objective is to determine customer preferences for multi-attribute hybrid services like tourism, so as to enable the state tourism board to deliver a desired combination of intrinsic attributes, helping it to create a sustainable competitive advantage, leading to greater customer satisfaction and positive word of mouth. Conjoint analysis has been used for this purpose; it estimates the structure of a consumer's preferences, given his/her overall evaluations of a set of alternatives that are pre-specified in terms of levels of different attributes.
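Conjoint part-worths of the kind estimated in studies like this can, for a balanced full-factorial rated design, be sketched as each attribute level's mean rating minus the grand mean. The attributes, levels and ratings below are invented for illustration:

```python
# Each profile pairs a tuple of attribute levels with a respondent rating.
profiles = [  # (price, comfort) -> rating of the tour package
    (("low", "basic"), 6), (("low", "luxury"), 9),
    (("high", "basic"), 2), (("high", "luxury"), 5),
]

def part_worths(profiles):
    """Part-worth of each level = mean rating of profiles containing
    that level, minus the grand mean (valid for balanced designs)."""
    grand = sum(r for _, r in profiles) / len(profiles)
    worths = {}
    n_attrs = len(profiles[0][0])
    for a in range(n_attrs):
        for lv in {p[a] for p, _ in profiles}:
            rs = [r for p, r in profiles if p[a] == lv]
            worths[(a, lv)] = sum(rs) / len(rs) - grand
    return worths

w = part_worths(profiles)
print(w[(0, "low")], w[(1, "luxury")])
```

With these invented ratings, "low price" carries a part-worth of +2.0 and "luxury comfort" +1.5, so price dominates the preference structure.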

  2. Empirical Analysis on CSR Communication in Romania: Transparency and Participation

    Directory of Open Access Journals (Sweden)

    Irina-Eugenia Iamandi

    2012-12-01

In the specific field of corporate social responsibility (CSR), the participation of companies in supporting social and environmental issues is mainly analysed and/or measured based on their CSR communication policy; in this way, the transparency of CSR reporting procedures is one of the most pressing challenges for researchers and practitioners in the field. The main research objective of the present paper is to distinguish between different types of CSR participation by identifying the reasons behind CSR communication for a series of companies acting on the Romanian market. The descriptive analysis – conducted both at integrated and corporate level for the Romanian companies – took into account five main CSR communication related issues: CSR site, CSR report, CSR listing, CSR budget and CSR survey. The results highlight both the declarative/prescriptive and practical/descriptive perspectives of CSR communication in Romania, showing that the Romanian CSR market is reaching its full maturity. In more specific terms, the majority of the investigated companies are already using different types of CSR participation, marking the transition from CSR just for commercial purposes to CSR for long-term strategic use. The achieved results are broadly analysed in the paper and specific conclusions are emphasized.

  3. Political determinants of social expenditures in Greece: an empirical analysis

    Directory of Open Access Journals (Sweden)

    Ebru Canikalp

    2017-09-01

A view prominently expounded is that the interaction between the composition and the volume of public expenditures is directly affected by political, institutional, psephological and ideological indicators. A crucial component of public expenditures, social expenditures play an important role in the economy as they directly and indirectly affect the distribution of income and wealth. Social expenditures aim at redressing the unequal distribution of income and wealth. These expenditures comprise cash benefits, direct in-kind provision of goods and services, and tax breaks with social purposes. The aim of this study is to determine the relationship between political structure, i.e. government fragmentation, ideological composition, elections and so on, and social expenditures in Greece. Employing data from the Comparative Political Dataset (CPDS) and the OECD Social Expenditure Database (SOCX), a time series analysis was conducted for Greece for the 1980-2014 period. The findings of the study indicate that voter turnout, spending on the elderly population and the number of government changes have positive and statistically significant effects on social expenditures in Greece, while debt stock and cabinet composition have negative effects.

  4. Empirical Analysis on The Existence of The Phillips Curve

    Directory of Open Access Journals (Sweden)

    Shaari Mohd Shahidan

    2018-01-01

The Phillips curve shows the trade-off between the inflation and unemployment rates: when inflation rises on the back of high economic growth, more jobs are available and unemployment therefore falls. However, the existence of the Phillips curve in high-income countries has not been much discussed. Countries with high income should have a low unemployment rate, suggesting high inflation. However, some high-income countries, for example the United States in the 1970s, could not avert stagflation, whereby a high unemployment rate and high inflation occurred at the same time. This situation is contrary to the Phillips curve. Therefore, this study investigates the existence of the Phillips curve in high-income countries for the period 1990-2014 using panel data analysis. The most interesting finding is the existence of a bidirectional relationship between the unemployment rate and the inflation rate in both the long and the short run. Governments should therefore choose between stabilizing the inflation rate and reducing the unemployment rate.

  5. Prominent feature extraction for review analysis: an empirical study

    Science.gov (United States)

    Agarwal, Basant; Mittal, Namita

    2016-05-01

Sentiment analysis (SA) research has increased tremendously in recent times. SA aims to determine the sentiment orientation of a given text into positive or negative polarity. Motivation for SA research is the need for the industry to know the opinion of the users about their product from online portals, blogs, discussion boards, reviews and so on. Efficient features need to be extracted for machine-learning algorithms for better sentiment classification. In this paper, initially various features are extracted such as unigrams, bi-grams and dependency features from the text. In addition, new bi-tagged features are also extracted that conform to predefined part-of-speech patterns. Furthermore, various composite features are created using these features. Information gain (IG) and minimum redundancy maximum relevancy (mRMR) feature selection methods are used to eliminate the noisy and irrelevant features from the feature vector. Finally, machine-learning algorithms are used for classifying the review document into positive or negative class. Effects of different categories of features are investigated on four standard data-sets, namely, movie review and product (book, DVD and electronics) review data-sets. Experimental results show that composite features created from prominent features of unigram and bi-tagged features perform better than other features for sentiment classification. mRMR is a better feature selection method as compared with IG for sentiment classification. The Boolean Multinomial Naïve Bayes algorithm performs better than the support vector machine classifier for SA in terms of accuracy and execution time.
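The information gain (IG) criterion used in this record scores a feature by the reduction in class entropy it yields, IG(C; F) = H(C) − H(C|F). A minimal sketch on an invented toy review set (not the paper's data-sets):

```python
import math

def entropy(labels):
    """Shannon entropy H(C) of a list of class labels, in bits."""
    n = len(labels)
    probs = [labels.count(l) / n for l in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

def info_gain(feature, labels):
    """IG(C; F) = H(C) - H(C|F) for a discrete feature F and class C."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Toy review polarities plus two unigram presence features (1 = present).
labels        = ["pos", "pos", "pos", "neg", "neg", "neg"]
predictive    = [1, 1, 1, 0, 0, 0]   # perfectly separates the classes
uninformative = [1, 1, 0, 1, 1, 0]   # same class split within each value
print(round(info_gain(predictive, labels), 3),
      round(info_gain(uninformative, labels), 3))
```

The perfectly predictive unigram scores the full 1 bit of class entropy, while the uninformative one scores 0, which is exactly the ranking IG-based selection exploits.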

  6. Stochastic semi-nonparametric frontier estimation of electricity distribution networks: Application of the StoNED method in the Finnish regulatory model

    International Nuclear Information System (INIS)

    Kuosmanen, Timo

    2012-01-01

An electricity distribution network is a prime example of a natural local monopoly. In many countries, electricity distribution is regulated by the government. Many regulators apply frontier estimation techniques such as data envelopment analysis (DEA) or stochastic frontier analysis (SFA) as an integral part of their regulatory framework. While more advanced methods that combine a nonparametric frontier with a stochastic error term are known in the literature, in practice regulators continue to apply simplistic methods. This paper reports the main results of the project commissioned by the Finnish regulator for further development of the cost frontier estimation in their regulatory framework. The key objectives of the project were to integrate a stochastic SFA-style noise term into the nonparametric, axiomatic DEA-style cost frontier, and to take the heterogeneity of firms and their operating environments better into account. To achieve these objectives, a new method called stochastic nonparametric envelopment of data (StoNED) was examined. Based on the insights and experiences gained in the empirical analysis using the real data of the regulated networks, the Finnish regulator adopted the StoNED method for use from 2012 onwards.

  7. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' d between the logarithm of the model output and the logarithm of the experimental data, defined as d^2 = (1/n) Σ_{i=1}^{n} (ln M_i − ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this non-desired effect, in some circumstances, may be corrected by means of simple formulae.
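The functional distance d defined in this record can be computed directly from paired measurements and model outputs. A minimal sketch with invented concentration values, not the radiocaesium data:

```python
import math

def functional_distance(measured, modelled):
    """EBUA functional distance d between the logs of experimental
    data M_i and model output O_i:
    d^2 = (1/n) * sum_i (ln M_i - ln O_i)^2."""
    n = len(measured)
    d2 = sum((math.log(m) - math.log(o)) ** 2
             for m, o in zip(measured, modelled)) / n
    return math.sqrt(d2)

# Invented paired values: model output within roughly a factor of 1.5
# of the measurements, giving a modest distance.
measured = [10.0, 5.0, 2.0, 8.0]
modelled = [12.0, 4.0, 3.0, 8.0]
print(round(functional_distance(measured, modelled), 3))
```

Working in logs makes d symmetric in over- and under-prediction by the same factor, which is why the index relates naturally to the ratio 'experimental data/model output'.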

  8. An Empirical Analysis of the Impact of Capital Market Activities on ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Impact of Capital Market Activities on the Nigerian Economy. ... Others include the expansion of the stock market in terms of depth and breadth and the attraction of foreign direct investment and foreign portfolio investment into the Nigerian economic landscape. Keywords: Nigeria, Market ...

  9. Perception of urban retailing environments : an empirical analysis of consumer information and usage fields

    NARCIS (Netherlands)

    Timmermans, H.J.P.; vd Heijden, R.E.C.M.; Westerveld, J.

    1982-01-01

    This article reports on an empirical analysis of consumer information and usage fields in the city of Eindhoven. The main purposes of this study are to investigate the distance, sectoral and directional biases of these fields, to analyse whether the degree of biases is related to personal

  10. Steering the Ship through Uncertain Waters: Empirical Analysis and the Future of Evangelical Higher Education

    Science.gov (United States)

    Rine, P. Jesse; Guthrie, David S.

    2016-01-01

    Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…

  11. Deriving Multidimensional Poverty Indicators: Methodological Issues and an Empirical Analysis for Italy

    Science.gov (United States)

    Coromaldi, Manuela; Zoli, Mariangela

    2012-01-01

    Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…

  12. Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004

    Science.gov (United States)

    Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio

    2007-01-01

    The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…

  13. Does risk management contribute to IT project success? A meta-analysis of empirical evidence

    NARCIS (Netherlands)

    de Bakker, K.F.C.; Boonstra, A.; Wortmann, J.C.

    Whether risk management contributes to IT project success has long been considered a relevant question by both the academic and practitioner communities. This paper presents a meta-analysis of the empirical evidence that either supports or opposes the claim that risk

  14. Critical Access Hospitals and Retail Activity: An Empirical Analysis in Oklahoma

    Science.gov (United States)

    Brooks, Lara; Whitacre, Brian E.

    2011-01-01

    Purpose: This paper takes an empirical approach to determining the effect that a critical access hospital (CAH) has on local retail activity. Previous research on the relationship between hospitals and economic development has primarily focused on single-case, multiplier-oriented analysis. However, as the efficacy of federal and state-level rural…

  15. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the factors underlying the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to identify the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. The factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit-scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.
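    As a hedged, self-contained illustration of the kind of logit model such a study estimates (the data and the single "loan size" feature below are entirely invented; the real analysis uses multiple factors derived from interview data):

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=5000):
    """Fit p(default) = sigmoid(w*x + b) by plain gradient descent on the
    log-loss. One hypothetical feature; not the study's actual model."""
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x   # gradient of the log-loss w.r.t. w
            gb += (p - y)       # gradient of the log-loss w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Invented data: larger loans default more often.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
p_small = 1.0 / (1.0 + math.exp(-(w * 0.5 + b)))
p_large = 1.0 / (1.0 + math.exp(-(w * 3.0 + b)))
print(p_small, p_large)  # small loans score below 0.5, large loans above
```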

  16. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

    The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection o

  17. Nonparametric Identification and Estimation of Finite Mixture Models of Dynamic Discrete Choices

    OpenAIRE

    Hiroyuki Kasahara; Katsumi Shimotsu

    2006-01-01

    In dynamic discrete choice analysis, controlling for unobserved heterogeneity is an important issue, and finite mixture models provide flexible ways to account for unobserved heterogeneity. This paper studies nonparametric identifiability of type probabilities and type-specific component distributions in finite mixture models of dynamic discrete choices. We derive sufficient conditions for nonparametric identification for various finite mixture models of dynamic discrete choices used in appli...

  18. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

    We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  19. Pharmacoeconomic analysis of voriconazole vs. caspofungin in the empirical antifungal therapy of febrile neutropenia in Australia.

    Science.gov (United States)

    Al-Badriyeh, Daoud; Liew, Danny; Stewart, Kay; Kong, David C M

    2012-05-01

    In two major clinical trials, voriconazole and caspofungin were recommended as alternatives to liposomal amphotericin B for empirical use in febrile neutropenia. This study investigated the health economic impact of using voriconazole vs. caspofungin in patients with febrile neutropenia. A decision analytic model was developed to measure the downstream consequences of empirical antifungal therapy. The clinical outcomes measured were success, breakthrough infection, persistent baseline infection, persistent fever, premature discontinuation and death. Treatment transition probabilities and patterns were derived directly from data in two relevant randomised controlled trials. Resource use was estimated using an expert clinical panel. Cost inputs were obtained from the latest Australian sources. The analysis adopted the perspective of the Australian hospital system. The use of caspofungin led to a lower expected mean cost per patient than voriconazole (AU$40,558 vs. AU$41,356), a net cost saving of AU$798 (1.9%) per patient. Results were most sensitive to the duration of therapy and the alternative therapy used post-discontinuation. In uncertainty analysis, the cost associated with caspofungin was less than that with voriconazole in 65.5% of cases. This is the first economic evaluation of voriconazole vs. caspofungin for empirical therapy. Caspofungin appears to have a higher probability of yielding cost savings than voriconazole; however, the difference between the two medications does not appear to be statistically significant. © 2011 Blackwell Verlag GmbH.

  20. Are PIIGS so different? An empirical analysis of demand and supply shocks

    Directory of Open Access Journals (Sweden)

    Syssoyeva-Masson Irina

    2017-01-01

    Full Text Available This paper analyses responses to supply and demand shocks in the PIIGS countries. We compare the results obtained for PIIGS with those of Germany and the USA, and also with those of France, which despite its government's efforts demonstrates relatively poor recent economic performance. The main objective of this paper is to establish whether it is still reasonable to consider PIIGS as a group apart. Our methodological strategy is based on Okun's Law (OL), which is incorporated in a Structural Vector Autoregression (SVAR) model with Blanchard-Quah (BQ) restrictions. We address two drawbacks usually present in the OL: the interdependency problem and the non-stationarity problem. By using a non-parametric representation of OL, we identify the heterogeneity between countries. We build stable VAR models for each of the economies and use the BQ SVAR impulses to analyse the importance of contemporary and long-run effects of supply and demand shocks. The main conclusion of this paper is that it no longer makes sense to identify PIIGS as a separate group. Additionally, a country that stands out from our analysis is France. The question can thus be posed: if "PIIGS" signifies "countries with poor economic performance", should not France also belong to this group?

  1. Teaching Nonparametric Statistics Using Student Instrumental Values.

    Science.gov (United States)

    Anderson, Jonathan W.; Diddams, Margaret

    Nonparametric statistics are often difficult to teach in introduction to statistics courses because of the lack of real-world examples. This study demonstrated how teachers can use differences in the rankings and ratings of undergraduate and graduate values to discuss: (1) ipsative and normative scaling; (2) uses of the Mann-Whitney U-test; and…

  2. Nonparametric conditional predictive regions for time series

    NARCIS (Netherlands)

    de Gooijer, J.G.; Zerom Godefay, D.

    2000-01-01

    Several nonparametric predictors based on the Nadaraya-Watson kernel regression estimator have been proposed in the literature. They include the conditional mean, the conditional median, and the conditional mode. In this paper, we consider three types of predictive regions for these predictors — the
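    A minimal sketch of the Nadaraya-Watson conditional-mean estimator underlying these predictors (Gaussian kernel; the bandwidth h and the toy data are our assumptions, not the paper's):

```python
import math

def nadaraya_watson(x_train, y_train, x0, h):
    """Nadaraya-Watson kernel regression: a locally weighted average of y,
    with Gaussian weights K((x0 - xi)/h) centred on the query point x0."""
    weights = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in x_train]
    return sum(w * y for w, y in zip(weights, y_train)) / sum(weights)

# On the line y = x, the estimate at a symmetric interior point is exact.
print(nadaraya_watson([0.0, 1.0, 2.0], [0.0, 1.0, 2.0], 1.0, 1.0))  # approx 1.0
```

    The conditional median and mode predictors mentioned in the record replace this weighted mean with a weighted median or with the argmax of a kernel density estimate, respectively.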

  3. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  4. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  5. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are point...
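    Of the three approaches, the direct method is the simplest to illustrate. A sketch for a zero-mean sequence (our own minimal implementation, not the paper's):

```python
def correlation_function(x, max_lag):
    """Direct (unbiased) estimate of the correlation function of a zero-mean
    sequence: R(k) = 1/(n - k) * sum_i x[i] * x[i + k]."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) / (n - k)
            for k in range(max_lag + 1)]

# An alternating sequence is perfectly anti-correlated at lag 1.
print(correlation_function([1.0, -1.0, 1.0, -1.0, 1.0, -1.0], 2))  # [1.0, -1.0, 1.0]
```

    The divisor n − k makes this estimator unbiased at the price of higher variance at large lags, one of the bias/variance trade-offs that motivate the FFT and Random Decrement alternatives reviewed in the paper.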

  6. Nonparametric estimation in models for unobservable heterogeneity

    OpenAIRE

    Hohmann, Daniel

    2014-01-01

    Nonparametric models which allow for data with unobservable heterogeneity are studied. The first publication introduces new estimators and their asymptotic properties for conditional mixture models. The second publication considers estimation of a function from noisy observations of its Radon transform in a Gaussian white noise model.

  7. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.; Lombard, F.

    2012-01-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal
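    As a hedged illustration of distribution-free estimation in a location-scale family (these are generic robust estimators, not the estimators developed in the paper), the sample median and the median absolute deviation estimate μ and σ, the latter up to a distribution-dependent constant:

```python
import statistics

def location_scale_estimates(sample):
    """Robust nonparametric location/scale estimates:
    mu_hat = sample median; sigma_hat = median absolute deviation (unscaled)."""
    mu = statistics.median(sample)
    sigma = statistics.median([abs(v - mu) for v in sample])
    return mu, sigma

# The outlier 100.0 barely moves either estimate.
print(location_scale_estimates([1.0, 2.0, 3.0, 4.0, 100.0]))  # (3.0, 1.0)
```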

  8. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  9. A Normalization-Free and Nonparametric Method Sharpens Large-Scale Transcriptome Analysis and Reveals Common Gene Alteration Patterns in Cancers.

    Science.gov (United States)

    Li, Qi-Gang; He, Yong-Han; Wu, Huan; Yang, Cui-Ping; Pu, Shao-Yan; Fan, Song-Qing; Jiang, Li-Ping; Shen, Qiu-Shuo; Wang, Xiao-Xiong; Chen, Xiao-Qiong; Yu, Qin; Li, Ying; Sun, Chang; Wang, Xiangting; Zhou, Jumin; Li, Hai-Peng; Chen, Yong-Bin; Kong, Qing-Peng

    2017-01-01

    Heterogeneity in transcriptional data hampers the identification of differentially expressed genes (DEGs) and the understanding of cancer, essentially because current methods rely on cross-sample normalization and/or distribution assumptions, both sensitive to heterogeneous values. Here, we developed a new method, Cross-Value Association Analysis (CVAA), which overcomes this limitation and is more robust to heterogeneous data than the other methods. Applying CVAA to a more complex pan-cancer dataset containing 5,540 transcriptomes discovered numerous new DEGs and many previously rarely explored pathways/processes; some of them were validated, both in vitro and in vivo, to be crucial in tumorigenesis, e.g., alcohol metabolism (ADH1B), chromosome remodeling (NCAPH) and the complement system (Adipsin). Together, we present a sharper tool to navigate large-scale expression data and gain new mechanistic insights into tumorigenesis.

  10. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  11. Are Public-Private Partnerships a Source of Greater Efficiency in Water Supply? Results of a Non-Parametric Performance Analysis Relating to the Italian Industry

    Directory of Open Access Journals (Sweden)

    Corrado lo Storto

    2013-12-01

    Full Text Available This article reports the outcome of a performance study of the water service provision industry in Italy. The study evaluates the efficiency of 21 "private or public-private" equity and 32 "public" equity water service operators and investigates its controlling factors. In particular, the influence that the operator typology and service management nature - private vs. public - has on efficiency is assessed. The study employed a two-stage Data Envelopment Analysis methodology. In the first stage, the operational efficiency of water supply operators is calculated by implementing a conventional BCC DEA model, which uses both physical infrastructure and financial input and output variables to explore economies of scale. In the second stage, bootstrapped DEA and Tobit regression are performed to estimate the influence that a number of environmental factors have on water supplier efficiency. The results show that the integrated water provision industry in Italy is characterized by operational inefficiencies of service operators, and scale and agglomeration economies may have a non-negligible effect on efficiency. In addition, the operator typology and its geographical location affect efficiency.

  12. The effect of marketing expenses on car sales – an empirical analysis

    Directory of Open Access Journals (Sweden)

    Tudose Mihaela Brînduşa

    2017-01-01

    Full Text Available The paper empirically assesses the relationship between marketing expenditures and sales in a highly competitive industry, namely automotive, by analyzing the marketing spending of Automobile Dacia S.A. The first part of the paper presents the state of the art and discusses previous studies on the structure, dynamics and impact of marketing expenses, while the second part consists of an empirical analysis of Automobile Dacia S.A.'s marketing spending. The results of the study show that the company managed to increase its market share by adopting differentiated marketing for each geographical area. Although the research revealed that the percentage of sales allocated to marketing spending is relatively low (5-6%), the analysis of the cost per unit sold reveals a share of 3% for marketing spending.

  13. Empowering Kanban through TPS-Principles - An Empirical Analysis of the Toyota Production System

    OpenAIRE

    Thun , Jörn-Henrik; Drüke , Martin; Grübner , Andre

    2010-01-01

    The purpose of this paper is the empirical investigation of the Toyota Production System in order to test existing relationships as they are proposed in theory. The underlying model consists of seven factors reflecting the key practices of the Toyota Production System. Using data from 188 manufacturing plants participating in the High Performance Manufacturing research project, the model's measurement characteristics were validated through confirmatory factor analysis. Pat...

  14. Risk and Protective Factors of Internet Addiction: A Meta-Analysis of Empirical Studies in Korea

    OpenAIRE

    Koo, Hoon Jung; Kwon, Jung-Hye

    2014-01-01

    Purpose A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Materials and Methods Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and...

  15. Empirical evidence from an inter-industry descriptive analysis of overall materiality measures

    OpenAIRE

    N. Pecchiari; C. Emby; G. Pogliani

    2013-01-01

    This study presents an empirical cross-industry descriptive analysis of overall quantitative materiality measures. We examine the behaviour of four commonly used quantitative materiality measures within and across industries with respect to their size, relative size and stability, over ten years. The sample consists of large- and medium-sized European companies, representing 24 different industry categories for the years 1998 through 2007 (a total sample of over 36,000 data points). Our resul...

  16. Characterizing Social Interaction in Tobacco-Oriented Social Networks: An Empirical Analysis

    OpenAIRE

    Liang, Yunji; Zheng, Xiaolong; Zeng, Daniel Dajun; Zhou, Xingshe; Leischow, Scott James; Chung, Wingyan

    2015-01-01

    Social media is becoming a new battlefield for tobacco 'wars'. Evaluating the current situation is crucial for the advocacy of tobacco control in the age of social media. To reveal the impact of tobacco-related user-generated content, this paper characterizes user interaction and social influence utilizing social network analysis and information-theoretic approaches. Our empirical studies demonstrate that the exploding pro-tobacco content has long-lasting effects with more active users a...

  17. Establishment of Grain Farmers' Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    OpenAIRE

    Zhang, Shuang

    2012-01-01

    Based on farmers' supply behavior theory and price expectations theory, this paper establishes a grain farmers' supply response model for two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy has a price-oriented effect on grain production and supply in those areas. Empirical analysis shows that the minimum purchase price published annually by the government has a significant positive imp...

  18. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations algorithm. New canonical variates are found that indicate that the highest correlation between ocean temperature and height is associated with the build-up of the El Niño during the last half of 1997.

  19. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    OpenAIRE

    Yunqiu Liang

    2013-01-01

    This study is an empirical investigation of the relationship between college teachers' sports participation and their job satisfaction, based on a survey of teachers grouped by participation in sports activities and on statistical analysis in SPSS. Results show that job satisfaction is higher among teachers who participate in sports than among those who do not, and that satisfaction differs with the form of participation. Recommendations for college teachers to address...

  20. An empirical analysis of the relationship between the consumption of alcohol and liver cirrhosis mortality

    DEFF Research Database (Denmark)

    Bentzen, Jan Børsen; Smith, Valdemar

    The question whether intake of alcohol is associated with liver cirrhosis mortality is analyzed using aggregate data on alcohol consumption, alcohol-related diseases and alcohol policies of 16 European countries. The empirical analysis gives support to a close association between cirrhosis mortality and intake of alcohol - and the latter also concerns each of the specific beverages, i.e. spirits, wine and beer, where other studies usually only find evidence of spirits and wine related to liver cirrhosis mortality.

  1. Nonparametric Change Point Diagnosis Method of Concrete Dam Crack Behavior Abnormality

    Directory of Open Access Journals (Sweden)

    Zhanchao Li

    2013-01-01

    Full Text Available The diagnosis of abnormal crack behavior in concrete dams has long been a focus, and a difficulty, in the safety monitoring of hydraulic structures. Based on how concrete dam crack behavior abnormality manifests in parametric and nonparametric statistical models, the internal relation between crack behavior abnormality and statistical change point theory is analyzed in depth, from the model structure instability of the parametric statistical model and the change in the sequence distribution law of the nonparametric statistical model. On this basis, through the reduction of the change point problem, the establishment of a basic nonparametric change point model, and asymptotic analysis of the test method for the basic change point problem, a nonparametric change point diagnosis method for concrete dam crack behavior abnormality is created, taking into consideration that in practice crack behavior may exhibit multiple abnormality points. The method is applied to an actual project, demonstrating its effectiveness and scientific reasonableness. The method has a complete theoretical basis and strong practicality, with broad application prospects in actual projects.
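    To make the nonparametric change point idea concrete, here is a hedged sketch of one classical rank-based statistic (Pettitt-style), shown only as a minimal illustration; it is not the diagnosis method the paper develops:

```python
def sign(v):
    return (v > 0) - (v < 0)

def rank_change_point(x):
    """Locate a single change point nonparametrically:
    U_t = sum over i < t <= j of sign(x[j] - x[i]);
    return the split t maximizing |U_t|, together with U_t."""
    n = len(x)
    best_t, best_u = 1, 0
    for t in range(1, n):
        u = sum(sign(x[j] - x[i]) for i in range(t) for j in range(t, n))
        if abs(u) > abs(best_u):
            best_t, best_u = t, u
    return best_t, best_u

# A level shift after the fourth sample is located at t = 4.
print(rank_change_point([1, 2, 1, 2, 9, 8, 9, 8]))  # (4, 16)
```

    Because only signs of differences enter U_t, the statistic is insensitive to the marginal distribution of the monitored sequence, which is the property the record's "change of sequence distribution law" analysis exploits.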

  2. A simple non-parametric goodness-of-fit test for elliptical copulas

    Directory of Open Access Journals (Sweden)

    Jaser Miriam

    2017-12-01

    Full Text Available In this paper, we propose a simple non-parametric goodness-of-fit test for elliptical copulas of any dimension. It is based on the equality of Kendall’s tau and Blomqvist’s beta for all bivariate margins. Nominal level and power of the proposed test are investigated in a Monte Carlo study. An empirical application illustrates our goodness-of-fit test at work.
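    The building blocks of the test are easy to state: for an elliptical copula the population values of Kendall's tau and Blomqvist's beta coincide, so their empirical versions should be close. A sketch of the two statistics on toy data (the actual test also requires the asymptotic law of their difference, which is not shown here):

```python
import statistics

def _sign(v):
    return (v > 0) - (v < 0)

def kendall_tau(x, y):
    """Empirical Kendall's tau: normalized count of concordant minus
    discordant pairs (no tie correction in this sketch)."""
    n = len(x)
    s = sum(_sign((x[i] - x[j]) * (y[i] - y[j]))
            for i in range(n) for j in range(i + 1, n))
    return 2.0 * s / (n * (n - 1))

def blomqvist_beta(x, y):
    """Empirical Blomqvist's beta: sign agreement relative to the medians."""
    mx, my = statistics.median(x), statistics.median(y)
    return sum(_sign((xi - mx) * (yi - my)) for xi, yi in zip(x, y)) / len(x)

x, y = [1, 2, 3, 4, 5], [2, 4, 6, 8, 10]
print(kendall_tau(x, y), blomqvist_beta(x, y))
```

    A large gap between the two statistics on some bivariate margin is evidence against the elliptical-copula hypothesis, which is the intuition behind the proposed test.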

  3. Calibrating a combined energy systems analysis and controller design method with empirical data

    International Nuclear Information System (INIS)

    Murphy, Gavin Bruce; Counsell, John; Allison, John; Brindley, Joseph

    2013-01-01

    The drive towards low-carbon construction has seen buildings increasingly utilise many different energy systems simultaneously to control the human comfort of the indoor environment, such as ventilation with heat recovery, various heating solutions and applications of renewable energy. This paper describes a dynamic modelling and simulation method (IDEAS – Inverse Dynamics based Energy Assessment and Simulation) for analysing the energy utilisation of a building and its complex servicing systems. The IDEAS case study presented in this paper is based upon small perturbation theory and can be used for the analysis of the performance of complex energy systems and also for the design of smart control systems. This paper presents a process by which any dynamic model can be calibrated against a more empirically based data model, in this case the UK Government's SAP (Standard Assessment Procedure). This work targets building simulation experts analysing the energy use of a building, as well as control engineers designing smart control systems for dwellings. The calibration process presented is transferable and can assist simulation experts in calibrating any dynamic building simulation method against an empirically based method. - Highlights: • Presentation of an energy systems analysis method for assessing the energy utilisation of buildings and their complex servicing systems. • An inverse dynamics based controller design method is detailed. • Method of how a dynamic model can be calibrated with an empirical based model

  4. An empirical analysis of the relationship between cost of control activities and project management success

    Directory of Open Access Journals (Sweden)

    Al-Tmeemy Samiaah

    2018-01-01

    Full Text Available To achieve the objectives of continuous improvement programs, construction managers must link the achievement of quality with cost. This paper aims to associate project management success (PMS) with cost of control (COC) activities in an empirical manner; thus, the purpose is to determine the extent to which COC activities impact PMS. A quantitative method was adopted to collect data from Malaysian building companies using postal and email surveys. The hypothesis is tested using correlation and simple linear regression analysis. The findings of this study indicate that COC activities are positively associated with PMS. The empirical evidence obtained from this research provides financial justification for all quality improvement efforts. This can assist building contractors in enhancing the success of project management by reducing the level of business failures due to poor quality, cost overruns and delays.

  5. Tools for Empirical and Operational Analysis of Mobile Offloading in Loop-Based Applications

    Directory of Open Access Journals (Sweden)

    Alexandru-Corneliu OLTEANU

    2013-01-01

    Full Text Available Offloading for mobile devices is an increasingly popular research topic, matching the popularity mobile devices have in the general population. Studying mobile offloading is challenging because of device and application heterogeneity. However, we believe that focusing on a specific type of application can bring advances in offloading for mobile devices, while still keeping a wide range of applicability. In this paper we focus on loop-based applications, in which most of the functionality is given by iterating an execution loop. We model the main loop of the application with a graph that consists of a cycle and propose an operational analysis to study offloading on this model. We also propose a testbed based on a real-world application to empirically evaluate offloading. We conduct performance evaluation using both tools and compare the analytical and empirical results.

  6. Joint production and corporate pricing: An empirical analysis of joint products in the petroleum industry

    International Nuclear Information System (INIS)

    Karimnejad, H.

    1990-01-01

    This dissertation investigates the pricing mechanism of joint products in large multi-plant and multi-product corporations. The primary objective of the dissertation is to show the consistency of classical theories of production with corporate pricing of joint products. The dissertation has two major parts. Part One provides a theoretical framework for joint production and corporate pricing; in this part, joint production is defined and its historical treatment by classical and contemporary economists is analyzed. Part Two conducts an empirical analysis of joint products in the US petroleum industry. Methods of cost allocation are used in the pricing of each individual petroleum product. Three methods are employed to distribute joint production costs to individual petroleum products: the sales value method, the barrel gravity method and the average unit cost method. The empirical findings of the dissertation provide useful guidelines for the pricing policies of large multi-product corporations

  7. Canonical Least-Squares Monte Carlo Valuation of American Options: Convergence and Empirical Pricing Analysis

    Directory of Open Access Journals (Sweden)

    Xisheng Yu

    2014-01-01

    Full Text Available The paper by Liu (2010) introduces a method termed canonical least-squares Monte Carlo (CLM), which combines a martingale-constrained entropy model and a least-squares Monte Carlo algorithm to price American options. In this paper, we first provide the convergence results of CLM and numerically examine the convergence properties. Then, a comparative analysis is conducted empirically using a large sample of S&P 100 Index (OEX) puts and IBM puts. The results on convergence show that choosing the shifted Legendre polynomials with four regressors is more appropriate considering pricing accuracy and computational cost. With this choice, the CLM method is empirically demonstrated to be superior to the benchmark methods of binomial tree and finite difference with historical volatilities.

  8. Regulatory reforms and productivity: An empirical analysis of the Japanese electricity industry

    International Nuclear Information System (INIS)

    Nakano, Makiko; Managi, Shunsuke

    2008-01-01

    The Japanese electricity industry has experienced regulatory reforms since the mid-1990s. This article measures productivity in Japan's steam power-generation sector and examines the effect of reforms on the productivity of this industry over the period 1978-2003. We estimate the Luenberger productivity indicator, which is a generalization of the commonly used Malmquist productivity index, using a data envelopment analysis approach. Factors associated with productivity change are investigated through dynamic generalized method of moments (GMM) estimation of panel data. Our empirical analysis shows that the regulatory reforms have contributed to productivity growth in the steam power-generation sector in Japan

  9. A Price Index Model for Road Freight Transportation and Its Empirical analysis in China

    Directory of Open Access Journals (Sweden)

    Liu Zhishuo

    2017-01-01

    Full Text Available The aim of a price index for road freight transportation (RFT) is to reflect price changes in the road transport market. First, a price index model for RFT is built on sample data from the Alibaba logistics platform. The model is a three-level index system, comprising a total index, classification indices and individual indices, and the Laspeyres method is applied to calculate these indices. Finally, an empirical analysis of the price index for the RFT market in Zhejiang Province is performed. In order to demonstrate the correctness and validity of the index model, a comparative analysis with port throughput and the PMI index is carried out.
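
    The Laspeyres method named here weights current prices by base-period quantities. A minimal sketch (the item names and figures are illustrative, not from the paper's freight data):

```python
def laspeyres_index(base, current, scale=100.0):
    """Laspeyres price index = scale * sum(p1*q0) / sum(p0*q0).

    base:    {item: (base_price, base_quantity)}
    current: {item: current_price}
    Quantities are held fixed at base-period values, so the index
    measures pure price change for the base-period basket.
    """
    num = sum(current[item] * q0 for item, (p0, q0) in base.items())
    den = sum(p0 * q0 for p0, q0 in base.values())
    return scale * num / den

# Hypothetical two-service freight basket.
base = {"general cargo": (2.0, 10.0), "bulk": (5.0, 4.0)}
current = {"general cargo": 3.0, "bulk": 5.0}
index = laspeyres_index(base, current)  # 100 * 50 / 40 = 125.0
```

    The paper's three-level system amounts to computing this index per individual route, aggregating to classification indices, and then to a total index.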

  10. Analysis of acquisition patterns : A theoretical and empirical evaluation of alternative methods

    NARCIS (Netherlands)

    Paas, LJ; Molenaar, IW

    The order in which consumers acquire nonconsumable products, such as durable and financial products, provides key information for marketing activities, for example, cross-sell lead generation. This paper advocates the desirable features of nonparametric scaling for analyzing acquisition patterns. We

  11. Analysis of acquisition patterns: A theoretical and empirical evaluation of alternative methods

    NARCIS (Netherlands)

    Paas, L.J.; Molenaar, I.W.

    2005-01-01

    The order in which consumers acquire nonconsumable products, such as durable and financial products, provides key information for marketing activities, for example, cross-sell lead generation. This paper advocates the desirable features of nonparametric scaling for analyzing acquisition patterns. We

  12. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among...

  13. Nonparametric Bayes Modeling of Multivariate Categorical Data.

    Science.gov (United States)

    Dunson, David B; Xing, Chuanhua

    2012-01-01

    Modeling of multivariate unordered categorical (nominal) data is a challenging problem, particularly in high dimensions and cases in which one wishes to avoid strong assumptions about the dependence structure. Commonly used approaches rely on the incorporation of latent Gaussian random variables or parametric latent class models. The goal of this article is to develop a nonparametric Bayes approach, which defines a prior with full support on the space of distributions for multiple unordered categorical variables. This support condition ensures that we are not restricting the dependence structure a priori. We show this can be accomplished through a Dirichlet process mixture of product multinomial distributions, which is also a convenient form for posterior computation. Methods for nonparametric testing of violations of independence are proposed, and the methods are applied to model positional dependence within transcription factor binding motifs.
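
    The Dirichlet process mixture of product multinomials described above can be sketched with a truncated stick-breaking construction (a generic prior-simulation sketch, not the authors' posterior computation; function names are mine):

```python
import random

def draw_dp_product_multinomial(alpha, n_vars, n_cats, truncation=50, seed=3):
    """Draw one mixture from a truncated stick-breaking DP prior whose
    components are products of independent multinomials (one per variable)."""
    random.seed(seed)
    weights, remaining = [], 1.0
    for _ in range(truncation - 1):
        b = random.betavariate(1.0, alpha)
        weights.append(remaining * b)
        remaining *= 1.0 - b
    weights.append(remaining)  # absorb the tail so the weights sum to one
    components = []
    for _ in range(truncation):
        comp = []
        for _ in range(n_vars):
            raw = [random.random() for _ in range(n_cats)]
            s = sum(raw)
            comp.append([v / s for v in raw])  # multinomial cell probabilities
        components.append(comp)
    return weights, components

def joint_probability(weights, components, cell):
    """P(x = cell) = sum_k w_k * prod_j psi_kj[cell_j]."""
    total = 0.0
    for w, comp in zip(weights, components):
        p = w
        for j, c in enumerate(cell):
            p *= comp[j][c]
        total += p
    return total
```

    Although each component factorizes across variables, the mixture itself can represent arbitrary dependence among the categorical variables, which is the full-support property the article exploits.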

  14. Fast multidimensional ensemble empirical mode decomposition for the analysis of big spatio-temporal datasets.

    Science.gov (United States)

    Wu, Zhaohua; Feng, Jiaxin; Qiao, Fangli; Tan, Zhe-Min

    2016-04-13

    In this big data era, it is more urgent than ever to solve two major issues: (i) fast data transmission methods that can facilitate access to data from non-local sources and (ii) fast and efficient data analysis methods that can reveal the key information from the available data for particular purposes. Although approaches in different fields to address these two questions may differ significantly, the common part must involve data compression techniques and a fast algorithm. This paper introduces the recently developed adaptive and spatio-temporally local analysis method, namely the fast multidimensional ensemble empirical mode decomposition (MEEMD), for the analysis of a large spatio-temporal dataset. The original MEEMD uses ensemble empirical mode decomposition to decompose time series at each spatial grid and then pieces together the temporal-spatial evolution of climate variability and change on naturally separated timescales, which is computationally expensive. By taking advantage of the high efficiency of the expression using principal component analysis/empirical orthogonal function analysis for spatio-temporally coherent data, we design a lossy compression method for climate data to facilitate its non-local transmission. We also explain the basic principles behind the fast MEEMD through decomposing principal components instead of original grid-wise time series to speed up computation of MEEMD. Using a typical climate dataset as an example, we demonstrate that our newly designed methods can (i) compress data with a compression rate of one to two orders of magnitude; and (ii) speed up the MEEMD algorithm by one to two orders of magnitude. © 2016 The Authors.

  15. Network structure exploration via Bayesian nonparametric models

    International Nuclear Information System (INIS)

    Chen, Y; Wang, X L; Xiang, X; Tang, B Z; Bu, J Z

    2015-01-01

    Complex networks provide a powerful mathematical representation of complex systems in nature and society. To understand complex networks, it is crucial to explore their internal structures, also called structural regularities. The task of network structure exploration is to determine how many groups there are in a complex network and how to group the nodes of the network. Most existing structure exploration methods need to specify either a group number or a certain type of structure when they are applied to a network. In the real world, however, the group number and the type of structure that a network has are usually unknown in advance. To explore structural regularities in complex networks automatically, without any prior knowledge of the group number or the type of structure, we use Bayesian nonparametric theory to extend a probabilistic mixture model that can handle networks with any type of structure but otherwise requires a pre-specified group number. We also propose a novel Bayesian nonparametric model, called the Bayesian nonparametric mixture (BNPM) model. Experiments conducted on a large number of networks with different structures show that the BNPM model is able to explore structural regularities in networks automatically with a stable, state-of-the-art performance. (paper)

  16. Portfolio Optimization Based on Nonparametric Estimation Methods

    Directory of Open Access Journals (Sweden)

    Mahsa Ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is deciding which stocks are appropriate for investment and selecting an optimal portfolio. This process is carried out through assessment of risk and expected return. In the portfolio selection problem, if asset returns are normally distributed, variance and standard deviation are used as the risk measure. However, expected asset returns are not necessarily normal and sometimes differ dramatically from the normal distribution. This paper introduces conditional value at risk (CVaR) as a risk measure in a nonparametric framework and, for a given expected return, derives the optimal portfolio; the method is compared with the linear programming method. The data used in this study consist of the monthly returns of 15 companies selected from the top 50 companies in the Tehran Stock Exchange during the winter of 1392; the sample period runs from April 1388 to June 1393 (Iranian calendar). The results of this study show the superiority of the nonparametric method over the linear programming method, and the nonparametric method is much faster than the linear programming method.
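
    In a nonparametric framework, CVaR can be estimated directly from the empirical return distribution without any normality assumption. A minimal historical-simulation sketch (the return sample is illustrative, not the paper's Tehran Stock Exchange data):

```python
def historical_cvar(returns, alpha=0.95):
    """Conditional value at risk (expected shortfall) from a return
    sample: the mean loss over the worst (1 - alpha) fraction of
    outcomes. No distributional assumption is made."""
    losses = sorted((-r for r in returns), reverse=True)  # losses, worst first
    k = max(1, int(round(len(losses) * (1.0 - alpha))))
    return sum(losses[:k]) / k
```

    In the portfolio problem, this estimator is evaluated on the weighted portfolio return series, and the weights are chosen to minimize CVaR subject to the target expected return.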

  17. Nonparametric Mixture Models for Supervised Image Parcellation.

    Science.gov (United States)

    Sabuncu, Mert R; Yeo, B T Thomas; Van Leemput, Koen; Fischl, Bruce; Golland, Polina

    2009-09-01

    We present a nonparametric, probabilistic mixture model for the supervised parcellation of images. The proposed model yields segmentation algorithms conceptually similar to the recently developed label fusion methods, which register a new image with each training image separately. Segmentation is achieved via the fusion of transferred manual labels. We show that in our framework various settings of a model parameter yield algorithms that use image intensity information differently in determining the weight of a training subject during fusion. One particular setting computes a single, global weight per training subject, whereas another setting uses locally varying weights when fusing the training data. The proposed nonparametric parcellation approach capitalizes on recently developed fast and robust pairwise image alignment tools. The use of multiple registrations allows the algorithm to be robust to occasional registration failures. We report experiments on 39 volumetric brain MRI scans with expert manual labels for the white matter, cerebral cortex, ventricles and subcortical structures. The results demonstrate that the proposed nonparametric segmentation framework yields significantly better segmentation than state-of-the-art algorithms.

  18. Robustifying Bayesian nonparametric mixtures for count data.

    Science.gov (United States)

    Canale, Antonio; Prünster, Igor

    2017-03-01

    Our motivating application stems from surveys of natural populations and is characterized by large spatial heterogeneity in the counts, which makes parametric approaches to modeling local animal abundance too restrictive. We adopt a Bayesian nonparametric approach based on mixture models and innovate with respect to the popular Dirichlet process mixture of Poisson kernels by increasing the model flexibility at the level of both the kernel and the nonparametric mixing measure. This allows us to derive accurate and robust estimates of the distribution of local animal abundance and of the corresponding clusters. The application and a simulation study for different scenarios also yield some general methodological implications. Adding flexibility solely at the level of the mixing measure does not improve inferences, since its impact is severely limited by the rigidity of the Poisson kernel, with considerable consequences in terms of bias. However, once a kernel more flexible than the Poisson is chosen, inferences can be robustified by choosing a prior more general than the Dirichlet process. Therefore, to improve the performance of Bayesian nonparametric mixtures for count data one has to enrich the model simultaneously at both levels, the kernel and the mixing measure. © 2016, The International Biometric Society.

  19. Biofuels policy and the US market for motor fuels: Empirical analysis of ethanol splashing

    Energy Technology Data Exchange (ETDEWEB)

    Walls, W.D., E-mail: wdwalls@ucalgary.ca [Department of Economics, University of Calgary, 2500 University Drive NW, Calgary, Alberta, T2N 1N4 (Canada); Rusco, Frank; Kendix, Michael [US GAO (United States)

    2011-07-15

    Low ethanol prices relative to the price of gasoline blendstock, and tax credits, have resulted in discretionary blending at wholesale terminals of ethanol into fuel supplies above required levels-a practice known as ethanol splashing in industry parlance. No one knows precisely where or in what volume ethanol is being blended with gasoline and this has important implications for motor fuels markets: Because refiners cannot perfectly predict where ethanol will be blended with finished gasoline by wholesalers, they cannot know when to produce and where to ship a blendstock that when mixed with ethanol at 10% would create the most economically efficient finished motor gasoline that meets engine standards and has comparable evaporative emissions as conventional gasoline without ethanol blending. In contrast to previous empirical analyses of biofuels that have relied on highly aggregated data, our analysis is disaggregated to the level of individual wholesale fuel terminals or racks (of which there are about 350 in the US). We incorporate the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal. The empirical analysis illustrates how ethanol and gasoline prices affect ethanol usage, controlling for fuel specifications, blend attributes, and city-terminal-specific effects that, among other things, control for differential costs of delivering ethanol from bio-refinery to wholesale rack. - Research Highlights: > Low ethanol prices and tax credits have resulted in discretionary blending of ethanol into fuel supplies above required levels. > This has important implications for motor fuels markets and vehicular emissions. > Our analysis incorporates the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal.

  20. Biofuels policy and the US market for motor fuels: Empirical analysis of ethanol splashing

    International Nuclear Information System (INIS)

    Walls, W.D.; Rusco, Frank; Kendix, Michael

    2011-01-01

    Low ethanol prices relative to the price of gasoline blendstock, and tax credits, have resulted in discretionary blending at wholesale terminals of ethanol into fuel supplies above required levels-a practice known as ethanol splashing in industry parlance. No one knows precisely where or in what volume ethanol is being blended with gasoline and this has important implications for motor fuels markets: Because refiners cannot perfectly predict where ethanol will be blended with finished gasoline by wholesalers, they cannot know when to produce and where to ship a blendstock that when mixed with ethanol at 10% would create the most economically efficient finished motor gasoline that meets engine standards and has comparable evaporative emissions as conventional gasoline without ethanol blending. In contrast to previous empirical analyses of biofuels that have relied on highly aggregated data, our analysis is disaggregated to the level of individual wholesale fuel terminals or racks (of which there are about 350 in the US). We incorporate the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal. The empirical analysis illustrates how ethanol and gasoline prices affect ethanol usage, controlling for fuel specifications, blend attributes, and city-terminal-specific effects that, among other things, control for differential costs of delivering ethanol from bio-refinery to wholesale rack. - Research highlights: → Low ethanol prices and tax credits have resulted in discretionary blending of ethanol into fuel supplies above required levels. → This has important implications for motor fuels markets and vehicular emissions. → Our analysis incorporates the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal.

  1. [Nonparametric method of estimating survival functions containing right-censored and interval-censored data].

    Science.gov (United States)

    Xu, Yonghong; Gao, Xiaohuan; Wang, Zhengxi

    2014-04-01

    Missing data represent a general problem in many scientific fields, especially in medical survival analysis. In dealing with censored data, interpolation is one of the important methods. However, most interpolation methods replace the censored data with exact data, which distorts the real distribution of the censored data and reduces the probability of the real data falling into the interpolation data. In order to solve this problem, we in this paper propose a nonparametric method of estimating the survival function of right-censored and interval-censored data and compare its performance to the SC (self-consistent) algorithm. Compared to the average interpolation and nearest neighbor interpolation methods, the proposed method replaces the right-censored data with interval-censored data, greatly improving the probability of the real data falling into the imputation interval. It then uses empirical distribution theory to estimate the survival function of right-censored and interval-censored data. The results on numerical examples and a real breast cancer data set demonstrate that the proposed method has higher accuracy and better robustness across different proportions of censored data. This paper provides a good method for comparing the performance of clinical treatments by estimating the survival data of patients, and thus offers some help to medical survival data analysis.

  2. Exact nonparametric confidence bands for the survivor function.

    Science.gov (United States)

    Matthews, David

    2013-10-12

    A method to produce exact simultaneous confidence bands for the empirical cumulative distribution function that was first described by Owen, and subsequently corrected by Jager and Wellner, is the starting point for deriving exact nonparametric confidence bands for the survivor function of any positive random variable. We invert a nonparametric likelihood test of uniformity, constructed from the Kaplan-Meier estimator of the survivor function, to obtain simultaneous lower and upper bands for the function of interest with specified global confidence level. The method involves calculating a null distribution and associated critical value for each observed sample configuration. However, Noe recursions and the Van Wijngaarden-Decker-Brent root-finding algorithm provide the necessary tools for efficient computation of these exact bounds. Various aspects of the effect of right censoring on these exact bands are investigated, using as illustrations two observational studies of survival experience among non-Hodgkin's lymphoma patients and a much larger group of subjects with advanced lung cancer enrolled in trials within the North Central Cancer Treatment Group. Monte Carlo simulations confirm the merits of the proposed method of deriving simultaneous interval estimates of the survivor function across the entire range of the observed sample. This research was supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada. It was begun while the author was visiting the Department of Statistics, University of Auckland, and completed during a subsequent sojourn at the Medical Research Council Biostatistics Unit in Cambridge. The support of both institutions, in addition to that of NSERC and the University of Waterloo, is greatly appreciated.
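
    The confidence bands above are built around the Kaplan-Meier estimator of the survivor function. As background, the product-limit estimator itself can be sketched in a few lines (a minimal version handling right censoring and tied event times; the toy data are illustrative):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times:  observed follow-up times
    events: 1 if the event was observed at that time, 0 if right-censored
    Returns [(t, S(t))] at each distinct event time.
    """
    data = sorted(zip(times, events))
    survival, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        at_risk = sum(1 for tt, _ in data if tt >= t)       # n at risk at t
        if d > 0:
            survival *= 1.0 - d / at_risk                   # product-limit step
            curve.append((t, survival))
        i += sum(1 for tt, _ in data if tt == t)            # skip past ties
    return curve

# Five subjects; censoring at times 3 and 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

    The exact bands of the record are then obtained by inverting a nonparametric likelihood test of uniformity around this step function, with critical values computed per sample configuration.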

  3. Empirical Analysis of the Integration Activity of Business Structures in the Regions of Russia

    Directory of Open Access Journals (Sweden)

    Maria Gennadyevna Karelina

    2015-12-01

    Full Text Available The article investigates the integration activity of business structures in the regions of Russia. The wide variety of approaches to the study of the problems and prospects of economic integration, and the ongoing dispute on the role of integration processes in regional economic development, have made it necessary to define the concepts of “integration” and “integration activity” so as to establish objective grounds for analysing the integration activity of business structures in the Russian regions. Monitoring of the current legal system of the Russian Federation in the area of statistics and the compilation of statistical databases on mergers and acquisitions has shown that no formal executive authority deals with collecting and compiling information on integration activity at the regional level. In this connection, the information and analytical base is compiled from the data of Russian information and analytical agencies. As the research tools, the methods of analysis of structural changes, methods of analysis of economic differentiation and concentration, and methods of non-parametric statistics are used. The article shows the close relationship between the social and economic development of the subjects of Russia and the integrated business structures functioning on their territory. An investigation of the structure and dynamics of integration activity in the subjects of the Russian Federation, based on statistical data for the period from 2003 to 2012, has revealed the increasing heterogeneity of the integration activity of business structures in the regions of Russia. The hypothesis of a substantial divergence of mergers and acquisitions of corporate structures in the Russian regions was confirmed by the high values of the Gini coefficient, the Herfindahl index, and the decile coefficient of differentiation. The research results are of practical importance since they can be used to improve the existing
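
    The Gini coefficient used above to measure regional divergence can be computed directly from a sample with the standard rank-based formula (a generic sketch; the values below are illustrative, not the article's regional M&A data):

```python
def gini(values):
    """Gini coefficient via the rank formula on sorted data:
    G = 2 * sum_i(i * x_i) / (n * sum(x)) - (n + 1) / n, 1-based ranks.
    0 = perfect equality, values near 1 = extreme concentration."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    ranked = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * ranked / (n * total) - (n + 1.0) / n
```

    The Herfindahl index (sum of squared shares) and the decile ratio used alongside it in the article are equally direct to compute from the same regional totals.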

  4. The demand for gasoline in South Africa. An empirical analysis using co-integration techniques

    International Nuclear Information System (INIS)

    Akinboade, Oludele A.; Ziramba, Emmanuel; Kumo, Wolassa L.

    2008-01-01

    Using the recently developed Autoregressive Distributed Lag (ARDL) bound testing approach to co-integration, suggested by Pesaran et al. (Pesaran, M.H., Shin, Y., Smith, R.J. Bounds Testing Approaches to the Analysis of Level Relationships. Journal of Applied Econometrics 2001; 16(3) 289-326), we empirically analyzed the long-run relationship among the variables in the aggregate gasoline demand function over the period 1978-2005. Our study confirms the existence of a co-integrating relationship. The estimated price and income elasticities of - 0.47 and 0.36 imply that gasoline demand in South Africa is price and income inelastic. (author)
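
    The reported elasticities of -0.47 (price) and 0.36 (income) are coefficients of a log-log demand equation, in which a slope directly measures the percentage response of demand to a one-percent change in the regressor. A minimal single-regressor illustration with synthetic data constructed to have elasticity -0.47 (not the study's South African series):

```python
import math

def ols_slope(xs, ys):
    """Slope of a simple least-squares regression of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# ln(q) = e * ln(p): the slope e is the price elasticity of demand.
prices = [1.0, 1.2, 1.5, 2.0, 2.5]
quantities = [p ** -0.47 for p in prices]  # synthetic, elasticity -0.47
elasticity = ols_slope([math.log(p) for p in prices],
                       [math.log(q) for q in quantities])
```

    |e| < 1 is what "price and income inelastic" means in the abstract: a 1% price rise reduces gasoline demand by only about 0.47%.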

  5. Federalism and regional health care expenditures: an empirical analysis for the Swiss cantons.

    Science.gov (United States)

    Crivelli, Luca; Filippini, Massimo; Mosca, Ilaria

    2006-05-01

    Switzerland (7.2 million inhabitants) is a federal state composed of 26 cantons. The autonomy of cantons and a particular health insurance system create strong heterogeneity in terms of regulation and organisation of health care services. In this study we use a single-equation approach to model the per capita cantonal expenditures on health care services and postulate that per capita health expenditures depend on some economic, demographic and structural factors. The empirical analysis demonstrates that a larger share of old people tends to increase health costs and that physicians paid on a fee-for-service basis swell expenditures, thus highlighting a possible phenomenon of supply-induced demand.

  6. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases....... Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual...

  7. The demand for gasoline in South Africa. An empirical analysis using co-integration techniques

    Energy Technology Data Exchange (ETDEWEB)

    Akinboade, Oludele A.; Ziramba, Emmanuel; Kumo, Wolassa L. [Department of Economics, University of South Africa, P.O.Box 392, Pretoria 0003 (South Africa)

    2008-11-15

    Using the recently developed Autoregressive Distributed Lag (ARDL) bound testing approach to co-integration, suggested by Pesaran et al. (Pesaran, M.H., Shin, Y., Smith, R.J. Bounds Testing Approaches to the Analysis of Level Relationships. Journal of Applied Econometrics 2001; 16(3) 289-326), we empirically analyzed the long-run relationship among the variables in the aggregate gasoline demand function over the period 1978-2005. Our study confirms the existence of a co-integrating relationship. The estimated price and income elasticities of - 0.47 and 0.36 imply that gasoline demand in South Africa is price and income inelastic. (author)

  8. Population density and efficiency in energy consumption: An empirical analysis of service establishments

    International Nuclear Information System (INIS)

    Morikawa, Masayuki

    2012-01-01

    This study, using novel establishment-level microdata from the Energy Consumption Statistics, empirically analyzes the effect of urban density on energy intensity in the service sector. According to the analysis, the efficiency of energy consumption in service establishments is higher in densely populated cities. Quantitatively, after controlling for differences among industries, energy efficiency increases by approximately 12% when municipal population density doubles. This result suggests that, given a structural transformation toward the service economy, deregulation of excessive restrictions hindering urban agglomeration, and investment in infrastructure in city centers would contribute to environmentally friendly economic growth.

  9. Empirical Requirements Analysis for Mars Surface Operations Using the Flashline Mars Arctic Research Station

    Science.gov (United States)

    Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)

    2001-01-01

    Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy, without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in analog (Mars-like) settings to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the High Canadian Arctic.

  10. An Empirical Analysis of the Changing Role of the German Bundesbank after 1983

    DEFF Research Database (Denmark)

    Juselius, Katarina

    A cointegrated VAR model describing a small macroeconomic system consisting of money, income, prices, and interest rates is estimated on split sample data before and after 1983. The monetary mechanisms were found to be significantly different. Before 1983, the money supply seemed controllable, and expansion or contraction of money supply had the expected effect on prices, income, and interest rates. After 1983, the conventional mechanisms no longer seemed to work. The empirical analysis pointed to the crucial role of the bond rate in the system, particularly for the more recent period.

  11. The incident of repetitive demands resolution in consumer affairs: empirical analysis of legal feasibility

    Directory of Open Access Journals (Sweden)

    Lucas do Monte Silva

    2017-05-01

    Full Text Available Faced with the massification of lawsuits, this article analyzes the main arguments and questions raised in demands related to moral damages and health plans before Santa Catarina's Court of Justice, in order to assess the possible application of the incident of repetitive demands resolution of the new Civil Procedure Code. To do so, it first analyzes the current context of the Brazilian judiciary, presenting the setting of repetitive demands and massification of contracts, along with introductory aspects of the incident of repetitive demands resolution. It then conducts a quantitative and qualitative judicial empirical analysis through a case study of the Santa Catarina Court of Justice: a descriptive cross-sectional empirical study of the demands related to the issue highlighted above, in order to present an 'argumentative radiography' of the judgments of that Court. The results confirmed the possibility of applying the IRDR to repetitive demands on the subjects of this study, with due legal caution, taking into account the high number of 'issues of fact' involved in lawsuits that include, among their claims, compensation for moral damages.

  12. An empirical examination of restructured electricity prices

    International Nuclear Information System (INIS)

    Knittel, C.R.; Roberts, M.R.

    2005-01-01

    We present an empirical analysis of restructured electricity prices. We study the distributional and temporal properties of the price process in a non-parametric framework, after which we parametrically model the price process using several common asset price specifications from the asset-pricing literature, as well as several less conventional models motivated by the peculiarities of electricity prices. The findings reveal several characteristics unique to electricity prices, including several deterministic components of the price series at different frequencies. An 'inverse leverage effect' is also found, where positive shocks to the price series result in larger increases in volatility than negative shocks. We find that forecasting performance is dramatically improved when we incorporate features of electricity prices not commonly modelled in other asset prices. Our findings have implications for how empiricists model electricity prices, as well as how theorists specify models of energy pricing. (author)

  13. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    Science.gov (United States)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.
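
    The two headline numbers in this record are linked: for log-periodic oscillations in ln(tc - t), the fundamental log-frequency f and the preferred scaling ratio λ satisfy f·ln λ = 1, i.e. λ = exp(1/f). A quick consistency check of the reported values (this identity is the standard log-periodicity relation; the check is mine, not from the record):

```python
import math

# Reported fundamental log-frequency and implied scaling ratio.
f = 1.02
lam = math.exp(1.0 / f)  # should reproduce the reported lambda of about 2.67
```

    The agreement (exp(1/1.02) is approximately 2.666) confirms that the two estimates are internally consistent rather than independent findings.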

  14. Empirical analysis of persistence and dependence patterns among the capital markets

    Czech Academy of Sciences Publication Activity Database

    Vošvrda, Miloslav

    2006-01-01

    Roč. 15, č. 3 (2006), s. 231-243 ISSN 1210-0455 R&D Projects: GA ČR GA402/05/0115 Grant - others:GA UK(CZ) 454/2004/A-EK FSV Institutional research plan: CEZ:AV0Z10750506 Keywords : dependence structure * non-parametric univariate * multivariate measures of the shock persistence Subject RIV: AH - Economics

  15. Minimizing the trend effect on detrended cross-correlation analysis with empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhao Xiaojun; Shang Pengjian; Zhao Chuang; Wang Jing; Tao Rui

    2012-01-01

    Highlights: ► Investigate the effects of linear, exponential and periodic trends on DCCA. ► Apply empirical mode decomposition to extract trend term. ► Strong and monotonic trends are successfully eliminated. ► Get the cross-correlation exponent in a persistent behavior without crossover. - Abstract: Detrended cross-correlation analysis (DCCA) is a scaling method commonly used to estimate long-range power-law cross-correlation in non-stationary signals. However, the susceptibility of DCCA to trends makes the scaling results difficult to analyze due to spurious crossovers. We artificially generate long-range cross-correlated signals and systematically investigate the effect of linear, exponential and periodic trends. Specifically, to address the crossovers raised by trends, we apply the empirical mode decomposition method, which decomposes the underlying signals into several intrinsic mode functions (IMF) and a residual trend. After the removal of the residual term, strong and monotonic trends such as linear and exponential trends are successfully eliminated. A periodic trend, however, cannot be separated out under the IMF criterion; it can instead be eliminated by Fourier transform. As a special case of DCCA, detrended fluctuation analysis presents similar results.
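
    Omitting the EMD preprocessing, the DCCA fluctuation function itself can be sketched in a few lines of numpy (a didactic sketch under our own naming, not the authors' implementation): integrate both series into profiles, detrend each profile locally within boxes of size n, and average the covariance of the residuals.

```python
import numpy as np

def dcca_fluctuation(x, y, n):
    """DCCA fluctuation F(n): covariance of locally detrended profiles."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())  # integrated profiles
    N = (len(X) // n) * n
    cov = []
    for start in range(0, N, n):
        idx = np.arange(start, start + n)
        t = np.arange(n)
        # remove the local linear trend from each profile segment
        px = np.polyval(np.polyfit(t, X[idx], 1), t)
        py = np.polyval(np.polyfit(t, Y[idx], 1), t)
        cov.append(np.mean((X[idx] - px) * (Y[idx] - py)))
    return np.sqrt(np.abs(np.mean(cov)))

rng = np.random.default_rng(0)
z = rng.standard_normal(4096)
x = z + 0.1 * rng.standard_normal(4096)   # two noisy observations of
y = z + 0.1 * rng.standard_normal(4096)   # a common white-noise component
scales = np.array([8, 16, 32, 64, 128])
F = np.array([dcca_fluctuation(x, y, n) for n in scales])
# slope of log F(n) vs log n estimates the cross-correlation exponent
lam = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

    For a shared white-noise component the log-log slope should come out near 0.5, the uncorrelated-increments value; trends in x or y would distort F(n) and produce the spurious crossovers the paper addresses with EMD.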

  16. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    Science.gov (United States)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues have an increasing popularity among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) of implementing GSCM practices have been identified from the literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. In total, 123 complete questionnaires were collected from Indian automobile organizations and used for empirical analysis of GSCM practices in the Indian automobile industry. Descriptive statistics have been used to assess the current implementation status of GSCM practices, and multiple regression analysis has been carried out to assess the impact of currently adopted GSCM practices on expected organizational performance outcomes. The results of the study suggested that environmental, economic, social and operational performances improve with the implementation of GSCM practices. This paper may play an important role in understanding various GSCM implementation issues and help practicing managers to improve their performances in the supply chain.

  17. A Nonparametric Test for Seasonal Unit Roots

    OpenAIRE

    Kunst, Robert M.

    2009-01-01

    Abstract: We consider a nonparametric test for the null of seasonal unit roots in quarterly time series that builds on the RUR (records unit root) test by Aparicio, Escribano, and Sipols. We find that the test concept is more promising than a formalization of visual aids such as plots by quarter. In order to cope with the sensitivity of the original RUR test to autocorrelation under its null of a unit root, we suggest an augmentation step by autoregression. We present some evidence on the siz...
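
    The records idea behind the RUR test can be illustrated with a minimal, purely didactic Python sketch (not the authors' statistic): a unit-root process keeps setting new records as its level wanders, whereas a stationary series quickly stops producing them. All names and parameters here are our own.

```python
import random

def record_count(series):
    """Number of upper records (new running maxima) in a series."""
    best = float("-inf")
    count = 0
    for v in series:
        if v > best:
            best, count = v, count + 1
    return count

random.seed(42)
steps = [random.gauss(0, 1) for _ in range(2000)]

# random walk (unit root): the level wanders, so records keep arriving
walk, level = [], 0.0
for s in steps:
    level += s
    walk.append(level)

# stationary AR(1): the level reverts, so records soon dry up
ar, prev = [], 0.0
for s in steps:
    prev = 0.5 * prev + s
    ar.append(prev)
```

    Comparing `record_count(walk)` with `record_count(ar)` typically shows many more records for the random walk; the RUR-type tests formalize this contrast into a test statistic, and the paper's augmentation step addresses autocorrelation under the null.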

  18. Partnership effectiveness in primary community care networks: A national empirical analysis of partners' coordination infrastructure designs.

    Science.gov (United States)

    Lin, Blossom Yen-Ju; Lin, Yung-Kai; Lin, Cheng-Chieh

    2010-01-01

    Previous empirical and managerial studies have ignored the effectiveness of integrated health networks. It has been argued that the varying definitions and strategic imperatives of integrated organizations may have complicated the assessment of the outcomes/performance of varying models, particularly when their market structures and contexts differed. This study aimed to empirically verify a theoretical perspective on the coordination infrastructure designs and the effectiveness of the primary community care networks (PCCNs) formed and funded by the Bureau of National Health Insurance since March 2003. The PCCNs present a model to replace the traditional fragmented providers in Taiwan's health care. The study used a cross-sectional mailed survey designed to ascertain partnership coordination infrastructure and integration of governance, clinical care, bonding, finances, and information. The outcome indicators were PCCNs' perceived performance and willingness to remain within the network. Structural equation modeling examined the causal relationships, controlling for organizational and environmental factors. Primary data collection occurred from February through December 2005, via structured questionnaires sent to 172 PCCNs. Using the individual PCCN as the unit of analysis, the results found that a network's efforts regarding coordination infrastructures were positively related to the PCCN's perceived performance and willingness to remain within the network. In addition, PCCNs practicing in rural areas and in areas with a higher density of medical resources had better perceived effectiveness and willingness to cooperate in the network. Practical Implication: The lack of both an operational definition and information about system-wide integration may have obstructed understanding of integrated health networks' organizational dynamics. This study empirically examined individual PCCNs and offers new insights on how to improve networks' organizational design and

  19. Review of the human reliability analysis performed for Empire State Electric Energy Research Corporation

    International Nuclear Information System (INIS)

    Swart, D.; Banz, I.

    1985-01-01

    The Empire State Electric Energy Research Corporation (ESEERCO) commissioned Westinghouse to conduct a human reliability analysis to identify and quantify human error probabilities associated with operator actions for four specific events which may occur in light water reactors: loss of coolant accident, steam generator tube rupture, steam/feed line break, and stuck open pressurizer spray valve. Human Error Probabilities (HEPs) derived from Swain's Technique for Human Error Rate Prediction (THERP) were compared to data obtained from simulator exercises. A correlation was found between the HEPs derived from Swain and the results of the simulator data. The results of this study provide a unique insight into human factors analysis. The HEPs obtained from such probabilistic studies can be used to prioritize scenarios for operator training situations, and thus improve the correlation between simulator exercises and real control room experiences

  20. Decoupling Economic Growth and Energy Use. An Empirical Cross-Country Analysis for 10 Manufacturing Sectors

    Energy Technology Data Exchange (ETDEWEB)

    Mulder, P. [International Institute for Applied Systems Analysis, Laxenburg (Austria); De Groot, H.L.F. [Faculty of Economics and Business Administration, Vrije Universiteit, Amsterdam (Netherlands)

    2004-07-01

    This paper provides an empirical analysis of decoupling economic growth and energy use and its various determinants by exploring trends in energy- and labour productivity across 10 manufacturing sectors and 14 OECD countries for the period 1970-1997. We explicitly aim to trace back aggregate developments in the manufacturing sector to developments at the level of individual subsectors. A cross-country decomposition analysis reveals that in some countries structural changes contributed considerably to aggregate manufacturing energy-productivity growth and, hence, to decoupling, while in other countries they partly offset energy-efficiency improvements. In contrast, structural changes only play a minor role in explaining aggregate manufacturing labour-productivity developments. Furthermore, we find labour-productivity growth to be higher on average than energy-productivity growth. Over time, this bias towards labour-productivity growth is increasing in the aggregate manufacturing sector, while it is decreasing in most manufacturing subsectors.

  1. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    International Nuclear Information System (INIS)

    Wang Wen-Bo; Zhang Xiao-Dong; Chang Yuchan; Wang Xiang-Li; Wang Zhao; Chen Xi; Zheng Lei

    2016-01-01

    In this paper, a new method to reduce noise within chaotic signals based on ICA (independent component analysis) and EMD (empirical mode decomposition) is proposed. The basic idea is first to decompose the chaotic signals and construct multidimensional input vectors on the basis of EMD and its translation invariance. Second, independent component analysis is performed on the input vectors, which amounts to a self-adaptive denoising of the intrinsic mode functions (IMFs) of the chaotic signals. Finally, the IMFs are recombined into the denoised chaotic signal. Experiments were carried out on the Lorenz chaotic signal contaminated with Gaussian noise at different levels and on the monthly observed chaotic sunspot sequence. The results show that the proposed method is effective in denoising chaotic signals. Moreover, it can correct the center point in the phase space effectively, which brings the reconstruction closer to the real track of the chaotic attractor. (paper)

  2. Linear and nonlinear determinants of the performance of informal venture capitalists’ investments. An empirical analysis

    Directory of Open Access Journals (Sweden)

    Vincenzo Capizzi

    2013-05-01

    Full Text Available This paper is aimed at identifying and analyzing the contribution of the major drivers of the performance of informal venture capitalists' investments. This study analyzes data on Italian transactions and personal features of Italian Business Angels gathered during 2007-2011 with the support of IBAN (Italian Business Angels Network). The econometric analysis investigates the returns of business angels' investments and their major determinants (industry, exit strategy, experience, holding period, rejection rate, and year of divestiture). The major results are the following: (1) differently from the previous literature, the relationship between Experience and IRR is quadratic and significant; (2) for the first time, quantitative data confirm that short holding periods (below 3 years) earn a lower IRR; (3) the relationship between the Rejection rate and IRR is logarithmic, and the impact is positive and significant. Finally, the outcomes of the empirical analysis performed in this study allow identifying new and concrete insights on possible policy interventions.

  3. Updating an empirical analysis on the proton’s central opacity and asymptotia

    International Nuclear Information System (INIS)

    Fagundes, D A; Menon, M J; Silva, P V R G

    2016-01-01

    We present an updated empirical analysis on the ratio of the elastic (integrated) to the total cross section in the c.m. energy interval from 5 GeV to 8 TeV. As in a previous work, we use a suitable analytical parametrization for that ratio (depending on only four free fit parameters) and investigate three asymptotic scenarios: either the black disk limit or scenarios above or below that limit. The dataset includes now the datum at 7 TeV, recently reported by the ATLAS Collaboration. Our analysis favors, once more, a scenario below the black disk, providing an asymptotic ratio consistent with the rational value 1/3, namely a gray disk limit. Upper bounds for the ratio of the diffractive (dissociative) to the inelastic cross section are also presented. (paper)

  4. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis; namely our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes... of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both... instantaneous and convolutive mixing, and the inferred temporal patterns. Spatial maps are seen to capture smooth and localized stimuli-related components, and often identifiable noise components. The implementation is freely available as a GUI/SPM plugin, and we recommend using GPICA as an additional tool when...

  5. The Impact of Tourism on Economic Growth in the Western Balkan Countries: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Prof. Dr Nasir Selimi

    2017-06-01

    Full Text Available Purpose: The purpose of this research paper is to empirically analyse the effects of tourism on economic growth in the Western Balkan countries (Albania, Bosnia and Herzegovina, Croatia, FYROM, Montenegro and Serbia). Design/Methodology/Approach: The empirical analysis consists of 17-year panel data for 6 countries over the period 1998 to 2014. Several models are analysed using panel regression econometric techniques. The study investigates random and fixed effects, as well as individual heterogeneity across those countries. Also, the Hausman-Taylor IV estimator is used as the most appropriate model for this analysis. The real income per capita of the sample countries is modelled as dependent on the lagged income per capita, tourist arrivals, tourism receipts, FDI stock, exports and government expenditures. Findings: The estimation results in all types of models indicate that tourism has a positive and significant impact on economic growth in the Western Balkan countries. The Hausman-Taylor IV model suggests that for every 1% increase in tourist arrivals, output will increase by approximately 0.08%. Research limitations/implications: Although the Hausman-Taylor IV model performs well, the results should be interpreted with caution. The analysis has its limitations: firstly, the total number of observations is relatively small for a panel regression analysis; secondly, the problem of endogeneity is not completely avoided. However, the study implies that these countries should enhance efforts for joint tourism sector policies to engender economic sustainability. Originality/Value: To our best knowledge, this is the first attempt to estimate the effects of tourism on economic growth in the Western Balkan countries using the Hausman-Taylor IV model.

  6. Bayesian Nonparametric Clustering for Positive Definite Matrices.

    Science.gov (United States)

    Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2016-05-01

    Symmetric Positive Definite (SPD) matrices emerge as data descriptors in several applications of computer vision such as object tracking, texture recognition, and diffusion tensor imaging. Clustering these data matrices forms an integral part of these applications, for which soft-clustering algorithms (K-Means, expectation maximization, etc.) are generally used. As is well known, these algorithms need the number of clusters to be specified, which is difficult when the dataset scales. To address this issue, we resort to the classical nonparametric Bayesian framework by modeling the data as a mixture model using the Dirichlet process (DP) prior. Since these matrices do not conform to Euclidean geometry but rather belong to a curved Riemannian manifold, existing DP models cannot be directly applied. Thus, in this paper, we propose a novel DP mixture model framework for SPD matrices. Using the log-determinant divergence as the underlying dissimilarity measure to compare these matrices, and further using the connection between this measure and the Wishart distribution, we derive a novel DPM model based on the Wishart-Inverse-Wishart conjugate pair. We apply this model to several applications in computer vision. Our experiments demonstrate that our model is scalable to the dataset size and at the same time achieves superior accuracy compared to several state-of-the-art parametric and nonparametric clustering algorithms.
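
    The DP prior's key property — that the number of clusters is inferred rather than fixed in advance — can be seen in a minimal Chinese restaurant process draw (a didactic sketch of the clustering prior only, not the paper's Wishart-based sampler; all names are ours):

```python
import random

def crp_partition(n, alpha, rng):
    """Draw a random partition of n items from the Chinese restaurant
    process with concentration alpha (the prior behind a DP mixture)."""
    sizes = []    # current cluster sizes
    labels = []   # cluster assignment of each item
    for i in range(n):
        # item i starts a new cluster with probability alpha / (i + alpha),
        # otherwise joins an existing cluster with probability proportional to its size
        weights = sizes + [alpha]
        r = rng.random() * (i + alpha)
        acc = 0.0
        for k, w in enumerate(weights):
            acc += w
            if r < acc:
                break
        if k == len(sizes):
            sizes.append(1)      # open a new cluster
        else:
            sizes[k] += 1        # join cluster k
        labels.append(k)
    return labels, sizes

rng = random.Random(7)
labels, sizes = crp_partition(200, alpha=1.0, rng=rng)
```

    Running the draw for different `n` shows the number of clusters growing slowly (roughly logarithmically) with the data size, which is what lets a DP mixture sidestep choosing the cluster count up front.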

  7. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
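
    The idea of a reduced feature-based representation can be sketched as follows (two illustrative features of our own choosing, standing in for the library of over 9000 operations): series with similar dynamics end up close together in feature space.

```python
import numpy as np

def features(x):
    """Tiny illustrative feature vector: lag-1 autocorrelation and the
    fraction of spectral power in the lowest tenth of the frequency bins."""
    x = (x - x.mean()) / x.std()
    ac1 = np.mean(x[:-1] * x[1:])
    power = np.abs(np.fft.rfft(x)) ** 2
    lowfreq = power[1:len(power) // 10].sum() / power[1:].sum()
    return np.array([ac1, lowfreq])

rng = np.random.default_rng(3)
n = 500
sine_a = np.sin(2 * np.pi * np.arange(n) / 50) + 0.2 * rng.standard_normal(n)
sine_b = np.sin(2 * np.pi * np.arange(n) / 50 + 1.0) + 0.2 * rng.standard_normal(n)
noise = rng.standard_normal(n)

fa, fb, fn = features(sine_a), features(sine_b), features(noise)
d_ab = np.linalg.norm(fa - fb)   # two noisy oscillations: close in feature space
d_an = np.linalg.norm(fa - fn)   # oscillation vs white noise: far apart
```

    Scaling the same idea to thousands of features and tens of thousands of series is what lets the paper organize datasets automatically and retrieve alternative analysis methods by behavioural similarity.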

  8. Failure mode and effects analysis: an empirical comparison of failure mode scoring procedures.

    Science.gov (United States)

    Ashley, Laura; Armitage, Gerry

    2010-12-01

    To empirically compare 2 different commonly used failure mode and effects analysis (FMEA) scoring procedures with respect to their resultant failure mode scores and prioritization: a mathematical procedure, where scores are assigned independently by FMEA team members and averaged, and a consensus procedure, where scores are agreed on by the FMEA team via discussion. A multidisciplinary team undertook a Healthcare FMEA of chemotherapy administration. This included mapping the chemotherapy process, identifying and scoring failure modes (potential errors) for each process step, and generating remedial strategies to counteract them. Failure modes were scored using both an independent mathematical procedure and a team consensus procedure. Almost three-fifths of the 30 failure modes generated were scored differently by the 2 procedures, and for just more than one-third of cases, the score discrepancy was substantial. Using the Healthcare FMEA prioritization cutoff score, almost twice as many failure modes were prioritized by the consensus procedure than by the mathematical procedure. This is the first study to empirically demonstrate that different FMEA scoring procedures can score and prioritize failure modes differently. It found considerable variability in individual team members' opinions on scores, which highlights the subjective and qualitative nature of failure mode scoring. A consensus scoring procedure may be most appropriate for FMEA as it allows variability in individuals' scores and rationales to become apparent and to be discussed and resolved by the team. It may also yield team learning and communication benefits unlikely to result from a mathematical procedure.
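
    The two scoring procedures can be contrasted with a toy example (all ratings hypothetical; Healthcare FMEA rates severity and probability on 1-4 scales, and a hazard score of 8 or more is the commonly cited prioritization threshold):

```python
def hazard(severity, probability):
    # HFMEA hazard score: severity x probability, each rated 1-4
    return severity * probability

# hypothetical ratings from four team members for one failure mode
individual = [(3, 2), (2, 2), (3, 3), (2, 1)]

# mathematical procedure: score independently, then average
mathematical = sum(hazard(s, p) for s, p in individual) / len(individual)

# consensus procedure: a single score the team agreed on in discussion
consensus = hazard(3, 3)

CUTOFF = 8  # commonly cited HFMEA prioritization threshold
prioritized_by_math = mathematical >= CUTOFF
prioritized_by_consensus = consensus >= CUTOFF
```

    Here the averaged score (5.25) falls below the cutoff while the consensus score (9) exceeds it, so the same failure mode is prioritized under one procedure but not the other — the kind of discrepancy the study observed for roughly a third of its failure modes.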

  9. Identifying ideal brow vector position: empirical analysis of three brow archetypes.

    Science.gov (United States)

    Hamamoto, Ashley A; Liu, Tiffany W; Wong, Brian J

    2013-02-01

    Surgical browlifts counteract the effects of aging, correct ptosis, and optimize forehead aesthetics. While surgeons have control over brow shape, the metrics defining ideal brow shape are subjective. This study aims to empirically determine whether three expert brow design strategies are aesthetically equivalent by using expert focus group analysis and relating these findings to brow surgery. A comprehensive literature search identified three dominant brow design methods (Westmore, Lamas and Anastasia) that are heavily cited, referenced or internationally recognized in either the medical literature or by the lay media. Using their respective guidelines, brow shape was modified for 10 synthetic female faces, yielding 30 images. A focus group of 50 professional makeup artists ranked the three images for each of the 10 faces to generate ordinal attractiveness scores. The contemporary methods employed by Anastasia and Lamas produce a brow arch more lateral than Westmore's classic method. Although the more laterally located brow arch is considered the current trend in facial aesthetics, this style was not empirically supported. No single method was consistently rated most or least attractive by the focus group, and no significant difference in attractiveness score between the methods was observed (p = 0.2454). Although each method of brow placement has been promoted as the "best" approach, no single brow design method achieved statistical significance in optimizing attractiveness. Each can be used effectively as a guide in designing eyebrow shape during browlift procedures, making it possible to use the three methods interchangeably.

  10. Ancient DNA analysis suggests negligible impact of the Wari Empire expansion in Peru's central coast during the Middle Horizon

    OpenAIRE

    Valverde, G.; Romero, M.; Espinoza, I.; Cooper, A.; Fehren-Schmitz, L.; Llamas, B.; Haak, W.

    2016-01-01

    The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650–1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, wh...

  11. Nonparametric Bayesian density estimation on manifolds with applications to planar shapes.

    Science.gov (United States)

    Bhattacharya, Abhishek; Dunson, David B

    2010-12-01

    Statistical analysis on landmark-based shape spaces has diverse applications in morphometrics, medical diagnostics, machine vision and other areas. These shape spaces are non-Euclidean quotient manifolds. To conduct nonparametric inferences, one may define notions of centre and spread on this manifold and work with their estimates. However, it is useful to consider full likelihood-based methods, which allow nonparametric estimation of the probability density. This article proposes a broad class of mixture models constructed using suitable kernels on a general compact metric space and then on the planar shape space in particular. Following a Bayesian approach with a nonparametric prior on the mixing distribution, conditions are obtained under which the Kullback-Leibler property holds, implying large support and weak posterior consistency. Gibbs sampling methods are developed for posterior computation, and the methods are applied to problems in density estimation and classification with shape-based predictors. Simulation studies show improved estimation performance relative to existing approaches.

  12. NONPARAMETRIC FIXED EFFECT PANEL DATA MODELS: RELATIONSHIP BETWEEN AIR POLLUTION AND INCOME FOR TURKEY

    Directory of Open Access Journals (Sweden)

    Rabia Ece OMAY

    2013-06-01

    Full Text Available In this study, the relationship between gross domestic product (GDP) per capita and sulfur dioxide (SO2) and particulate matter (PM10) per capita is modeled for Turkey. Nonparametric fixed effect panel data analysis is used for the modeling. The panel data cover 12 territories, at the first level of the Nomenclature of Territorial Units for Statistics (NUTS), for the period 1990-2001. In modeling the relationship between GDP and SO2 and PM10 for Turkey, the nonparametric models have given good results.
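
    The core of a nonparametric fixed-effect specification can be sketched in numpy (an illustrative toy, not the authors' estimator): remove group-specific fixed effects by within-group demeaning, then smooth the remaining nonlinear relationship with a Nadaraya-Watson kernel regression. All variable names and the simulated data are our own.

```python
import numpy as np

def nw_smooth(x, y, grid, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(1)
# two "territories" with different fixed effects around a common nonlinear curve
x = rng.uniform(0, 1, 400)
region = rng.integers(0, 2, 400)
fixed_effect = np.where(region == 0, 0.5, -0.5)
y = np.sin(2 * np.pi * x) + fixed_effect + 0.1 * rng.standard_normal(400)

# step 1: sweep out the fixed effects by demeaning within each region
y_dm = y.copy()
for g in (0, 1):
    y_dm[region == g] -= y[region == g].mean()

# step 2: estimate the common curve nonparametrically
grid = np.linspace(0.05, 0.95, 19)
fit = nw_smooth(x, y_dm, grid, h=0.05)
```

    After demeaning, the kernel fit tracks the common sine-shaped relationship rather than the region-specific levels — the same logic the paper applies to the income-pollution relationship across NUTS territories.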

  13. Empirical Analysis and Characterization of Indoor GPS Signal Fading and Multipath Conditions

    DEFF Research Database (Denmark)

    Blunck, Henrik; Kjærgaard, Mikkel Baun; Godsk, Torben

    2009-01-01

    of earlier measurement campaigns to characterize GNSS signal conditions in indoor environments have been published prominently in the GNSS research literature, see, e.g. [1,2,3]. To allow for in-depth signal analysis, these campaigns use a variety of measuring machinery such as channel sounders, mobile... signal generators and spectrum analyzers. Furthermore, the use-case-specific usability of GPS as an indoor positioning technology has been evaluated empirically on a higher level, see, e.g. [4]. In this paper we present results of a measurement campaign designed to characterize indoor GNSS signal... conditions. The work presented can therefore be seen as complementary to the campaigns mentioned above. As the focus of our work lies on the real-world usability of current GNSS technology for indoor use, we employ in our measurement campaign foremost commercial receivers with features typical for the use cases...

  14. Competencies in Higher Education System: an Empirical Analysis of Employers' Perceptions

    Directory of Open Access Journals (Sweden)

    Adela Deaconu

    2014-08-01

    Full Text Available This study offers insight into the European Qualifications Framework (EQF), as agreed and detailed by the Romanian qualifications framework, applied to the economic sector. By means of a survey conducted on 92 employing companies, it validates the importance of competencies for the Romanian labour market and employers' degree of satisfaction with the competencies of business graduates. In terms of typology, employers attach more importance to transversal competencies than to professional competencies, both at the conceptual level and as the degree of acquirement following higher education. The empirical analysis provides data on employers' ranking of transversal and professional competencies and identifies the classes of competencies deemed to be in short supply on the labour market. Through its results, the study enhances the relationship between the higher education system and the labour market, providing key information for an efficient implementation of the competence-based education system.

  15. Empirical Analysis of Retirement Pension and IFRS Adoption Effects on Accounting Information: Glance at IT Industry

    Directory of Open Access Journals (Sweden)

    JeongYeon Kim

    2014-01-01

    Full Text Available This study reviews new pension accounting with K-IFRS and provides empirical changes in liability for retirement allowances with the adoption of K-IFRS. It will help to understand the effect of pension accounting on an individual firm's financial report and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to the previous financial report not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in a big change in retirement-related liability. Firms within the IT industry also have similar behaviors, which means that additional financial regulations for pension accounting are recommended.

  16. Empirical Analysis of Intonation Activities in EFL Student’s Books

    Directory of Open Access Journals (Sweden)

    Dušan Nikolić

    2018-05-01

    Full Text Available Intonation instruction has repeatedly proved a challenge for EFL teachers, who avoid getting involved in intonation teaching more than their EFL textbooks demand from them. Since a great number of teachers rely on EFL textbooks when implementing intonation practice, the intonation activities in EFL materials are often central to their classroom. Even though the research on intonation instruction has been well-documented, few papers have explored intonation activities in EFL materials. The present study thus provides an empirical analysis of intonation activities in five EFL student’s books series by exploring the overall coverage of intonation activities across the series and the quality of these activities. The results reveal that intonation activities are underrepresented in the EFL student’s books, and that discourse intonation deserves more attention in the activities. Considerations for EFL teachers and publishers are also discussed.

  17. An empirical analysis on logistics performance and the global competitiveness

    Directory of Open Access Journals (Sweden)

    Turkay Yildiz

    2017-05-01

    Full Text Available Logistics has been identified as an area in which to build cost and service advantages. Therefore, companies are more focused on customer needs and are trying to find ways to reduce costs, improve quality and meet the growing expectations of their clients. Indeed, global competition has led managers to begin to address the issue of providing more efficient logistics services. In this regard, this paper presents an empirical study on logistics performance and global competitiveness, and identifies the associations between the two. The results indicate that some variables in the global competitiveness scores contribute far more to logistics performance than the others; these high-contribution variables are identified.

  18. Niche Overlap and Discrediting Acts: An Empirical Analysis of Informing in Hollywood

    Directory of Open Access Journals (Sweden)

    Giacomo Negro

    2015-06-01

    Full Text Available This article examines informing on others as a discrediting act between individual agents in a labor market. We conduct an empirical analysis of artists called to testify during the 1950s Congressional hearings into Communism in Hollywood, and multi-level regression models reveal that the odds of an artist informing on another increase when their career histories are more similar. The similarity reflects levels of niche overlap in the labor market. The finding that similarity contributes to discredit in the context of resource competition is compatible with a social comparison process, whereby uncertainty about performance leads more similar people to attend to and exclude one another to a greater extent.

  19. Empirical mode decomposition and Hilbert transforms for analysis of oil-film interferograms

    International Nuclear Information System (INIS)

    Chauhan, Kapil; Ng, Henry C H; Marusic, Ivan

    2010-01-01

    Oil-film interferometry is rapidly becoming the preferred method for direct measurement of wall shear stress in studies of wall-bounded turbulent flows. Although widely accepted as the most accurate technique, it does have inherent measurement uncertainties, one of which is associated with determining the fringe spacing; this is the focus of this paper. Conventional analysis methods involve a certain level of user input and thus some subjectivity. In this paper, we consider empirical mode decomposition (EMD) and the Hilbert transform as an alternative tool for analyzing oil-film interferograms. In contrast to the commonly used Fourier-based techniques, this new method is less subjective and, as it is based on the Hilbert transform, is superior for treating amplitude- and frequency-modulated data. This makes it particularly robust to wide differences in the quality of interferograms.

  20. Empirical analysis of retirement pension and IFRS adoption effects on accounting information: glance at IT industry.

    Science.gov (United States)

    Kim, JeongYeon

    2014-01-01

    This study reviews new pension accounting under K-IFRS and provides empirical evidence of changes in the liability for retirement allowances following the adoption of K-IFRS. It helps in understanding the effect of pension accounting on an individual firm's financial report and the importance of the public announcement of actuarial assumptions. Firms that adopted K-IFRS showed various changes in retirement liability compared with previous financial reports not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in a big change in retirement-related liability. Firms within the IT industry also show similar behavior, which suggests that additional financial regulations for pension accounting are warranted.

  1. Empirical Analysis Concerning the Correlation Fiscality Rate – Tax Incomes in Romania

    Directory of Open Access Journals (Sweden)

    Raluca Drãcea

    2009-08-01

    Full Text Available The specialized literature reviews taxation from all points of view, and the question raised by analysts over the last decade is: what is the optimum level of taxation? The difficulty in answering this question lies in opposing interests: the State wants a high level of taxation owing to the increasing trend of public expenses, while taxpayers want a low level in order to benefit from greater financial funds. Starting from the Laffer theory, the objective of this paper is an empirical analysis of the correlation between the fiscality rate and tax incomes in Romania, using the Matlab program and SPSS software. The paper is structured in three parts: the first part reviews the specialized literature, the second part describes the research methodology, and the third part comprises results and discussions. The paper ends with conclusions.

  2. An Empirical Analysis of Economic and Socio-Demographic Determinants of Entrepreneurship Across German Regions

    Directory of Open Access Journals (Sweden)

    Mrożewski Matthias

    2014-11-01

    Full Text Available Entrepreneurship is fundamental for a country's economic development through its positive effect on innovation, productivity growth, and job creation. In entrepreneurial research, one of the most important problems is to define the factors that actually determine entrepreneurial action. This study analyzes that question in the case of Germany by taking an aggregated approach that focuses on socio-demographic and economic determinants of regional entrepreneurship. Based on a literature review of German and international regional-level research, six hypotheses are developed and empirically tested using the most recent available data on 385 German regions as units of analysis. The results are surprising. In the case of household income, unemployment, education and marital status the relationship is significant but contrary to earlier research. Only regional age structure seems to be a stable predictor of regional entrepreneurship. The results indicate that in recent years there was a major shift in the determinants and characteristics of entrepreneurship in Germany.

  3. Energy Taxes as a Signaling Device: An Empirical Analysis of Consumer Preferences

    International Nuclear Information System (INIS)

    Ghalwash, Tarek

    2004-01-01

    This paper presents an econometric study dealing with household demand in Sweden. The main objective is to empirically examine the differences in consumer reaction to the introduction of, or the change, in environmental taxes. Main focus is on environmental taxes as a signaling device. The hypothesis is that the introduction of an environmental tax provides new information about the properties of the directly taxed goods. This in turn may affect consumer preferences for these goods, hence altering the consumption choice. The result from the econometric analysis shows that all goods have negative own-price elasticities, and positive income elasticities. Concerning the signalling effect of environmental taxes the results are somewhat ambiguous. The tax elasticity for energy goods used for heating seems to be significantly higher than the traditional price elasticity, whereas the opposite seems to be the case for energy goods used for transportation

  4. Energy taxes as a signaling device: An empirical analysis of consumer preferences

    International Nuclear Information System (INIS)

    Ghalwash, Tarek

    2007-01-01

    This paper presents an econometric study dealing with household demand in Sweden. The main objective is to empirically examine the differences in consumer reaction to the introduction of, or the change, in environmental taxes. Main focus is on environmental taxes as a signaling device. The hypothesis is that the introduction of an environmental tax provides new information about the properties of the directly taxed goods. This in turn may affect consumer preferences for these goods, hence altering the consumption choice. The result from the econometric analysis shows that all goods have negative own-price elasticities, and positive income elasticities. Concerning the signalling effect of environmental taxes the results are somewhat ambiguous. The tax elasticity for energy goods used for heating seems to be significantly higher than the traditional price elasticity, whereas the opposite seems to be the case for energy goods used for transportation

  5. A Meta-Analysis of Empirically Tested School-Based Dating Violence Prevention Programs

    Directory of Open Access Journals (Sweden)

    Sarah R. Edwards

    2014-05-01

    Full Text Available Teen dating violence prevention programs implemented in schools and empirically tested were subjected to meta-analysis. Eight studies met the criteria for inclusion, comprising both within-subjects and between-subjects designs. Overall, the weighted mean effect size (ES) across studies was significant, ESr = .11; 95% confidence interval (CI) = [.08, .15], p < .0001, showing an overall positive effect of the studied prevention programs. However, 25% of the studies showed an effect in the negative direction, meaning students appeared to be more supportive of dating violence after participating in a dating violence prevention program. This heightens the need for thorough program evaluation, as well as the need for decision makers to have access to data about the effectiveness of programs they are considering implementing. Further implications of the results and recommendations for future research are discussed.
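
    The weighted mean effect size reported above is the standard inverse-variance summary of a fixed-effect meta-analysis. A minimal sketch of that computation, using hypothetical effect sizes and variances rather than the eight studies analysed here:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted mean effect size with a 95% CI
    (fixed-effect model; illustrative values, not the study's data)."""
    weights = [1.0 / v for v in variances]
    mean_es = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                 # standard error of the pooled ES
    ci = (mean_es - 1.96 * se, mean_es + 1.96 * se)
    return mean_es, ci

# Hypothetical correlation-type effect sizes from 8 studies
effects = [0.15, 0.08, 0.12, 0.20, -0.05, 0.10, 0.18, -0.02]
variances = [0.004, 0.003, 0.005, 0.006, 0.004, 0.002, 0.005, 0.003]
es, ci = fixed_effect_meta(effects, variances)
```

    Weights are the reciprocals of the sampling variances, so more precise studies dominate the pooled estimate.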

  6. Windfall profit in portfolio diversification? An empirical analysis of the potential benefits of renewable energy investments

    Energy Technology Data Exchange (ETDEWEB)

    Bruns, Frederik

    2013-05-01

    Modern Portfolio Theory, introduced by Markowitz, suggests building a portfolio with assets that have low or, in the best case, negative correlation. In times of financial crisis, however, the positive diversification effect of a portfolio can fail when Traditional Assets are highly correlated. Therefore, many investors search for Alternative Asset classes, such as Renewable Energies, that tend to perform independently of capital market performance. 'Windfall Profit in Portfolio Diversification?' discusses the potential role of Renewable Energy investments in an institutional investor's portfolio by applying the main concepts from Modern Portfolio Theory. The empirical analysis uses a unique data set from one of the largest institutional investors in the field of Renewable Energies, including several wind and solar parks. The study received the Science Award 2012 of the German Alternative Investments Association ('Bundesverband Alternative Investments e.V.').

  7. A local non-parametric model for trade sign inference

    Science.gov (United States)

    Blazejewski, Adam; Coggins, Richard

    2005-03-01

    We investigate a regularity in market order submission strategies for 12 stocks with large market capitalization on the Australian Stock Exchange. The regularity is evidenced by a predictable relationship between the trade sign (trade initiator), size of the trade, and the contents of the limit order book before the trade. We demonstrate this predictability by developing an empirical inference model to classify trades into buyer-initiated and seller-initiated. The model employs a local non-parametric method, k-nearest neighbor, which in the past was used successfully for chaotic time series prediction. The k-nearest neighbor with three predictor variables achieves an average out-of-sample classification accuracy of 71.40%, compared to 63.32% for the linear logistic regression with seven predictor variables. The result suggests that a non-linear approach may produce a more parsimonious trade sign inference model with a higher out-of-sample classification accuracy. Furthermore, for most of our stocks the observed regularity in market order submissions seems to have a memory of at least 30 trading days.
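
    The k-nearest-neighbour classifier at the heart of the model is simple to sketch. The predictors and labels below are synthetic stand-ins for the paper's order-book variables (trade size and limit order book contents), not its actual data:

```python
import numpy as np

def knn_classify(train_X, train_y, query, k=5):
    """Classify a trade as buyer-initiated (+1) or seller-initiated (-1)
    by majority vote among the k nearest training trades."""
    dists = np.linalg.norm(train_X - query, axis=1)   # Euclidean distances
    nearest = train_y[np.argsort(dists)[:k]]          # labels of the k closest
    return 1 if nearest.sum() >= 0 else -1

# Toy three-variable predictors with synthetic labels
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
pred = knn_classify(X, y, np.array([1.0, 0.5, 0.0]), k=7)
```

    In practice the predictors would be normalised to comparable scales before computing distances, since k-NN is sensitive to units.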

  8. Empirical evidence about inconsistency among studies in a pair‐wise meta‐analysis

    Science.gov (United States)

    Turner, Rebecca M.; Higgins, Julian P. T.

    2015-01-01

    This paper investigates how inconsistency (as measured by the I² statistic) among studies in a meta‐analysis may differ, according to the type of outcome data and effect measure. We used hierarchical models to analyse data from 3873 binary, 5132 continuous and 880 mixed outcome meta‐analyses within the Cochrane Database of Systematic Reviews. Predictive distributions for the inconsistency expected in future meta‐analyses were obtained, which can inform priors for between‐study variance. Inconsistency estimates were highest on average for binary outcome meta‐analyses of risk differences and continuous outcome meta‐analyses. For a planned binary outcome meta‐analysis in a general research setting, the predictive distribution for inconsistency among log odds ratios had median 22% and 95% CI: 12% to 39%. For a continuous outcome meta‐analysis, the predictive distribution for inconsistency among standardized mean differences had median 40% and 95% CI: 15% to 73%. Levels of inconsistency were similar for binary data measured by log odds ratios and log relative risks. Fitted distributions for inconsistency expected in continuous outcome meta‐analyses using mean differences were almost identical to those using standardized mean differences. The empirical evidence on inconsistency gives guidance on which outcome measures are most likely to be consistent in particular circumstances and facilitates Bayesian meta‐analysis with an informative prior for heterogeneity. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd. PMID:26679486
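
    The I² statistic is derived from Cochran's Q, the weighted sum of squared deviations of study effects from the pooled effect. A self-contained sketch with hypothetical log odds ratios (not the Cochrane data used in the paper):

```python
def i_squared(effects, variances):
    """Cochran's Q and the I^2 inconsistency statistic (in percent)
    for a single meta-analysis."""
    w = [1.0 / v for v in variances]                       # inverse-variance weights
    mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - mean) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I^2 = 100% * (Q - df) / Q, truncated at zero
    return max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0

# Hypothetical log odds ratios and their variances from 6 studies
i2 = i_squared([0.4, 0.1, 0.6, -0.2, 0.3, 0.5],
               [0.04, 0.05, 0.03, 0.06, 0.04, 0.05])
```

    Under homogeneity Q is expected to equal its degrees of freedom, so I² estimates the share of total variability attributable to between-study heterogeneity rather than chance.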

  9. Network Analysis Approach to Stroke Care and Assistance Provision: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Szczygiel Nina

    2017-06-01

    Full Text Available The aim of this study is to model and analyse stroke care and assistance provision in the Portuguese context from the network perspective. We used network theory as the theoretical foundation for the study. The model proposed by Frey et al. (2006) was used to elicit and comprehend possible interactions and relations between organisations expected to be involved in the provision of care and assistance to stroke patients in their pathway to rehabilitation. Providers were identified and contacted to evaluate the nature and intensity of relationships. Network analysis was performed with the NodeXL software package. Analysis of 509 entities based on about 260 000 entries indicates that stroke care provision in the evaluated context is best captured in the coalition-collaboration setting, which appears to best demonstrate the character of the network. Information from the analysis of the collaboration stage was not sufficient to determine the network dynamics. The study applies network theory to understand the interorganisational dynamics of a complex health care context, empirically validates the model proposed by Frey et al. (2006) in terms of its operationalisation and the way it reflects the practical context, and examines interorganisational relationships and their contribution to the management of a compound health care context involving actors from various sectors.

  10. On Parametric (and Non-Parametric) Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles-and-Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  11. Nonparametric predictive pairwise comparison with competing risks

    International Nuclear Information System (INIS)

    Coolen-Maturi, Tahani

    2014-01-01

    In reliability, failure data often correspond to competing risks, where several failure modes can cause a unit to fail. This paper presents nonparametric predictive inference (NPI) for pairwise comparison with competing risks data, assuming that the failure modes are independent. These failure modes could be the same or different among the two groups, and these can be both observed and unobserved failure modes. NPI is a statistical approach based on few assumptions, with inferences strongly based on data and with uncertainty quantified via lower and upper probabilities. The focus is on the lower and upper probabilities for the event that the lifetime of a future unit from one group, say Y, is greater than the lifetime of a future unit from the second group, say X. The paper also shows how the two groups can be compared based on particular failure mode(s), and the comparison of the two groups when some of the competing risks are combined is discussed

  12. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.

    2012-12-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal assumptions regarding the form of the distribution functions of X and Y. We discuss an approach to the estimation problem that is based on asymptotic likelihood considerations. Our results enable us to provide a methodology that can be implemented easily and which yields estimators that are often near optimal when compared to fully parametric methods. We evaluate the performance of the estimators in a series of Monte Carlo simulations. © 2012 Elsevier B.V. All rights reserved.
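
    A crude, easily implemented stand-in for such location-scale estimation (not the authors' asymptotic-likelihood estimator) matches robust quantiles of the two samples: the ratio of interquartile ranges estimates σ and the medians then pin down μ:

```python
import statistics

def location_scale_fit(x, y):
    """Estimate mu, sigma such that Y ~ mu + sigma * X, by matching
    medians and interquartile ranges (a simple quantile-based sketch)."""
    def iqr(v):
        q = statistics.quantiles(v, n=4)   # [Q1, Q2, Q3]
        return q[2] - q[0]
    sigma = iqr(y) / iqr(x)
    mu = statistics.median(y) - sigma * statistics.median(x)
    return mu, sigma

x = [0.1, -0.5, 1.2, 0.3, -1.1, 0.8, 0.0, -0.2, 1.5, -0.9]
y = [2.0 + 3.0 * v for v in x]     # an exact location-scale pair: mu=2, sigma=3
mu, sigma = location_scale_fit(x, y)
```

    Because quantiles are affine-equivariant, the toy example above is recovered exactly; on noisy samples from the same family the estimates are consistent but less efficient than likelihood-based methods.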

  13. Comparative empirical analysis of temporal relationships between construction investment and economic growth in the United States

    Directory of Open Access Journals (Sweden)

    Navid Ahmadi

    2017-09-01

    Full Text Available The majority of policymakers believe that investments in construction infrastructure would boost the economy of the United States (U.S.). They also assume that construction investment in infrastructure has a similar impact on the economies of different U.S. states. In contrast, there have been studies showing a negative impact of construction activities on the economy. However, there has not been any research attempt to empirically test the temporal relationships between construction investment and economic growth in the U.S. states, to determine the longitudinal impact of construction investment on the economy of each state. The objective of this study is to investigate whether Construction Value Added (CVA) is a leading (or lagging) indicator of real Gross Domestic Product (real GDP) for every individual state of the U.S., using empirical time series tests. The results of Granger causality tests showed that CVA is a leading indicator of state real GDP in 18 states and the District of Columbia; real GDP is a leading indicator of CVA in 10 states and the District of Columbia. There is a bidirectional relationship between CVA and real GDP in 5 states and the District of Columbia. In 8 states and the District of Columbia, not only do CVA and real GDP have leading/lagging relationships, but they are also cointegrated. These results highlight the important role of the construction industry in these states. The results also show that leading (or lagging) lengths vary for different states. The results of the comparative empirical analysis reject the hypothesis that CVA is a leading indicator of real GDP in the states with the highest shares of construction in real GDP. The findings of this research contribute to the state of knowledge by quantifying the temporal relationships between construction investment and economic growth in the U.S. states. It is expected that the results will help policymakers better understand the impact of construction investment.
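
    A Granger causality test of the kind used here boils down to an F-test comparing a restricted autoregression of one series on its own lags against an unrestricted model that adds lags of the other series. A minimal one-direction sketch on synthetic data (not the state-level CVA/GDP series):

```python
import numpy as np

def granger_f_test(x, y, lags=2):
    """F-statistic testing whether lagged x helps predict y beyond
    y's own lags (a bare-bones one-direction Granger test)."""
    n = len(y)
    rows, target = [], []
    for t in range(lags, n):
        rows.append([1.0] + [y[t - i] for i in range(1, lags + 1)]
                    + [x[t - i] for i in range(1, lags + 1)])
        target.append(y[t])
    X_full = np.array(rows)
    t_vec = np.array(target)
    X_rest = X_full[:, : lags + 1]          # restricted: intercept + own lags only
    def rss(M):
        beta, *_ = np.linalg.lstsq(M, t_vec, rcond=None)
        resid = t_vec - M @ beta
        return float(resid @ resid)
    rss_r, rss_f = rss(X_rest), rss(X_full)
    df1, df2 = lags, len(t_vec) - X_full.shape[1]
    return ((rss_r - rss_f) / df1) / (rss_f / df2)

# Synthetic series in which x clearly leads y
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = np.zeros(200)
for t in range(2, 200):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
f_stat = granger_f_test(x, y, lags=2)
```

    A large F-statistic relative to the F(df1, df2) critical value rejects the null that x does not Granger-cause y; running the test in both directions distinguishes leading, lagging, and bidirectional relationships.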

  14. Double-dividend analysis with SCREEN: an empirical study for Switzerland

    International Nuclear Information System (INIS)

    Frei, Christoph W.; Haldi, Pierre-Andre; Sarlos, Gerard

    2005-01-01

    This paper presents an empirical study that quantifies the effects of an ecological fiscal reform of the kind recently rejected by the Swiss population. The measure aims to encourage employment and, at the same time, to discourage excessive energy use and thereby decrease energy-induced external costs (CO2, etc.). The analysis is based on the model SCREEN, a general equilibrium model using the complementarity format for the hybrid description of economy-wide production possibilities, in which the electricity sector is represented by a bottom-up activity analysis and the other production sectors are characterised by top-down production functions. A dynamic formulation of the activity analysis of technologies allows for the reproduction of endogenous structural change (see Frei, C.W., Haldi, P.-A., Sarlos, G., 2003. Dynamic formulation of a top-down and bottom-up merging energy policy model. Energy Policy 31 (10), 1017-1031). The labour market is formulated according to microeconomically founded efficiency wages and calibrated for Switzerland. The study includes the development of a consistent set of top-down, bottom-up and labour data for Switzerland. The collection of bottom-up data on the electricity sector, just before liberalisation, was not easy; the data characterising the electricity sector were prepared from original statistics on about 140 Swiss electricity companies.

  15. A new approach for crude oil price analysis based on empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shou-Yang; Lai, K.K.

    2008-01-01

    The importance of understanding the underlying characteristics of international crude oil price movements attracts much attention from academic researchers and business practitioners. Due to the intrinsic complexity of the oil market, however, most of them fail to produce consistently good results. Empirical Mode Decomposition (EMD), recently proposed by Huang et al., appears to be a novel data analysis method for nonlinear and non-stationary time series. By decomposing a time series into a small number of independent and concretely implicational intrinsic modes based on scale separation, EMD explains the generation of time series data from a novel perspective. Ensemble EMD (EEMD) is a substantial improvement of EMD which can better separate the scales naturally by adding white noise series to the original time series and then treating the ensemble averages as the true intrinsic modes. In this paper, we extend EEMD to crude oil price analysis. First, three crude oil price series with different time ranges and frequencies are decomposed into several independent intrinsic modes, from high to low frequency. Second, the intrinsic modes are composed into a fluctuating process, a slowly varying part and a trend based on fine-to-coarse reconstruction. The economic meanings of the three components are identified as short term fluctuations caused by normal supply-demand disequilibrium or some other market activities, the effect of a shock of a significant event, and a long term trend. Finally, the EEMD is shown to be a vital technique for crude oil price analysis. (author)
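
    The core of EMD is the sifting step: subtract the mean of the upper and lower extrema envelopes until an intrinsic mode function remains. A bare-bones single sift on a synthetic two-scale signal, without EMD's stopping criteria or EEMD's noise ensemble, might look like:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(s, t):
    """One sifting step of EMD: subtract the mean of the upper and lower
    cubic-spline envelopes (a minimal sketch, no stopping criteria)."""
    maxima = [i for i in range(1, len(s) - 1) if s[i] > s[i-1] and s[i] > s[i+1]]
    minima = [i for i in range(1, len(s) - 1) if s[i] < s[i-1] and s[i] < s[i+1]]
    upper = CubicSpline(t[maxima], s[maxima])(t)   # envelope through maxima
    lower = CubicSpline(t[minima], s[minima])(t)   # envelope through minima
    return s - (upper + lower) / 2.0

t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)  # fast + slow
imf_candidate = sift_once(signal, t)   # mostly the fast oscillation remains
```

    Repeating the sift until the envelope mean is negligible yields the first intrinsic mode; subtracting it and repeating on the residue produces the remaining modes from high to low frequency, and EEMD averages such decompositions over many noise-perturbed copies of the signal.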

  16. Multivariate Empirical Mode Decomposition Based Signal Analysis and Efficient-Storage in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Lu [University of Tennessee, Knoxville (UTK); Albright, Austin P [ORNL; Rahimpour, Alireza [University of Tennessee, Knoxville (UTK); Guo, Jiandong [University of Tennessee, Knoxville (UTK); Qi, Hairong [University of Tennessee, Knoxville (UTK); Liu, Yilu [University of Tennessee (UTK) and Oak Ridge National Laboratory (ORNL)

    2017-01-01

    Wide-area-measurement systems (WAMSs) are used in smart grid systems to enable the efficient monitoring of grid dynamics. However, the overwhelming amount of data and the severe contamination from noise often impede the effective and efficient analysis and storage of WAMS-generated measurements. To solve this problem, we propose a novel framework that takes advantage of Multivariate Empirical Mode Decomposition (MEMD), a fully data-driven approach to analyzing non-stationary signals, dubbed MEMD based Signal Analysis (MSA). The frequency measurements are considered as a linear superposition of different oscillatory components and noise. The low-frequency components, corresponding to the long-term trend and inter-area oscillations, are grouped and compressed by MSA using the mean shift clustering algorithm. Higher-frequency components, mostly noise and potentially part of high-frequency inter-area oscillations, are analyzed using Hilbert spectral analysis and delineated by their statistical behavior. By conducting experiments on both synthetic and real-world data, we show that the proposed framework can capture key characteristics, such as trends and inter-area oscillations, while reducing the data storage requirements.
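
    The Hilbert spectral analysis applied to the higher-frequency components rests on the analytic signal: its magnitude gives instantaneous amplitude, and the derivative of its unwrapped phase gives instantaneous frequency. A minimal illustration on a synthetic 60 Hz mode (hypothetical sampling rate, not WAMS data):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)
component = np.sin(2 * np.pi * 60 * t)   # a single oscillatory mode

analytic = hilbert(component)            # analytic signal via Hilbert transform
amplitude = np.abs(analytic)             # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz
```

    Applied to each intrinsic mode from the decomposition, this yields the time-frequency-amplitude representation (the Hilbert spectrum) used to characterise oscillatory behavior.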

  17. Business ethics and economic growth: An empirical analysis for Turkish economy

    Directory of Open Access Journals (Sweden)

    Ekrem Erdem

    2015-12-01

    Full Text Available Purpose – The roots of the science of modern economics originate in the ideas of Adam Smith, who was not a pure economist but a moralist-philosopher. Basic concepts in the Wealth of Nations, which is perceived as the handbook of economics, depend on the arguments that Adam Smith puts forward in his Theory of Moral Sentiments. In this theory, business ethics, as a part of the Law of Sympathy, appears as one of the factors that allow the invisible hand to operate properly. In light of this property, it is possible to regard business ethics as one of the components of the market mechanism. In this context, this study aims to analyse the link between business ethics and economic growth in the Turkish economy. Design/methodology/approach – The study employs bounced cheques and protested bonds as proxies for the degradation of business ethics and tries to show how this degradation affected economic growth in the Turkish economy over the period 1988-2013. Findings – Both the illustrative and the empirical results show that business ethics is an important determinant of economic growth in the Turkish economy and that damaging it negatively affects the growth rate of the economy. Research limitations/implications – One of the main limitations of the present empirical analysis is the lack of longer and more varied data sets. Using different indicators of business ethics with a longer time span would definitely increase the reliability of the study. Even so, in its current form, the results imply that a policy capable of limiting failures of business ethics may boost the Turkish economy. Originality/value – The results tend to support the close link between business ethics and economic growth.

  18. Empirical Analysis for the Heat Exchange Effectiveness of a Thermoelectric Liquid Cooling and Heating Unit

    Directory of Open Access Journals (Sweden)

    Hansol Lim

    2018-03-01

    Full Text Available This study aims to estimate the performance of a thermoelectric module (TEM) heat pump for simultaneous liquid cooling and heating, and to propose empirical models for predicting the heat exchange effectiveness. Experiments were conducted to investigate and collect the performance data of the TEM heat pump, where the working fluid was water. A total of 57 sets of experimental data were statistically analyzed to estimate the effects of each independent variable on the heat exchange effectiveness using analysis of variance (ANOVA). To develop the empirical model, six design parameters were measured: the number of transfer units (NTU) of the heat exchangers (i.e., water blocks), the inlet water temperatures, and the temperatures of the water blocks at the cold and hot sides of the TEM. As a result, two polynomial equations predicting the heat exchange effectiveness at the cold and hot sides of the TEM heat pump were derived as functions of the six selected design parameters. The proposed models and the theoretical model of a conventional condenser and evaporator for heat exchange effectiveness were also compared with additional measurement data to validate the reliability of the proposed models. Consequently, two conclusions were drawn: (1) the feasibility of using the TEM heat pump for simultaneous cooling and heating was demonstrated, with a maximum temperature difference of 30 °C between the cold and hot sides of the TEM, and (2) the comparison between the proposed models and the theoretical model revealed that the TEM heat pump differs from the conventional evaporator and condenser owing to heat conduction and the Joule effect in the TEM.
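
    For the conventional condenser and evaporator used as the theoretical benchmark, one side of the exchanger stays at essentially constant temperature, so effectiveness follows the standard ε-NTU relation ε = 1 − exp(−NTU). A small sketch with illustrative NTU values (not the study's measurements):

```python
import math

def effectiveness_constant_temp(ntu):
    """Theoretical effectiveness of a heat exchanger whose one side is held
    at (nearly) constant temperature, as in an ideal condenser or
    evaporator: epsilon = 1 - exp(-NTU)."""
    return 1.0 - math.exp(-ntu)

# Effectiveness rises with NTU but with diminishing returns
eps_1 = effectiveness_constant_temp(1.0)   # roughly 0.632
eps_3 = effectiveness_constant_temp(3.0)   # roughly 0.950
```

    The study's point is that the TEM water blocks deviate from this idealised curve because of internal heat conduction and Joule heating, which is why empirical polynomial models were fitted instead.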

  19. The impact of organizational factors on e-business adoption: An empirical analysis

    Directory of Open Access Journals (Sweden)

    Marta García-Moreno

    2018-06-01

    Full Text Available Purpose: To provide empirical validation of the model developed by García Moreno et al. (2016) on the factors influencing the adoption of e-business in firms. Design/methodology/approach: Consideration is given to the method for measuring each of the variables included in the model. Use has been made of the e-Business Watch database, which contains measures for the theoretical model’s three categories: firm, technology, and environment. Multinomial logistic regression models have been estimated. Findings: The variables included reveal statistically significant relationships for the model in question, although the intensity of the relationships differs. The variables related to the environment also reveal statistically significant relationships, whereby the attitude of trading partners appears to have a relevant and growing impact on e-business adoption. Research limitations/implications: Data come from just one database, the e-Business Watch database; enriched data from alternative databases could be included. Practical implications: The infrastructure of information and communications technologies (ICTs) is confirmed to be a determining factor in e-business development. Nevertheless, competitor rivalry has a more erratic influence, encapsulated in a significant relationship in intermediate models, with a sharper increase in the likelihood of being in the category of customer-focused, rather than internally focused, firms. Social implications: The human capital linked to ICTs is a driving force behind the adoption of these practices. Albeit with a more moderate effect, note should also be taken of the capacity for entering into relationships with third parties within the scope of ICTs, with significant effects that become more robust when tested in models that seek to explain the probability of recording higher levels of e-business adoption. Originality/value: The article presents a first empirical analysis of the model proposed by García Moreno et al. (2016).

  20. Application of a latent class analysis to empirically define eating disorder phenotypes.

    Science.gov (United States)

    Keel, Pamela K; Fichter, Manfred; Quadflieg, Norbert; Bulik, Cynthia M; Baxter, Mark G; Thornton, Laura; Halmi, Katherine A; Kaplan, Allan S; Strober, Michael; Woodside, D Blake; Crow, Scott J; Mitchell, James E; Rotondo, Alessandro; Mauri, Mauro; Cassano, Giovanni; Treasure, Janet; Goldman, David; Berrettini, Wade H; Kaye, Walter H

    2004-02-01

    Diagnostic criteria for eating disorders influence how we recognize, research, and treat eating disorders, and empirically valid phenotypes are required for revealing their genetic bases. The objective was to empirically define eating disorder phenotypes. Data regarding eating disorder symptoms and features from 1179 individuals with clinically significant eating disorders were submitted to a latent class analysis, and the resulting latent classes were compared on non-eating disorder variables in a series of validation analyses. This was a multinational, collaborative study with cases ascertained through diverse clinical settings (inpatient, outpatient, and community). Participants were members of affected relative pairs recruited for participation in genetic studies of eating disorders, in which probands met DSM-IV-TR criteria for anorexia nervosa (AN) or bulimia nervosa and had at least 1 biological relative with a clinically significant eating disorder. The main outcome measure was the number and clinical characterization of latent classes. A 4-class solution provided the best fit. Latent class 1 (LC1) resembled restricting AN; LC2, AN and bulimia nervosa with the use of multiple methods of purging; LC3, restricting AN without obsessive-compulsive features; and LC4, bulimia nervosa with self-induced vomiting as the sole form of purging. Biological relatives were significantly more likely to belong to the same latent class. Across validation analyses, LC2 demonstrated the highest levels of psychological disturbance, and LC3 demonstrated the lowest. The presence of obsessive-compulsive features differentiates among individuals with restricting AN. Similarly, the combination of low weight and multiple methods of purging distinguishes among individuals with binge eating and purging behaviors. These results support some of the distinctions drawn within the DSM-IV-TR among eating disorder subtypes, while introducing new features to define phenotypes.
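
    A latent class model for binary symptom items is typically fitted by expectation-maximization over class prevalences and per-item endorsement probabilities. A compact sketch on synthetic two-class data (illustrative only, not the study's 1179 cases or its 4-class solution):

```python
import numpy as np

def lca_em(data, n_classes, n_iter=200, seed=0):
    """EM for a latent class model with binary items: each class c has a
    prevalence pi[c] and per-item endorsement probabilities p[c, j]."""
    rng = np.random.default_rng(seed)
    n, m = data.shape
    pi = np.full(n_classes, 1.0 / n_classes)
    p = rng.uniform(0.25, 0.75, size=(n_classes, m))   # random start
    for _ in range(n_iter):
        # E-step: posterior class membership for each respondent
        like = np.ones((n, n_classes))
        for c in range(n_classes):
            like[:, c] = pi[c] * np.prod(
                np.where(data == 1, p[c], 1 - p[c]), axis=1)
        post = like / like.sum(axis=1, keepdims=True)
        # M-step: update prevalences and item probabilities
        pi = post.mean(axis=0)
        p = (post.T @ data) / post.sum(axis=0)[:, None]
    return pi, p, post

# Synthetic data: class 0 endorses items rarely (0.2), class 1 often (0.8)
rng = np.random.default_rng(42)
z = rng.integers(0, 2, size=400)
probs = np.where(z[:, None] == 0, 0.2, 0.8)
items = (rng.random((400, 6)) < probs).astype(int)
pi, p, post = lca_em(items, n_classes=2)
```

    Model selection (e.g., the 4-class solution above) is usually done by refitting with different class counts and comparing fit indices such as BIC.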

  1. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  2. Spurious Seasonality Detection: A Non-Parametric Test Proposal

    Directory of Open Access Journals (Sweden)

    Aurelio F. Bariviera

    2018-01-01

    Full Text Available This paper offers a general and comprehensive definition of the day-of-the-week effect. Using symbolic dynamics, we develop a test based on ordinal patterns in order to detect it. The test uncovers the fact that the so-called “day-of-the-week” effect is partly an artifact of the hidden correlation structure of the data. We also present simulations based on artificial time series: while time series generated with long memory are prone to exhibit daily seasonality, pure white noise signals exhibit no pattern preference. Since ours is a non-parametric test, it requires no assumptions about the distribution of returns, making it a practical alternative to conventional econometric tests. We apply the proposed technique exhaustively to 83 stock indices around the world. Finally, the paper highlights the relevance of symbolic analysis in economic time series studies.
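The ordinal-pattern idea behind such a test can be sketched in a few lines (a minimal illustration, not the authors' implementation): each length-m window of returns is mapped to the permutation that sorts it, and under an i.i.d. null all m! patterns are equally likely, so systematic deviations in the pattern counts reveal hidden serial structure such as a day-of-the-week preference.

```python
from itertools import permutations

def ordinal_pattern(window):
    # The pattern is the permutation of indices that sorts the window.
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def pattern_counts(series, m=3):
    # Count how often each of the m! ordinal patterns occurs in
    # overlapping windows; under an i.i.d. null, counts are ~uniform.
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(series) - m + 1):
        counts[ordinal_pattern(series[i:i + m])] += 1
    return counts
```

A strictly increasing series, for example, concentrates all mass on the identity pattern (0, 1, 2), while white noise spreads the counts roughly evenly across all six length-3 patterns; a chi-squared-type statistic on these counts then serves as the test.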

  3. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.

    Science.gov (United States)

    García, Constantino A; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. Using Gaussian processes as priors permits working in a function-space view, so the inference takes place directly in that space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economics and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.
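The sparse-GP inference itself is too involved for a short sketch, but the quantities being estimated are conditional moments of the increments: the drift is roughly E[ΔX | X=x]/Δt and the squared diffusion E[ΔX² | X=x]/Δt. A crude binned (Kramers-Moyal style) estimator illustrates this, as a stand-in for, not a reproduction of, the GP method:

```python
def drift_diffusion_binned(x, dt, n_bins=10):
    """Crude binned estimates of drift a(x) ~ E[dX | X=x]/dt and
    squared diffusion b2(x) ~ E[dX^2 | X=x]/dt from one sampled path."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_bins or 1.0
    sums = [[0.0, 0.0, 0] for _ in range(n_bins)]  # [sum dx, sum dx^2, count]
    for x0, x1 in zip(x, x[1:]):
        b = min(int((x0 - lo) / width), n_bins - 1)
        dx = x1 - x0
        sums[b][0] += dx
        sums[b][1] += dx * dx
        sums[b][2] += 1
    centers, drift, diff2 = [], [], []
    for b, (s1, s2, n) in enumerate(sums):
        if n:  # keep only populated bins
            centers.append(lo + (b + 0.5) * width)
            drift.append(s1 / (n * dt))
            diff2.append(s2 / (n * dt))
    return centers, drift, diff2
```

On a noiseless mean-reverting path (x' = -θx, x > 0) the drift estimate is negative in every populated bin and the diffusion estimate is of order dt, as expected; the GP approach replaces these histogram averages with smooth posterior functions.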

  4. Nonparametric estimation of benchmark doses in environmental risk assessment

    Science.gov (United States)

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
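The isotonic backbone of such an approach is the pool-adjacent-violators algorithm (PAVA), which gives the least-squares monotone fit to the observed response rates; the BMD is then the smallest dose whose fitted extra risk reaches the benchmark response. A minimal sketch (not the authors' estimator, which adds large-sample corrections and bootstrap limits):

```python
def pava(y):
    # Pool Adjacent Violators: least-squares nondecreasing fit.
    blocks = []  # each block: [mean, count]
    for yi in y:
        blocks.append([float(yi), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, n2 = blocks.pop()
            m1, n1 = blocks.pop()
            blocks.append([(m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2])
    fit = []
    for m, n in blocks:
        fit.extend([m] * n)
    return fit

def bmd(doses, fitted, bmr=0.1):
    # Extra risk R(d) = (p(d) - p(0)) / (1 - p(0)); return the first
    # (linearly interpolated) dose at which extra risk reaches the BMR.
    p0 = fitted[0]
    risks = [(f - p0) / (1.0 - p0) for f in fitted]
    for i in range(1, len(doses)):
        r0, r1 = risks[i - 1], risks[i]
        if r0 <= bmr <= r1 and r1 > r0:
            return doses[i - 1] + (bmr - r0) / (r1 - r0) * (doses[i] - doses[i - 1])
    return None
```

For instance, observed rates [0.1, 0.3, 0.2, 0.5] at doses 0, 1, 2, 3 are pooled to [0.1, 0.25, 0.25, 0.5], and the BMD at a 10% benchmark response falls between doses 0 and 1.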

  5. A nonparametric test for industrial specialization

    OpenAIRE

    Billings, Stephen B.; Johnson, Erik B.

    2010-01-01

    Urban economists hypothesize that industrial diversity matters for urban growth and development, but metrics for empirically testing this relationship are limited to simple concentration metrics (e.g. location quotient) or summary diversity indices (e.g. Gini, Herfindahl). As shown by recent advances in how we measure localization and specialization, these measures of industrial diversity may be subject to bias under small samples or the Modifiable Areal Unit Problem. Furthermore, empirically...

  6. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test makes it possible to approximate the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.
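The distance measure in such tests is built on a kernel regression estimate of the conditional mean. A minimal Nadaraya-Watson estimator (the standard textbook form, not necessarily the paper's exact specification) is a weighted average with Gaussian-kernel weights:

```python
import math

def nw_regress(x_train, y_train, x, h=0.5):
    # Nadaraya-Watson estimate of E[Y | X = x]:
    # Gaussian-kernel weighted average of the observed y values.
    weights = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in x_train]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, y_train)) / total
```

The bootstrap step then resamples residuals from the fitted regression and recomputes the distance statistic on each resample to approximate its finite-sample distribution. Note the bandwidth h controls the bias-variance trade-off: small h tracks local values, large h shrinks the estimate toward the global mean.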

  7. Simple nonparametric checks for model data fit in CAT

    NARCIS (Netherlands)

    Meijer, R.R.

    2005-01-01

    In this paper, the usefulness of several nonparametric checks is discussed in a computerized adaptive testing (CAT) context. Although there is no tradition of nonparametric scalability in CAT, it can be argued that scalability checks can be useful to investigate, for example, the quality of item

  8. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    Gugushvili, S.; van der Meulen, F.; Spreij, P.

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context,

  9. Why do electricity utilities cooperate with coal suppliers? A theoretical and empirical analysis from China

    International Nuclear Information System (INIS)

    Zhao Xiaoli; Lyon, Thomas P.; Wang Feng; Song Cui

    2012-01-01

    The asymmetry of Chinese coal and electricity pricing reforms leads to serious conflict between coal suppliers and electricity utilities. Electricity utilities experience significant losses as a result of this conflict: severe coal price fluctuations and uncertainty in the quantity and quality of coal supplies. This paper explores whether establishing cooperative relationships between coal suppliers and electricity utilities can resolve these conflicts. We begin with a discussion of the history of coal and electricity pricing reforms, then conduct a theoretical analysis of relational contracting to provide a new perspective on the drivers behind the establishment of cooperative relationships between the two parties. Finally, we empirically investigate the role of cooperative relationships and of the establishment of mine-mouth power plants in the performance of electricity utilities. The results show that relational contracting between electricity utilities and coal suppliers improves the market performance of electricity utilities; meanwhile, the transportation cost savings derived from mine-mouth power plants are of importance in improving that performance. - Highlights: ► We discuss the history of coal and electricity pricing reforms. ► The roots of conflicts between electricity and coal firms are presented. ► We conduct a theoretical analysis of relational contracting. ► The role of mine-mouth power plants in the performance of power firms is examined.

  10. Promoting Sustainability Transparency in European Local Governments: An Empirical Analysis Based on Administrative Cultures

    Directory of Open Access Journals (Sweden)

    Andrés Navarro-Galera

    2017-03-01

    Full Text Available Nowadays, the transparency of governments with respect to the sustainability of public services is an issue of great interest to stakeholders and academics. It has led to previous research, and to international organisations (EU, IMF, OECD, United Nations, IFAC, G-20, World Bank), recommending the promotion of the online dissemination of economic, social and environmental information. Based on previous studies of e-government and the influence of administrative cultures on governmental accountability, this paper seeks to identify political actions useful for improving the practice of transparency on economic, social and environmental sustainability in European local governments. We perform a comparative analysis of sustainability information published on the websites of 72 local governments in 10 European countries grouped into three main cultural contexts (Anglo-Saxon, Southern European and Nordic). Using international sustainability reporting guidelines, our results reveal significant differences in local government transparency across these contexts: the most transparent local governments are the Anglo-Saxon ones, followed by the Southern European and Nordic governments. Based on individualized empirical results for each administrative style, our conclusions propose useful policy interventions to enhance sustainability transparency within each cultural tradition, such as the development of legal rules on transparency and sustainability, tools to motivate local managers toward the online diffusion of sustainability information, and analysis of the information needs of stakeholders.

  11. Combination of canonical correlation analysis and empirical mode decomposition applied to denoising the labor electrohysterogram.

    Science.gov (United States)

    Hassan, Mahmoud; Boudaoud, Sofiane; Terrien, Jérémy; Karlsson, Brynjar; Marque, Catherine

    2011-09-01

    The electrohysterogram (EHG) is often corrupted by electronic and electromagnetic noise as well as movement artifacts, skeletal electromyogram, and ECGs from both mother and fetus. The interfering signals are sporadic and/or have spectra overlapping the spectra of the signals of interest, rendering classical filtering ineffective. In the absence of efficient methods for denoising the monopolar EHG signal, bipolar methods are usually used. In this paper, we propose a novel combination of blind source separation using canonical correlation analysis (BSS_CCA) and empirical mode decomposition (EMD) to denoise the monopolar EHG. We first extract the uterine bursts by using BSS_CCA; then the biggest part of any residual noise is removed from the bursts by EMD. Our algorithm, called CCA_EMD, was compared with wavelet filtering and independent component analysis. We also compared CCA_EMD with the corresponding bipolar signals to demonstrate that the new method does not degrade the signals. The proposed method successfully removed artifacts from the signal without altering the underlying uterine activity as observed by bipolar methods. The CCA_EMD algorithm performed considerably better than the comparison methods.

  12. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    Science.gov (United States)

    Wen-Bo, Wang; Xiao-Dong, Zhang; Yuchan, Chang; Xiang-Li, Wang; Zhao, Wang; Xi, Chen; Lei, Zheng

    2016-01-01

    In this paper, a new method to reduce noise within chaotic signals, based on ICA (independent component analysis) and EMD (empirical mode decomposition), is proposed. The basic idea is first to decompose the chaotic signal and construct multidimensional input vectors on the basis of EMD and its translation invariance; secondly, to perform independent component analysis on the input vectors, which amounts to a self-adapting denoising of the intrinsic mode functions (IMFs) of the chaotic signal; and finally, to compose the denoised chaotic signal from all the IMFs. Experiments were carried out on a Lorenz chaotic signal contaminated with Gaussian noise at different levels and on the monthly observed chaotic sunspot sequence. The results show that the method proposed in this paper is effective in denoising chaotic signals. Moreover, it can correct the center point in the phase space effectively, bringing it closer to the real track of the chaotic attractor. Project supported by the National Science and Technology, China (Grant No. 2012BAJ15B04), the National Natural Science Foundation of China (Grant Nos. 41071270 and 61473213), the Natural Science Foundation of Hubei Province, China (Grant No. 2015CFB424), the State Key Laboratory Foundation of Satellite Ocean Environment Dynamics, China (Grant No. SOED1405), the Hubei Provincial Key Laboratory Foundation of Metallurgical Industry Process System Science, China (Grant No. Z201303), and the Hubei Key Laboratory Foundation of Transportation Internet of Things, Wuhan University of Technology, China (Grant No. 2015III015-B02).

  13. Analysis of empirical determinants of credit risk in the banking sector of the Republic of Serbia

    Directory of Open Access Journals (Sweden)

    Račić Željko

    2016-01-01

    Full Text Available The aim of this paper is the detection and analysis of empirical determinants of credit risk in the banking sector of the Republic of Serbia. The paper is based on an analysis of the results of applying a linear regression model over the period from the third quarter of 2008 to the third quarter of 2014. There are three main findings. Firstly, higher lending activity by banks contributes to an increasing share of high-risk loans in total loans (with a delayed effect of 3 years). Secondly, the growth of loans relative to deposits contributes to the increased exposure of banks to credit risk. Thirdly, the factors that reduce the exposure of banks to credit risk are increased profitability, growth of the interest rate spread, and real GDP growth. Bearing in mind the overall market conditions and the dynamics of the economic recovery of the country, the general conclusion based on these results is that in the coming period the question of non-performing loans (NPLs) in the Republic of Serbia will present a challenge for both lenders and borrowers.

  14. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    Science.gov (United States)

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  15. Empirical Hamiltonians

    International Nuclear Information System (INIS)

    Peggs, S.; Talman, R.

    1987-01-01

    As proton accelerators get larger and include more magnets, the conventional tracking programs which simulate them run slower. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. It is assumed for this method that a conventional program exists which can perform faithful tracking in the lattice under study for some hundreds of turns, with all lattice parameters held constant. An empirical map is then generated by comparison with the tracking program. A procedure has been outlined for determining an empirical Hamiltonian, which can represent motion through many nonlinear kicks, by taking data from a conventional tracking program. Though derived by an approximate method, this Hamiltonian is analytic in form and can be subjected to further analysis of varying degrees of mathematical rigor. Even though the empirical procedure has only been described in one transverse dimension, there is good reason to hope that it can be extended to include two transverse dimensions, so that it can become a more practical tool in realistic cases.

  16. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  17. Modeling ionospheric foF2 by using empirical orthogonal function analysis

    Directory of Open Access Journals (Sweden)

    E. A

    2011-08-01

    Full Text Available A similar-parameters interpolation method and an empirical orthogonal function analysis are used to construct empirical models for the ionospheric foF2, using observational data from three ground-based ionosonde stations in Japan, namely Wakkanai (geographic 45.4° N, 141.7° E), Kokubunji (geographic 35.7° N, 140.1° E) and Yamagawa (geographic 31.2° N, 130.6° E), during the years 1971–1987. The impact of different drivers on ionospheric foF2 can be well indicated by choosing appropriate proxies. It is shown that missing values in the original foF2 data can be optimally refilled using the similar-parameters method. The characteristics of the base functions and associated coefficients of the EOF model are analyzed. The diurnal variation of the base functions reflects the essential nature of ionospheric foF2, while the coefficients represent the long-term trends. The 1st-order EOF coefficient A1 reflects the components with solar cycle variation; A1 also contains an evident semi-annual variation component as well as a relatively weak annual fluctuation component, both of which are less pronounced than the solar cycle variation. The 2nd-order coefficient A2 contains mainly annual variation components. The 3rd-order coefficient A3 and 4th-order coefficient A4 contain both annual and semi-annual variation components; the seasonal variation, solar rotation oscillation and small-scale irregularities are also included in A4. The amplitude range and development tendency of all these coefficients depend on the level of solar and geomagnetic activity. The reliability and validity of the EOF model are verified by comparison with observational data and with the International Reference Ionosphere (IRI). The agreement between observations and the EOF model is quite good, indicating that the EOF model can reflect the major changes and the temporal distribution characteristics of the mid-latitude ionosphere of the

  18. Research on Browsing Behavior in the Libraries: An Empirical Analysis of Consequences, Success and Influences

    Directory of Open Access Journals (Sweden)

    Shan-Ju L. Chang

    2000-12-01

    Full Text Available Browsing, as an important part of human information behavior, has been observed and investigated in the context of information seeking in the library in general, and has assumed greater importance in human-machine interaction in particular. However, the nature and consequences of browsing are not well understood, and little is known of the success rate of such behavior. In this research, exploratory empirical case studies from three types of libraries were conducted, using questionnaires, observation logs, interviews, and computer search logs, to derive the empirical evidence to understand, from the user point of view, what the consequences of browsing are, what constitutes successful browsing, and what factors influence the extent of browsing. Content analysis and statistical analysis were conducted to analyze and synthesize the data. The research results show: (1) There are nine categories of consequences of browsing, including accidental findings, modification of information need, finding the desired information, learning, relaxation/recreation, information gathering, keeping updated, satisfying curiosity, and not finding what is needed. (2) Four factors produce successful browsing: intention, the amount or quality of information, the utility of what is found, and help with solving a problem or making a judgment. (3) There are three types of reasons for unsuccessful browsing experiences: not finding what one wanted, inadequate volume or quality of information, and not finding anything useful or interesting. (4) There are three types of reasons for partial success: finding the intended object but being unhappy with the quality or amount of information in it, not finding what one wanted but discovering new or potentially useful information, and not accomplishing one purpose but achieving another, given multiple purposes. (5) The influential factors that affect the extent to which one engages in browsing include the browser’s time, scheme of information organization, proximity to

  19. Weak Form Efficiency of the Nigerian Stock Market: An Empirical Analysis (1984 – 2009

    Directory of Open Access Journals (Sweden)

    Pyemo Afego

    2012-01-01

    Full Text Available This paper examines the weak-form of the efficient markets hypothesis for the Nigerian Stock Exchange (NSE by testing for random walks in the monthly index returns over the period 1984-2009. The results of the non-parametric runs test show that index returns on the NSE display a predictable component, thus suggesting that traders can earn superior returns by employing trading rules. Statistically significant deviations from randomness are also suggestive of sub-optimal allocation of investment capital within the economy. The findings, in general, contradict the weak-form of the efficient markets hypothesis, and a range of policy strategies for improving the allocative capacity and quality of the information environment of the NSE are discussed.
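The runs test applied here has a simple closed form: classify each return by sign, count the number of runs of identical signs, and standardize against the mean and variance expected under randomness. A minimal sketch (the textbook Wald-Wolfowitz form, with sign-of-return classification as one common convention):

```python
import math

def runs_test_z(returns):
    """Wald-Wolfowitz runs test on the signs of returns.
    Returns the z statistic; |z| large => non-random sign sequence."""
    signs = [1 if r >= 0 else 0 for r in returns]
    n1 = sum(signs)               # count of non-negative returns
    n2 = len(signs) - n1          # count of negative returns
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mean = 2.0 * n1 * n2 / (n1 + n2) + 1
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    return (runs - mean) / math.sqrt(var)
```

A perfectly alternating sequence has far too many runs (large positive z), long streaks give a large negative z, and |z| > 1.96 rejects randomness at the 5% level; under a random walk, index returns should yield z near zero.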

  20. Weak Form Efficiency of the Chittagong Stock Exchange: An Empirical Analysis (2006-2016

    Directory of Open Access Journals (Sweden)

    Shahadat Hussain

    2017-01-01

    Full Text Available We study the random walk behavior of the Chittagong Stock Exchange (CSE) using daily returns of three indices for the period 2006 to 2016, employing both a non-parametric test (the runs test) and parametric tests [the autocorrelation coefficient test and the Ljung–Box (LB) statistic]. The skewness and kurtosis properties of the daily return series are non-normal, suggesting a positively skewed and leptokurtic distribution. The results of the runs test, autocorrelation and Ljung–Box (LB) statistics provide evidence against random walk behavior on the Chittagong Stock Exchange. Overall, our results suggest that the Chittagong Stock Exchange does not exhibit the weak form of efficiency. Hence, there is an opportunity for active investors to generate superior returns.
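The Ljung-Box statistic used alongside the runs test aggregates squared sample autocorrelations over the first h lags; under the white-noise null it is approximately chi-squared with h degrees of freedom. A minimal sketch of the textbook form:

```python
def acf(x, k):
    # Sample autocorrelation of x at lag k.
    n = len(x)
    mean = sum(x) / n
    denom = sum((v - mean) ** 2 for v in x)
    num = sum((x[t] - mean) * (x[t - k] - mean) for t in range(k, n))
    return num / denom

def ljung_box(x, h):
    # Q = n(n+2) * sum_{k=1..h} acf_k^2 / (n - k);
    # compare to a chi-squared(h) critical value.
    n = len(x)
    return n * (n + 2) * sum(acf(x, k) ** 2 / (n - k) for k in range(1, h + 1))
```

For a perfectly alternating series the lag-1 autocorrelation is near -1 and Q is far beyond the 5% chi-squared critical value (3.84 for h = 1), correctly rejecting the white-noise null.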

  1. 6 essays about auctions: a theoretical and empirical analysis. Application to power markets

    International Nuclear Information System (INIS)

    Lamy, L.

    2007-06-01

    This thesis is devoted to a theoretical and empirical analysis of auction mechanisms. Motivated by allocation issues in network industries, in particular by the liberalization of the electricity sector, it focuses on auctions with externalities (either allocative or informational) and on multi-object auctions. After an introduction which provides a survey of the use and the analysis of auctions in power markets, six chapters make up this thesis. The first considers standard auctions in Milgrom-Weber's model with interdependent valuations when the seller cannot commit not to participate in the auction. The second and third chapters study the combinatorial auction mechanism proposed by Ausubel and Milgrom. The first of these two studies proposes a modification of this format with a final discount stage and clarifies the theoretical status of these formats, in particular the conditions under which truthful reporting is a dominant strategy. Motivated by the robustness issues of the generalizations of the Ausubel-Milgrom and the Vickrey combinatorial auctions to environments with allocative externalities between joint purchasers, the second characterizes the buyer-submodularity condition in a general model with allocative identity-dependent externalities between purchasers. In a complete-information setup, the fourth chapter analyses the optimal design problem when the commitment abilities of the principal are reduced, namely when she cannot commit to a simultaneous participation game. The fifth chapter is devoted to the structural analysis of the private value auction model for a single unit when the econometrician cannot observe bidders' identities. The asymmetric independent private value (IPV) model is identified, and a multi-step kernel-based estimator is proposed and shown to be asymptotically optimal. Using auction data for the Anglo-French electricity Interconnector, the last chapter analyses multi-unit ascending auctions through reduced forms. (author)

  2. Motivational factors influencing the homeowners’ decisions between residential heating systems: An empirical analysis for Germany

    International Nuclear Information System (INIS)

    Michelsen, Carl Christian; Madlener, Reinhard

    2013-01-01

    Heating demand accounts for a large fraction of the overall energy demand of private households in Germany. A better understanding of the adoption and diffusion of energy-efficient and renewables-based residential heating systems (RHS) is of high policy relevance, particularly against the background of climate change, security of energy supply and increasing energy prices. In this paper, we explore the multi-dimensionality of the homeowners’ motivation to decide between competing RHS. A questionnaire survey (N=2440) conducted in 2010 among homeowners who had recently installed a RHS provides the empirical foundation. Principal component analysis shows that 25 items capturing different adoption motivations can be grouped around six dimensions: (1) cost aspects, (2) general attitude towards the RHS, (3) government grant, (4) reactions to external threats (i.e., environmental or energy supply security considerations), (5) comfort considerations, and (6) influence of peers. Moreover, a cluster analysis with the identified motivational factors as segmentation variables reveals three adopter types: (1) the convenience-oriented, (2) the consequences-aware, and (3) the multilaterally-motivated RHS adopter. Finally, we show that the influence of the motivational factors on the adoption decision also differs by certain characteristics of the homeowner and features of the home. - Highlights: ► Study of the multi-dimensionality of the motivation to adopt residential heating systems (RHS). ► Principal component and cluster analysis are applied to representative survey data for Germany. ► Motivation has six dimensions, including rational decision-making and emotional factors. ► Adoption motivation differs by certain characteristics of the homeowner and of the home. ► Many adopters are driven by existing habits and perceptions about the convenience of the RHS

  3. Liquidity Risk Management: An Empirical Analysis on Panel Data Analysis and ISE Banking Sector

    OpenAIRE

    Sibel ÇELİK; Yasemin Deniz AKARIM

    2012-01-01

    In this paper, we test the factors affecting liquidity risk management in the banking sector in Turkey by using panel regression analysis. We use data for 9 commercial banks traded on the Istanbul Stock Exchange for the period 1998-2008. In conclusion, we find that the risky liquid assets and return on equity variables are negatively related to liquidity risk, whereas the external financing and return on assets variables are positively related to liquidity risk. This finding is of importance for banks s...

  4. Bayesian nonparametric adaptive control using Gaussian processes.

    Science.gov (United States)

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become ineffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.

  5. On the relationship between fiscal plans in the European Union: An empirical analysis based on real-time data

    NARCIS (Netherlands)

    Giuliodori, M.; Beetsma, R.M.W.J.

    2007-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public

  6. On the relationship between fiscal plans in the European Union: an empirical analysis based on real-time data

    NARCIS (Netherlands)

    Giuliodori, M.; Beetsma, R.

    2008-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public

  7. Antecedents and Consequences of Individual Performance Analysis of Turnover Intention Model (Empirical Study of Public Accountants in Indonesia)

    OpenAIRE

    Raza, Hendra; Maksum, Azhar; Erlina; Lumban Raja, Prihatin

    2014-01-01

    This study aims to examine empirically the antecedents of individual performance and its consequences for turnover intention in public accounting firms. Eight variables are measured, consisting of auditors' empowerment, innovation professionalism, role ambiguity, role conflict, organizational commitment, individual performance and turnover intention. Data analysis is based on 163 public accountants using Structural Equation Modeling assisted with an appli...

  8. Characterizing Social Interaction in Tobacco-Oriented Social Networks: An Empirical Analysis.

    Science.gov (United States)

    Liang, Yunji; Zheng, Xiaolong; Zeng, Daniel Dajun; Zhou, Xingshe; Leischow, Scott James; Chung, Wingyan

    2015-06-19

    Social media is becoming a new battlefield for tobacco "wars". Evaluating the current situation is crucial for the advocacy of tobacco control in the age of social media. To reveal the impact of tobacco-related user-generated content, this paper characterizes user interaction and social influence utilizing social network analysis and information theoretic approaches. Our empirical studies demonstrate that the exploding pro-tobacco content has long-lasting effects, with more active users and broader influence, and reveal the shortage of social media resources in global tobacco control. We find that user interaction in the pro-tobacco group is more active, and that user-generated content for tobacco promotion is more successful in obtaining user attention. Furthermore, we construct three tobacco-related social networks and investigate their topological patterns. We find that the size of the pro-tobacco network overwhelms the others, which suggests a huge number of users are exposed to pro-tobacco content. These results indicate that the gap between tobacco promotion and tobacco control is widening and that tobacco control may be losing ground to tobacco promotion in social media.

  9. An empirical analysis of the impact of renewable energy deployment on local sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Del Rio, Pablo [Institute for Public Goods and Policies (IPP), Centro de Ciencias Humanas y Sociales, Consejo Superior de Investigaciones Cientificas (CSIC), C/Albasanz 26-28, 28037 Madrid (Spain); Burguillo, Mercedes [Facultad de Ciencias Economicas y Empresariales, Universidad de Alcala, Pza. de la Victoria 3, 28802 Alcala de Henares, Madrid (Spain)

    2009-08-15

    It is usually mentioned that renewable energy sources (RES) have a large potential to contribute to the sustainable development of specific territories by providing them with a wide variety of socioeconomic benefits, including diversification of energy supply, enhanced regional and rural development opportunities, creation of a domestic industry and employment opportunities. The analysis of these benefits has usually been too general (i.e., mostly at the national level) and a focus on the regional and especially the local level has been lacking. This paper empirically analyses those benefits, by applying a conceptual and methodological framework previously developed by the authors to three renewable energy technologies in three different places in Spain. With the help of case studies, the paper shows that the contribution of RES to the economic and social dimensions of sustainable development might be significant. Particularly important is employment creation in these areas. Although, in absolute terms, the number of jobs created may not be high, it may be so with respect to the existing jobs in the areas considered. Socioeconomic benefits depend on several factors, and not only on the type of renewable energy, as has usually been mentioned. The specific socioeconomic features of the territories, including the productive structure of the area, the relationships between the stakeholders and the involvement of the local actors in the renewable energy project may play a relevant role in this regard. Furthermore, other local (socioeconomic) sustainability aspects beyond employment creation should be considered. (author)

  10. Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification

    Directory of Open Access Journals (Sweden)

    Yajie Zou

    2017-01-01

    Full Text Available Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in using the Empirical Bayes (EB) method to estimate safety from both observed crash records and predicted crash frequency based on similar sites. The performance of the EB method is highly related to the selection of a reference group of sites (i.e., roadway segments or intersections) similar to the target site, from which the safety performance functions (SPFs) used to predict crash frequency will be developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this possible heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select "similar" sites for building SPFs are developed. Performance of the clustering-based EB methods is then compared using real crash data. Here, HSID results, when computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferred over the conventional statistical methods. Thus, properly classifying the road segments for heterogeneous crash data can further improve HSID accuracy.
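    The clustering-then-EB pipeline can be sketched in a few lines. Everything below is synthetic and deliberately simplified: a 1-D k-means on traffic volume stands in for the clustering step, a cluster mean stands in for the SPF prediction, and the negative-binomial dispersion parameter is an assumed value, not one estimated from data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# synthetic sites drawn from two latent subpopulations (low/high volume)
grp = rng.integers(0, 2, n)
aadt = np.where(grp == 0, rng.normal(5e3, 500, n), rng.normal(2e4, 2000, n))
crashes = rng.poisson(np.where(grp == 0, 2.0, 8.0)).astype(float)

# simple 1-D k-means (k=2) on traffic volume to form reference groups
c = np.array([aadt.min(), aadt.max()], float)
for _ in range(20):
    lab = np.abs(aadt[:, None] - c[None, :]).argmin(1)
    c = np.array([aadt[lab == k].mean() for k in (0, 1)])

phi = 2.0  # assumed NB overdispersion parameter
eb = np.empty(n)
for k in (0, 1):
    mu = crashes[lab == k].mean()      # cluster-level "SPF" prediction
    w = 1.0 / (1.0 + mu / phi)         # EB shrinkage weight
    eb[lab == k] = w * mu + (1 - w) * crashes[lab == k]

hotspots = np.argsort(eb)[-10:]        # top-ranked candidate hotspots
```

    The EB estimate shrinks each site's observed count toward its reference group's expectation, so sites with high counts in an otherwise low-crash cluster rank differently than under naive observed-count ranking.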

  11. Dynamic analysis of interhospital collaboration and competition: empirical evidence from an Italian regional health system.

    Science.gov (United States)

    Mascia, Daniele; Di Vincenzo, Fausto; Cicchetti, Americo

    2012-05-01

    Policymakers stimulate competition in universalistic health-care systems while encouraging the formation of service provision networks among hospital organizations. This article addresses a gap in the extant literature by empirically analyzing simultaneous collaboration and competition between hospitals within the Italian National Health Service, where important procompetition reforms have been implemented. To explore how rising competition between hospitals relates to their propensity to collaborate with other local providers. Longitudinal data on interhospital collaboration and competition collected in an Italian region from 2003 to 2007 are analyzed. Social network analysis techniques are applied to study the structure and dynamics of interhospital collaboration. Negative binomial regressions are employed to explore how interhospital competition relates to the collaborative network over time. Competition among providers does not hinder interhospital collaboration. Collaboration is primarily local, with resource complementarity and differentials in the volume of activity and hospital performance explaining the propensity to collaborate. Formation of collaborative networks among hospitals is not hampered by reforms aimed at fostering market forces. Because procompetition reforms elicit peculiar forms of managed competition in universalistic health systems, studies are needed to clarify whether the positive association between interhospital competition and collaboration can be generalized to other health-care settings. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. Review of US ESCO industry market trends: an empirical analysis of project data

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, C.A.; Hopper, N.C.; Osborn, J.G. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States). Energy Analysis

    2005-02-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ~1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m²/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers. (author)

  13. The Determinants of the Global Mobile Telephone Deployment: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Sheikh Taher ABU

    2010-01-01

    Full Text Available This study aims to analyze the global diffusion of mobile phones by examining the instruments stimulating the diffusion pattern. A rigorous demand model is estimated using a global mobile telecommunications panel dataset comprising 51 countries, classified according to World Bank income categories, from 1990-2007. In particular, the paper examines what factors contribute the most to the deployment of mobile telephones globally. To construct an econometric model, the number of subscribers to mobile phones per 100 inhabitants is taken as the dependent variable, while the following groups of variables are selected as independent variables: (1) GDP per capita income and charges, (2) competition policies, (3) telecom infrastructure, (4) technological innovations, and (5) others. Estimation results report the presence of substantial disparity among income groups. Additionally, income elasticity of GDP per capita and own-price elasticities for call rates and subscription charges are reported. The analysis of impulse responses for price, competition policies, and technological innovations such as digitalization of the mobile network and mobile network coverage indicates that substantial mobile telephone growth is yet to be realized, especially in developing countries. A new and important empirical finding is that there are still many opportunities for mobile phone development in the world's pro-poor nations by providing better telecom infrastructure.
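    Income and own-price elasticities in a demand model of this kind are typically read off a log-log regression, where each coefficient is directly the elasticity. A minimal sketch on synthetic data (the "true" elasticities 0.8 and -0.6, the sample size, and the variable names are made up for illustration; a real panel estimation would add country and year effects):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
# synthetic observations: log subscriptions driven by income and price
log_gdp = rng.normal(9.0, 1.0, n)     # log GDP per capita
log_price = rng.normal(2.0, 0.5, n)   # log call/subscription charge
log_sub = 0.8 * log_gdp - 0.6 * log_price + rng.normal(0, 0.1, n)

# log-log OLS: slope coefficients are income and own-price elasticities
X = np.column_stack([np.ones(n), log_gdp, log_price])
beta, *_ = np.linalg.lstsq(X, log_sub, rcond=None)
income_elast, price_elast = beta[1], beta[2]
```

    An income elasticity near 0.8 would say a 1% rise in GDP per capita raises subscriptions by about 0.8%; a negative own-price elasticity captures the demand-dampening effect of charges.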

  14. Operational Practices and Financial Performance: an Empirical Analysis of Brazilian Manufacturing Companies

    Directory of Open Access Journals (Sweden)

    André Luís de Castro Moura Duarte

    2011-10-01

    Full Text Available In the operations management field, operational practices like total quality management or just in time have been seen as a way to improve operational performance and, ultimately, financial performance. Empirical support for this effect of operational practices on financial performance has been, however, limited due to research design and the inherent difficulties of using performance as a dependent variable. In this paper, we tested the relationship between selected operational practices (quality management, just in time, ISO certification and services outsourcing) and the financial performance outcomes of profitability and growth. A sample of 1200 firms, operating in São Paulo, Brazil, was used. Analysis using multiple regression explored the direct effect of practices and their interaction with industry dummies. Results did not support the existence of a positive relationship with financial performance. A negative relationship of outsourcing with both profitability and growth was found, supporting some critical views of the outsourcing practice. A weaker negative relationship between ISO certification and growth was also found. Some interactions between practices and industries were also significant, with mixed results, indicating that the effect of practices on performance might be context dependent.

  15. Investigating properties of the cardiovascular system using innovative analysis algorithms based on ensemble empirical mode decomposition.

    Science.gov (United States)

    Yeh, Jia-Rong; Lin, Tzu-Yu; Chen, Yun; Sun, Wei-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2012-01-01

    The cardiovascular system is known to be nonlinear and nonstationary. Traditional linear algorithms for assessing arterial stiffness and systemic resistance of the cardiac system either suffer from nonstationarity or are inconvenient in practical applications. In this pilot study, two new assessment methods were developed: the first is an ensemble empirical mode decomposition based reflection index (EEMD-RI), while the second is based on the phase shift between ECG and BP in the cardiac oscillation. Both methods utilise the EEMD algorithm, which is suitable for nonlinear and nonstationary systems. These methods were used to investigate the arterial stiffness and systemic resistance of a pig's cardiovascular system via ECG and blood pressure (BP). The experiment simulated a sequence of continuous changes of blood pressure, from a steady condition to high blood pressure by clamping the artery, and the inverse by relaxing the artery. The hypothesis was that arterial stiffness and systemic resistance should vary with blood pressure due to clamping and relaxing the artery. The results show statistically significant correlations between BP, the EEMD-based RI, and the phase shift between ECG and BP in the cardiac oscillation. The two assessment results demonstrate the merits of EEMD for signal analysis.

  16. Review of US ESCO industry market trends: an empirical analysis of project data

    International Nuclear Information System (INIS)

    Goldman, Charles A.; Hopper, Nicole C.; Osborn, Julie G.

    2005-01-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ∼1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m²/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers

  17. Review of US ESCO industry market trends: an empirical analysis of project data

    International Nuclear Information System (INIS)

    Goldman, C.A.; Hopper, N.C.; Osborn, J.G.

    2005-01-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ∼1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m²/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers. (author)

  18. An empirical analysis of strategy implementation process and performance of construction companies

    Science.gov (United States)

    Zaidi, F. I.; Zawawi, E. M. A.; Nordin, R. M.; Ahnuar, E. M.

    2018-02-01

    Strategy implementation is known as the action stage and is considered the most difficult stage of strategic planning. Strategy implementation can influence the whole texture of a company, including its performance. The aim of this research is to provide empirical evidence on the relationship between the strategy implementation process and the performance of construction companies. This research establishes the strategy implementation process and how it influences the performance of construction companies. A quantitative approach was used, via a questionnaire survey of G7 construction companies in Klang Valley, Selangor. Pearson correlation analysis indicates a strong positive relationship between the strategy implementation process and construction companies' performance. The most important part of the strategy implementation process is providing sufficient training for employees, which directly influences the companies' profit growth and employee growth. These results will benefit top management in construction companies conducting strategy implementation. This research may not reflect the whole construction industry in Malaysia; future research may extend to small and medium grade contractors and to other areas in Malaysia.

  19. Empirical analysis of online social networks in the age of Web 2.0

    Science.gov (United States)

    Fu, Feng; Liu, Lianghuan; Wang, Long

    2008-01-01

    Today the World Wide Web is undergoing a subtle but profound shift to Web 2.0, becoming more of a social web. The use of collaborative technologies such as blogs and social networking sites (SNS) leads to instant online communities in which people communicate rapidly and conveniently with each other. Moreover, there is growing interest in and concern regarding the topological structure of these new online social networks. In this paper, we present an empirical analysis of the statistical properties of two important Chinese online social networks: a blogging network and an SNS open to college students, both emerging in the age of Web 2.0. We demonstrate that both networks possess the small-world and scale-free features already observed in real-world and artificial networks. In addition, we investigate the distribution of topological distance. Furthermore, we study the correlations between degree (in/out) and degree (in/out), clustering coefficient and degree, and popularity (in terms of number of page views) and in-degree (for the blogging network), respectively. We find that the blogging network shows a disassortative mixing pattern, whereas the SNS network is an assortative one. Our research may help to elucidate the self-organizing structural characteristics of these online social networks embedded in technical forms.
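    Degree assortativity, the mixing pattern referred to above, is just the Pearson correlation of endpoint degrees taken over edges: positive when high-degree nodes link to each other (assortative), negative when hubs link mostly to low-degree nodes (disassortative). A toy sketch on a hand-built hub-and-leaf graph, not the paper's data:

```python
import numpy as np

# toy undirected graph: two hubs, each attached to many leaves, plus a
# single hub-hub link (so most edges join a high-degree and a low-degree node)
edges = [(0, i) for i in range(1, 20)] + [(20, i) for i in range(21, 40)] + [(0, 20)]

deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1

# degree assortativity = Pearson correlation of endpoint degrees over edges;
# each undirected edge is counted in both orientations
a = np.array([[deg[u], deg[v]] for u, v in edges], float)
both = np.concatenate([a, a[:, ::-1]])
r = np.corrcoef(both[:, 0], both[:, 1])[0, 1]
# hubs connect mostly to leaves, so r comes out negative (disassortative)
```

    On real data one would use a graph library's built-in coefficient, but the definition above is all that is involved.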

  20. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    Science.gov (United States)

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, an alternative implementation to the algorithmic definition of the so-called "sifting process" used in Huang's original EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions of EMD appear to perform poorly and are very time-consuming. So in this paper, an extension of the PDE-based approach to 2-D space is extensively described. This approach has been applied to both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results have been provided in the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
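    For reference, the algorithmic "sifting process" that the PDE formulation replaces can be sketched in 1-D: interpolate the local maxima and minima with cubic splines, subtract the envelope mean, and repeat until the first intrinsic mode function (IMF) remains. This is a minimal sketch with a fixed iteration count in place of the usual stopping criterion, and a synthetic two-tone signal:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x, t):
    # one sifting pass: subtract the mean of the upper and lower envelopes
    mx = argrelextrema(x, np.greater)[0]
    mn = argrelextrema(x, np.less)[0]
    upper = CubicSpline(t[mx], x[mx])(t)
    lower = CubicSpline(t[mn], x[mn])(t)
    return x - 0.5 * (upper + lower)

t = np.linspace(0.0, 1.0, 1000)
# fast 25 Hz tone riding on a slow 3 Hz tone
x = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)

h = x.copy()
for _ in range(10):   # fixed number of sifting iterations (simplification)
    h = sift_once(h, t)
imf1 = h              # first IMF: approximately the fast component
```

    Spline extrapolation near the signal ends produces the well-known edge artifacts; handling those, and replacing this iteration with a diffusion PDE, is precisely where the paper's contribution lies.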

  1. Security Vulnerability Profiles of Mission Critical Software: Empirical Analysis of Security Related Bug Reports

    Science.gov (United States)

    Goseva-Popstojanova, Katerina; Tyo, Jacob

    2017-01-01

    While some prior research work exists on the characteristics of software faults (i.e., bugs) and failures, very little work has been published on the analysis of software application vulnerabilities. This paper aims to contribute towards filling that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them in specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what were the dominating vulnerability classes. Our main findings include: (1) In the IVV issues datasets the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed 80% to 90% of the vulnerabilities in each dataset.

  2. Review of U.S. ESCO industry market trends: An empirical analysis of project data

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles A.; Hopper, Nicole C.; Osborn, Julie G.; Singer, Terry E.

    2003-03-01

    This article summarizes a comprehensive empirical analysis of U.S. Energy Service Company (ESCO) industry trends and performance. We employ two parallel analytical approaches: a comprehensive survey of firms to estimate total industry size and a database of ~1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m²/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling "energy solutions," with energy efficiency part of a package. We conclude that a private sector energy-efficiency services industry that targets large commercial and industrial customers is viable and self-sustaining with appropriate policy support, both financial and non-financial.

  3. Stakeholders of Voluntary Forest Carbon Offset Projects in China: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Derong Lin

    2015-01-01

    Full Text Available Climate change is one of the defining challenges facing the planet. Voluntary forest carbon offset projects, which have the potential to boost forest carbon storage and mitigate global warming, have aroused global concern. The objective of this paper is to model the game situation and analyze the game behaviors of stakeholders of voluntary forest carbon offset projects in China. A stakeholder model and a Power-Benefit Matrix are constructed to analyze the roles, behaviors, and conflicts of stakeholders including farmers, planting entities, communities, government, and the China Green Carbon Foundation. The empirical analysis results show that although the stakeholders have diverse interests and different goals, a win-win solution is still possible through their joint participation and compromise in the voluntary forest carbon offset project. A broad governance structure has been constructed that emphasizes benefit balance, equality, and information exchange and is regulated by all stakeholders. It facilitates agreement among stakeholders with conflicting or different interests. The joint participation of stakeholders in voluntary forest carbon offset projects might change government-dominated afforestation/reforestation into a market, where all participants including government are encouraged to cooperate with each other to improve the condition of fund shortage and low efficiency.

  4. Identifying the oil price-macroeconomy relationship. An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    Oladosu, Gbadebo

    2009-01-01

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate - WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses. (author)

  5. Anterior temporal face patches: A meta-analysis and empirical study

    Directory of Open Access Journals (Sweden)

    Rebecca J. Von Der Heide

    2013-02-01

    Full Text Available Studies of nonhuman primates have reported face sensitive patches in the ventral anterior temporal lobes (ATL. In humans, ATL resection or damage causes an associative prosopagnosia in which face perception is intact but face memory is compromised. Some fMRI studies have extended these findings using famous and familiar faces. However, it is unclear whether these regions in the human ATL are in locations comparable to those reported in non-human primates, typically using unfamiliar faces. We present the results of two studies of person memory: a meta-analysis of existing fMRI studies and an empirical fMRI study using optimized imaging parameters. Both studies showed left-lateralized ATL activations to familiar individuals while novel faces activated the right ATL. Activations to famous faces were quite ventral, similar to what has been reported in monkeys. These findings suggest that face memory-sensitive patches in the human ATL are in the ventral/polar ATL.

  6. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Forester, John A.; Bye, Andreas; Dang, Vinh N.; Lois, Erasmia

    2010-01-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to 'translate' the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  7. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  8. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Directory of Open Access Journals (Sweden)

    Rianne M. Bijlsma

    2011-03-01

    Full Text Available Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how actors' ways of handling uncertainty influence policy development. Our findings suggest that the wider frame adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. First, the participatory policy was less adaptive than the expert-based policy: participants' observed low tolerance for process uncertainty made them opt for a rigorous "once and for all" settling of the conflict. Second, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This

  9. EMPIRICAL ANALYSIS OF CRISIS MANAGEMENT PRACTICES IN TOURISM ENTERPRISES IN TERMS OF ORGANIZATIONAL LEARNING

    Directory of Open Access Journals (Sweden)

    Gülsel ÇİFTÇİ

    2017-04-01

    Full Text Available In this study, we empirically analyze the crisis management practices of tourism enterprises in terms of their effects on organizational learning. The aim is to determine how hotels in Turkey responded to previous crises, what kinds of precautions were taken, and whether these crises taught anything regarding the operation and performance of the enterprise. The study also aims to contribute to the related literature and to offer suggestions that will guide businesses and future studies. Within this context, taking account of the October 2016 data of the Ministry of Culture and Tourism of Turkey, the study covers Antalya, which has the most certified 5-star hotels and the highest bed capacity and number of rooms in the resort category, and Istanbul, which has the most certified 5-star hotels and the highest bed capacity and number of rooms in the urban hotel category. It was decided to conduct this study on hotels in view of the effects of the tourism industry on the world economy. A comprehensive literature review was conducted in the first and second parts of this three-part study: the concept of crisis management in enterprises was examined and applications in tourism enterprises were discussed. The last part of the study contains information on testing and analyzing hypotheses. The data obtained from the questionnaires were analyzed in the SPSS (Statistical Package for the Social Sciences) and LISREL (LInear Structural RELationships) programs. A Pearson correlation analysis was conducted to examine the relationship between

  10. Cycling empirical antibiotic therapy in hospitals: meta-analysis and models.

    Directory of Open Access Journals (Sweden)

    Pia Abel zur Wiesch

    2014-06-01

    Full Text Available The rise of resistance together with the shortage of new broad-spectrum antibiotics underlines the urgency of optimizing the use of available drugs to minimize disease burden. Theoretical studies suggest that coordinating empirical usage of antibiotics in a hospital ward can contain the spread of resistance. However, theoretical and clinical studies have come to different conclusions regarding the usefulness of rotating first-line therapy (cycling). Here, we performed a quantitative pathogen-specific meta-analysis of clinical studies comparing cycling to standard practice. We searched PubMed and Google Scholar and identified 46 clinical studies addressing the effect of cycling on nosocomial infections, of which 11 met our selection criteria. We employed a method for multivariate meta-analysis using incidence rates as endpoints and find that cycling reduced the incidence rate per 1000 patient days of both total infections by 4.95 [9.43-0.48] and resistant infections by 7.2 [14.00-0.44]. This positive effect was observed in most pathogens despite a large variance between individual species. Our findings remain robust in uni- and multivariate meta-regressions. We used theoretical models that reflect various infections and hospital settings to compare cycling to random assignment to different drugs (mixing). We make the realistic assumption that therapy is changed when first-line treatment is ineffective, which we call "adjustable cycling/mixing". In concordance with earlier theoretical studies, we find that in strict regimens, cycling is detrimental. However, in adjustable regimens single resistance is suppressed and cycling is successful in most settings. Both a meta-regression and our theoretical model indicate that "adjustable cycling" is especially useful to suppress the emergence of multiple resistance. While our model predicts that cycling periods of one month perform well, we expect that overly long cycling periods are detrimental. Our results suggest that
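The inverse-variance pooling behind an incidence-rate meta-analysis like this one can be sketched in a few lines; the per-study rate differences and standard errors below are illustrative stand-ins, not values extracted from the eleven included studies:

```python
import numpy as np

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance-weighted (fixed-effect) pooled estimate and its SE."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
    pooled = np.sum(weights * estimates) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled, pooled_se

# Hypothetical per-study differences in incidence rate per 1000 patient days
# (cycling minus control); negative values favour cycling.
diffs = [-6.0, -3.5, -5.2, -4.1]
ses = [2.0, 1.5, 2.5, 1.8]

est, se = pool_fixed_effect(diffs, ses)
ci = (est - 1.96 * se, est + 1.96 * se)   # 95% confidence interval
```

With these toy inputs the pooled difference is negative and its 95% interval excludes zero, mirroring the direction of the reported effect; a random-effects variant would additionally model the between-study variance the abstract mentions.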

  11. Risk and protective factors of internet addiction: a meta-analysis of empirical studies in Korea.

    Science.gov (United States)

    Koo, Hoon Jung; Kwon, Jung-Hye

    2014-11-01

    A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and pathological, problematic, and excessive Internet use. Only original research papers using Korean samples published from 1999 to 2012 and officially reviewed by peers were included for analysis. Ninety-five studies meeting the inclusion criteria were identified. The magnitude of the overall effect size of the intrapersonal variables associated with Internet addiction was significantly higher than that of the interpersonal variables. Specifically, IA demonstrated a medium to strong association with "escape from self" and "self-identity" as self-related variables. "Attention problem", "self-control", and "emotional regulation" as control- and regulation-related variables; "addiction and absorption traits" as temperament variables; "anger" and "aggression" as emotion and mood variables; and "negative stress coping" as coping variables were also associated with comparably large effect sizes. Contrary to our expectation, the magnitudes of the correlations between relational ability and quality, parental relationships and family functionality, and IA were found to be small. The strength of the association between IA and the risk and protective factors was found to be higher in younger age groups. The findings highlight the need for closer examination of psychosocial factors, especially intrapersonal variables, when assessing high-risk individuals and designing intervention strategies for both general IA and Internet game addiction.
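A standard way to aggregate correlation-type effect sizes across studies, as in this meta-analysis, is to pool them on Fisher's z scale with inverse-variance (sample-size) weights; the correlations and sample sizes below are hypothetical:

```python
import numpy as np

def pool_correlations(rs, ns):
    """Fixed-effect pooling of Pearson correlations via Fisher's z transform."""
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    zs = np.arctanh(rs)            # Fisher z for each study
    weights = ns - 3.0             # inverse variance of z is 1/(n - 3)
    z_bar = np.sum(weights * zs) / np.sum(weights)
    return np.tanh(z_bar)          # back-transform to the r scale

# Hypothetical correlations between one intrapersonal variable and IA scores,
# with per-study sample sizes.
r_pooled = pool_correlations([0.42, 0.35, 0.50], [120, 200, 150])
```

The back-transformed pooled r is then interpreted against conventional small/medium/large effect-size benchmarks, as the abstract does.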

  12. Risk and Protective Factors of Internet Addiction: A Meta-Analysis of Empirical Studies in Korea

    Science.gov (United States)

    Koo, Hoon Jung

    2014-01-01

    Purpose A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Materials and Methods Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and pathological, problematic, and excessive Internet use. Only original research papers using Korean samples published from 1999 to 2012 and officially reviewed by peers were included for analysis. Ninety-five studies meeting the inclusion criteria were identified. Results The magnitude of the overall effect size of the intrapersonal variables associated with Internet addiction was significantly higher than that of the interpersonal variables. Specifically, IA demonstrated a medium to strong association with "escape from self" and "self-identity" as self-related variables. "Attention problem", "self-control", and "emotional regulation" as control- and regulation-related variables; "addiction and absorption traits" as temperament variables; "anger" and "aggression" as emotion and mood variables; and "negative stress coping" as coping variables were also associated with comparably large effect sizes. Contrary to our expectation, the magnitudes of the correlations between relational ability and quality, parental relationships and family functionality, and IA were found to be small. The strength of the association between IA and the risk and protective factors was found to be higher in younger age groups. Conclusion The findings highlight the need for closer examination of psychosocial factors, especially intrapersonal variables, when assessing high-risk individuals and designing intervention strategies for both general IA and Internet game addiction. PMID:25323910

  13. Self-image and Missions of Universities: An Empirical Analysis of Japanese University Executives

    Directory of Open Access Journals (Sweden)

    Masataka Murasawa

    2014-05-01

    Full Text Available As universities in Japan gain institutional autonomy in managing internal organizations, independent of governmental control as a result of deregulation and decentralizing reforms, it is becoming increasingly important that the executives and administrators of each institution demonstrate a clear and strategic vision to external stakeholders in order to maintain financially robust operations and the attractiveness of their institutions. This paper considers whether and how the self-image, mission, and vision of universities are perceived and internalized by the management of Japanese universities and empirically examines the determinants shaping such individual perceptions. Our descriptive analysis indicates that, in the view of university executives in Japan, the recent government policy to internationalize domestic universities has not shown much progress; instead, these universities increasingly emphasize serving local needs in research and teaching. Individual perceptions among Japanese university executives with regard to the missions and functional roles to be played by their institutions are influenced by managerial rank as well as by the field of their academic training. A multiple regression analysis reveals that the economies of scale brought about by expanded undergraduate enrollment gradually diminish, dampening executive perceptions with regard to establishing a globally recognized status in research and teaching. Moreover, Japanese universities with a small proportion of graduate student enrollment likely opted out of the competition for greater recognition in the global higher education community between 2005 and 2012. Finally, the management of universities granted the same amount of external research funds in both studied years responded more passively in 2012 than in 2005 on the self-assessment of whether they had established a status as a global

  14. An empirical study to determine the critical success factors of export industry

    Directory of Open Access Journals (Sweden)

    Masoud Babakhani

    2011-01-01

    Full Text Available Exporting goods and services plays an important role in the economies of developing countries. Many countries' economies are based solely on exporting raw materials such as oil and gas, and many believe that countries cannot develop their economies as long as they rely on exporting a single group of raw materials. There is therefore a need to help other industrial sectors build a good infrastructure for exporting diversified products. In this paper, we perform an empirical analysis to determine the critical success factors in exporting different goods. The results are analyzed using nonparametric statistical methods, and some useful guidelines are suggested.
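A typical nonparametric comparison of success-factor scores across groups of exporting firms is the Kruskal-Wallis test. The sketch below computes the H statistic from ranks in pure NumPy (tie-free, hypothetical scores for three groups; a full implementation would also apply a tie correction):

```python
import numpy as np

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; assumes untied data)."""
    data = np.concatenate(groups)
    n = data.size
    ranks = np.empty(n)
    ranks[np.argsort(data)] = np.arange(1, n + 1)   # rank 1 = smallest value
    h = 0.0
    start = 0
    for g in groups:
        r = ranks[start:start + len(g)]             # ranks of this group
        h += r.sum() ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# Hypothetical export-success scores for three groups of firms
a = [3.1, 4.2, 5.0, 6.3]
b = [2.0, 2.5, 3.3, 3.8]
c = [6.1, 7.0, 7.4, 8.2]
H = kruskal_h(a, b, c)
```

Comparing H against the chi-square critical value with k − 1 degrees of freedom (5.991 at the 5% level for three groups) gives the test decision.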

  15. Voluntary Participation in Community Economic Development in Canada: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Laura Lamb

    2011-01-01

    Full Text Available This article is an empirical analysis of an individual's decision to participate in community economic development (CED) initiatives in Canada. The objective of the analysis is to better understand how individuals decide to volunteer time toward CED initiatives and to determine whether the determinants of participation in CED are unique when compared with those of participation in volunteer activities in general. The dataset employed is Statistics Canada's 2004 Canada Survey of Giving, Volunteering and Participating (CSGVP). To date, there has been no econometric analysis of the decision to participate in community economic development initiatives in Canada. Results suggest a role for both public policymakers and practitioners in influencing participation in CED.

  16. Analysis of surface soil moisture patterns in agricultural landscapes using Empirical Orthogonal Functions

    Directory of Open Access Journals (Sweden)

    W. Korres

    2010-05-01

    Full Text Available Soil moisture is one of the fundamental variables in hydrology, meteorology and agriculture. Nevertheless, its spatio-temporal patterns in agriculturally used landscapes that are affected by multiple natural (rainfall, soil, topography, etc.) and agronomic (fertilisation, soil management, etc.) factors are often not well known. The aim of this study is to determine the dominant factors governing the spatio-temporal patterns of surface soil moisture in a grassland and an arable test site located within the Rur catchment in Western Germany. Surface soil moisture (0–6 cm) was measured in an approx. 50×50 m grid during 14 and 17 measurement campaigns (May 2007 to November 2008) in the two test sites. To analyse the spatio-temporal patterns of surface soil moisture, an Empirical Orthogonal Function (EOF) analysis was applied and the results were correlated with parameters derived from topography, soil, vegetation and land management to link the patterns to related factors and processes. For the grassland test site, the analysis resulted in one significant spatial structure (first EOF), which explained 57.5% of the spatial variability connected to soil properties and topography. The statistical weight of the first spatial EOF is stronger on wet days. The highest temporal variability can be found in locations with a high percentage of soil organic carbon (SOC). For the arable test site, the analysis resulted in two significant spatial structures: the first EOF, which explained 38.4% of the spatial variability, showed a highly significant correlation to soil properties, namely soil texture and soil stone content; the second EOF, which explained 28.3% of the spatial variability, is linked to differences in land management. The soil moisture in the arable test site varied more strongly during dry and wet periods at locations with low porosity. The method applied is capable of identifying the dominant parameters controlling spatio-temporal patterns of
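An EOF analysis of this kind reduces to a singular value decomposition of the space-time anomaly matrix. The sketch below uses a synthetic 14-campaign by 100-location field whose dimensions loosely echo the study design; the data are invented, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic soil-moisture field: 14 campaigns x 100 grid points, built from
# one dominant spatial pattern modulated in time, plus measurement noise.
pattern = rng.normal(size=100)                   # "true" spatial structure
amplitude = rng.normal(size=14)                  # its strength per campaign
field = np.outer(amplitude, pattern) + 0.1 * rng.normal(size=(14, 100))

anomalies = field - field.mean(axis=0)           # remove temporal mean per location
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

explained = s**2 / np.sum(s**2)                  # variance fraction per EOF
eof1 = vt[0]                                     # first spatial pattern (EOF 1)
pc1 = u[:, 0] * s[0]                             # its time-varying amplitude (PC 1)
```

The leading right singular vector plays the role of the first spatial EOF, and `explained` gives the variance fractions of the kind quoted in the abstract (e.g. 57.5% for the grassland site's first EOF); correlating `eof1` with soil and terrain covariates is then a separate step.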

  17. An empirical analysis of ERP adoption by oil and gas firms

    Science.gov (United States)

    Romero, Jorge

    2005-07-01

    Despite the growing popularity of enterprise-resource-planning (ERP) systems for the information technology infrastructure of large and medium-sized businesses, there is limited empirical evidence on the competitive benefits of ERP implementations. Case studies of individual firms provide insights but do not provide sufficient evidence to draw reliable inferences and cross-sectional studies of firms in multiple industries provide a broad-brush perspective of the performance effects associated with ERP installations. To narrow the focus to a specific competitive arena, I analyze the impact of ERP adoption on various dimensions of performance for firms in the Oil and Gas Industry. I selected the Oil and Gas Industry because several companies installed a specific type of ERP system, SAP R/3, during the period from 1990 to 2002. In fact, SAP was the dominant provider of enterprise software to oil and gas companies during this period. I evaluate performance of firms that implemented SAP R/3 relative to firms that did not adopt ERP systems in the pre-implementation, implementation and post-implementation periods. My analysis takes two different approaches, the first from a financial perspective and the second from a strategic perspective. Using the Sloan (General Motors) model commonly applied in financial statement analysis, I examine changes in performance for ERP-adopting firms versus non-adopting firms along the dimensions of asset utilization and return on sales. Asset utilization is more closely aligned with changes in leanness of operations, and return on sales is more closely aligned with customer-value-added. I test hypotheses related to the timing and magnitude of the impact of ERP implementation with respect to leanness of operations and customer value added. I find that SAP-adopting companies performed relatively better in terms of asset turnover than non-SAP-adopting companies during both the implementation and post-implementation periods and that SAP

  18. Empirical analysis of vegetation dynamics and the possibility of a catastrophic desertification transition.

    Science.gov (United States)

    Weissmann, Haim; Kent, Rafi; Michael, Yaron; Shnerb, Nadav M

    2017-01-01

    The process of desertification in the semi-arid climatic zone is considered by many as a catastrophic regime shift, since the positive feedback of vegetation density on growth rates yields a system that admits alternative steady states. Some support to this idea comes from the analysis of static patterns, where peaks of the vegetation density histogram were associated with these alternative states. Here we present a large-scale empirical study of vegetation dynamics, aimed at identifying and quantifying directly the effects of positive feedback. To do that, we have analyzed vegetation density across 2.5 × 10⁶ km² of the African Sahel region, with spatial resolution of 30 × 30 meters, using three consecutive snapshots. The results are mixed. The local vegetation density (measured at a single pixel) moves towards the average of the corresponding rainfall line, indicating a purely negative feedback. On the other hand, the chance of spatial clusters (of many "green" pixels) to expand in the next census is growing with their size, suggesting some positive feedback. We show that these apparently contradicting results emerge naturally in a model with positive feedback and strong demographic stochasticity, a model that allows for a catastrophic shift only in a certain range of parameters. Static patterns, like the double peak in the histogram of vegetation density, are shown to vary between censuses, with no apparent correlation with the actual dynamical features. Our work emphasizes the importance of dynamic response patterns as indicators of the state of the system, while the usefulness of static modality features appears to be quite limited.

  19. A comparative empirical analysis of statistical models for evaluating highway segment crash frequency

    Directory of Open Access Journals (Sweden)

    Bismark R.D.K. Agbelie

    2016-08-01

    Full Text Available The present study conducted an empirical highway segment crash frequency analysis on the basis of fixed-parameters negative binomial and random-parameters negative binomial models. Using 4 years of data from a total of 158 highway segments with a total of 11,168 crashes, the results from both models were presented, discussed, and compared. About 58% of the selected variables produced normally distributed parameters across highway segments, while the remainder produced fixed parameters. The presence of a noise barrier along a highway segment would increase mean annual crash frequency by 0.492 for 88.21% of the highway segments and decrease crash frequency for the remaining 11.79%. In addition, the number of vertical curves per mile along a segment would increase mean annual crash frequency by 0.006 for 84.13% of the highway segments and decrease crash frequency for the remaining 15.87%. Thus, constraining the parameters to be fixed across all highway segments would lead to inaccurate conclusions. Although the estimated parameters from both models showed consistency in direction, the magnitudes were significantly different. Of the two models, the random-parameters negative binomial model was found to be statistically superior in evaluating highway segment crashes. On average, the marginal effects from the fixed-parameters negative binomial model were significantly overestimated compared with those from the random-parameters negative binomial model.
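In a random-parameters model, a coefficient such as the noise-barrier effect is drawn from a distribution across segments, so a share of segments can show the opposite sign. The sketch below back-computes an illustrative standard deviation consistent with the quoted 88.21% positive share; `sigma` is our reconstruction for illustration, not an estimate reported by the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# A random-parameters specification lets a coefficient vary across segments,
# e.g. beta_i ~ Normal(mu, sigma). The share of segments with a positive
# effect is then P(beta > 0) = Phi(mu / sigma).
mu = 0.492                      # mean effect quoted in the abstract
sigma = 0.415                   # back-computed so Phi(mu/sigma) ~ 0.8821 (illustrative)

betas = rng.normal(mu, sigma, size=100_000)   # simulated segment-level effects
share_positive = np.mean(betas > 0)           # fraction of segments with beta > 0
```

The simulated share matches the abstract's 88.21% by construction; a fixed-parameters model collapses this whole distribution to the single value `mu`, which is why its marginal effects can be badly over- or understated.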

  20. Financial incentives and psychiatric services in Australia: an empirical analysis of three policy changes.

    Science.gov (United States)

    Doessel, D P; Scheurer, Roman W; Chant, David C; Whiteford, Harvey

    2007-01-01

    Australia has a national, compulsory and universal health insurance scheme, called Medicare. In 1996 the Government changed the Medicare Benefit Schedule Book in such a way as to create different financial incentives for consumers or producers of out-of-hospital private psychiatric services once an individual consumer had received 50 such services in a 12-month period. The Australian Government introduced a new Item (319) to cover some special cases that were affected by the policy change. At the same time, the Commonwealth introduced a 'fee-freeze' for all medical services. The purpose of this study is two-fold. First, it is necessary to describe the three policy interventions (the constraints on utilization, the operation of the new Item, and the general 'fee-freeze'). The new Item policy was essentially a mechanism to 'dampen' the effect of the 'constraint' policy, and these two policy changes will consequently be analysed as a single intervention. The second objective is to evaluate the policy intervention in terms of the (stated) Australian purpose of reducing utilization of psychiatric services, and thus reducing financial outlays. It is therefore important to separate out the different effects of the three policies that were introduced at much the same time in November 1996 and January 1997. The econometric results indicate that the composite policy change (constraining services and the new 319 Item) had a statistically significant effect. The analysis of the Medicare Benefit (in constant prices) indicates that the 'fee-freeze' policy also had a statistically significant effect. This enables the several policy changes to be determined separately. In fact, the empirical results indicate that the Commonwealth Government underestimated the 'savings' that would arise from the 'constraint' policy.
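Separating a level-shift policy effect from an underlying trend, as in this kind of econometric evaluation, is commonly done with segmented (interrupted time-series) regression. The monthly series below is synthetic, not the study's Medicare data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly service counts with a level drop at the intervention month.
months = np.arange(48)
post = (months >= 24).astype(float)                    # policy begins at month 24
counts = 100 + 0.5 * months - 15 * post + rng.normal(0, 2, size=48)

# OLS on [intercept, trend, step] recovers the intervention effect.
X = np.column_stack([np.ones(48), months, post])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
level_shift = coef[2]                                  # estimated policy effect
```

With the true drop set to −15, the fitted step coefficient recovers it up to noise; richer specifications add a post-intervention trend change, which is how effects of near-simultaneous policies can be teased apart.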

  1. Cloak of compassion, or evidence of elitism? An empirical analysis of white coat ceremonies.

    Science.gov (United States)

    Karnieli-Miller, Orit; Frankel, Richard M; Inui, Thomas S

    2013-01-01

    White coat ceremonies (WCCs) are widely prevalent as a celebration of matriculation in medical schools. Critics have questioned whether these ceremonies can successfully combine the themes of professionalism and humanism, as well as whether the white coat is an appropriate symbol. This study aimed to add a process of empirical assessment to the discussion of these criticisms by analysing the content and messages communicated during these ceremonies. Multiple qualitative methods were used to discern the core meanings expressed in a sample of 18 ceremonies through the analysis of artefacts, words, phrases, statements and narratives. Out of a stratified random sample of 25 US schools of medicine conducting WCCs in 2009, 18 schools submitted video, audio and written materials. All ceremonies followed the same general format, but varied in their content, messages and context. Ceremonies included five principal descriptions of what is symbolised by the white coat, including: commitment to humanistic professional care; a reminder of obligations and privileges; power; the student's need to 'grow', and the white coat as a mantle. Statements about obligations were made three times more frequently than statements about privileges. Key words or phrases in WCCs mapped to four domains: professionalism; morality; humanism, and spirituality. Spoken narratives focused on humility and generosity. The WCCs studied did not celebrate the status of an elite class, but marked the beginning of educational, personal and professional formation processes and urged matriculants to develop into doctors 'worthy of trust'. The ceremonies centred on the persons entering the vocation, who were invited to affirm its calling and obligations by donning a symbolic garb, and to join an ancient and modern tradition of healing and immersion in their community. The schools' articulated construct of the white coat situated it as a symbol of humanism. This study's findings may clarify and guide schools

  2. Agglomeration effects in the labour market: an empirical analysis for Italy

    Directory of Open Access Journals (Sweden)

    Marusca De Castris

    2013-05-01

    Full Text Available Extensive and persistent geographic variability of the unemployment rate within the same region has been attributed to various causes. Some theories identify the "thickness" of markets as the source of positive externalities affecting the labour market by improving the ability to match the skills requested by firms with those offered by workers. A recent paper by Gan and Zhang (2006) empirically confirms this hypothesis for US labour markets. Agglomeration can be defined as aggregation of people, basically measured by city size, or as aggregation of firms, measured by cluster size (employment or number of plants). However, the population location and the industrial location are by far more similar in the United States than in Europe and in Italy. Our paper aims to evaluate the effects of agglomeration on the local unemployment rate. The new methodological contribution of the study is the identification of both urban and industrial cluster agglomeration effects, using a wide set of control variables. Adjusting the system for the effects of sectorial and size shocks, as well as those relating to geographic structure and policy interventions, the results of our analysis differ from those for the United States. The study stresses the presence of negative and significant urbanisation externalities. We obtain, instead, positive effects concerning the geographic agglomeration of firms, and their thickness, in a specific area. Furthermore, positive and significant effects can be found in local systems with features of a district. Finally, the model distinguishes the negative effects of urban agglomerations (in terms of population density) from the positive effects of firms' agglomerations (in terms of density of local units).

  3. Temporal associations between weather and headache: analysis by empirical mode decomposition.

    Directory of Open Access Journals (Sweden)

    Albert C Yang

    Full Text Available BACKGROUND: Patients frequently report that weather changes trigger headache or worsen existing headache symptoms. Recently, the method of empirical mode decomposition (EMD) has been used to delineate temporal relationships in certain diseases, and we applied this technique to identify intrinsic weather components associated with headache incidence data derived from a large-scale epidemiological survey of headache in the Greater Taipei area. METHODOLOGY/PRINCIPAL FINDINGS: The study sample consisted of 52 randomly selected headache patients. The weather time-series parameters were detrended by the EMD method into a set of embedded oscillatory components, i.e. intrinsic mode functions (IMFs). Multiple linear regression models with forward stepwise methods were used to analyze the temporal associations between weather and headaches. We found no associations between the raw time series of weather variables and headache incidence. For decomposed intrinsic weather IMFs, temperature, sunshine duration, humidity, pressure, and maximal wind speed were associated with headache incidence during the cold period, whereas only maximal wind speed was associated during the warm period. In analyses examining all significant weather variables, IMFs derived from temperature and sunshine duration data accounted for up to 33.3% of the variance in headache incidence during the cold period. The association of headache incidence and weather IMFs in the cold period coincided with the cold fronts. CONCLUSIONS/SIGNIFICANCE: Using EMD analysis, we found a significant association between headache and intrinsic weather components, which was not detected by direct comparisons of raw weather data. Contributing weather parameters may vary in different geographic regions and different seasons.
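A single sifting iteration, the core operation EMD repeats to extract each IMF, can be sketched as follows. This is a didactic simplification that ignores the stopping criterion and the careful boundary handling of full EMD; the two-tone test signal is invented:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    """One sifting step of EMD: subtract the mean of the upper and lower
    cubic-spline envelopes drawn through the local extrema."""
    interior = np.arange(1, len(x) - 1)
    maxima = interior[(x[interior] > x[interior - 1]) & (x[interior] > x[interior + 1])]
    minima = interior[(x[interior] < x[interior - 1]) & (x[interior] < x[interior + 1])]
    upper = CubicSpline(t[maxima], x[maxima])(t)   # upper envelope
    lower = CubicSpline(t[minima], x[minima])(t)   # lower envelope
    return x - 0.5 * (upper + lower)               # candidate IMF

# Fast oscillation riding on a slow one, mimicking a detrending problem.
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
candidate_imf = sift_once(t, signal)
```

One sift already pulls the fast component away from the slow trend; full EMD iterates until the candidate satisfies the IMF conditions, then repeats on the residue to obtain the next, slower IMF — the decomposition the study feeds into its regressions.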

  4. Capital structure and value firm: an empirical analysis of abnormal returns

    Directory of Open Access Journals (Sweden)

    Faris Nasif AL-SHUBIRI

    2010-12-01

    Full Text Available This study investigates whether capital structure is value relevant for the equity investor. In this sense, the paper links empirical corporate finance issues with investment analysis. This study also integrates the Miller-Modigliani (MM) framework (1958) into an investment approach by estimating abnormal returns on leverage portfolios in the time series for different risk classes. For most risk classes, abnormal returns decline in firm leverage. Descriptive statistics and simple and multiple regressions are used to test the significance of the indicators. The results suggest that the negative relationship between returns and leverage could also be due to the market’s pricing of the firm’s ability to raise funds if need be. Further avenues for research in this area include examining the stock return performance of companies based on changes in the leverage of firms relative to their risk classes. It would be particularly noteworthy to examine the rate at which the information content of such changes is incorporated in the share prices of companies as well as in their long-run returns. This study encompasses all non-financial firms across the five sectors that cover all the various classes of risk. This study investigates neither the determinants of multiple capital structure choices nor changes in capital structures over time. Our main goal is to explore the effect of capital structure on cumulative abnormal returns. This study also examines a firm’s cumulative average abnormal returns by measuring leverage at the firm level and at the average level for the firm’s industry. It also examines other factors, such as size, price earnings, market-to-book and betas.

  5. Multi-sample nonparametric treatments comparison in medical ...

    African Journals Online (AJOL)

    Multi-sample nonparametric treatments comparison in medical follow-up study with unequal observation processes through simulation and bladder tumour case study. P. L. Tan, N.A. Ibrahim, M.B. Adam, J. Arasan ...

  6. Speaker Linking and Applications using Non-Parametric Hashing Methods

    Science.gov (United States)

    2016-09-08

    nonparametric estimate of a multivariate density function,” The Annals of Mathematical Statistics, vol. 36, no. 3, pp. 1049–1051, 1965. [9] E. A. Patrick...Speaker Linking and Applications using Non-Parametric Hashing Methods† Douglas Sturim and William M. Campbell MIT Lincoln Laboratory, Lexington, MA...with many approaches [1, 2]. For this paper, we focus on using i-vectors [2], but the methods apply to any embedding. For the task of speaker QBE and

  7. The money creation process: A theoretical and empirical analysis for the US

    OpenAIRE

    Levrero, Enrico Sergio; Deleidi, Matteo

    2017-01-01

    The aim of this paper is to assess – on both theoretical and empirical grounds – the two main views regarding the money creation process, namely the endogenous and exogenous money approaches. After analysing the main issues and the related empirical literature, we will apply a VAR and VECM methodology to the United States in the period 1959-2016 to assess the causal relationship between a number of critical variables that are supposed to determine the money supply, i.e., the monetary base, ban...

  8. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study.

    Science.gov (United States)

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias; Salanti, Georgia

    2018-02-28

    To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) ("living" network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons were excluded for which direct and indirect evidence disagreed, based on side, or node, splitting test (P…). Meta-analyses were performed for each selected comparison. Monitoring boundaries of statistical significance were constructed and the evidence against the null hypothesis was considered to be strong when the monitoring boundaries were crossed. A significance level was defined as α=5%, power of 90% (β=10%), and an anticipated treatment effect to detect equal to the final estimate from the network meta-analysis. The frequency of, and time to, strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis provided

  9. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study

    Science.gov (United States)

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias

    2018-01-01

    Abstract Objective To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) (“living” network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Design Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Data sources Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Eligibility criteria for study selection Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons were excluded for which direct and indirect evidence disagreed, based on side, or node, splitting test (P…). The frequency of, and time to, strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. Results 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis provided strong evidence against the null hypothesis (P=0.002). The median time to strong evidence against the null hypothesis was 19 years with living network meta-analysis and 23 years with living pairwise meta-analysis (hazard ratio 2.78, 95% confidence interval 1.00 to 7.72, P=0.05). Studies directly comparing

  10. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.
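    The Dirichlet process prior whose precision parameter DPpackage helps elicit can be illustrated by its stick-breaking construction. The sketch below is in Python purely for illustration (DPpackage itself is an R package); `dp_draw`, the truncation level and the standard normal base measure are illustrative choices, not part of the package's API:

```python
import numpy as np

def dp_draw(alpha, base_sampler, n_atoms=500, rng=None):
    """Truncated stick-breaking draw from a Dirichlet process DP(alpha, G0)."""
    rng = np.random.default_rng(rng)
    betas = rng.beta(1.0, alpha, size=n_atoms)             # stick fractions
    sticks = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    weights = betas * sticks           # w_k = beta_k * prod_{j<k}(1 - beta_j)
    atoms = base_sampler(n_atoms, rng)                     # iid draws from G0
    return atoms, weights

# one random discrete probability measure from DP(2, N(0, 1))
atoms, weights = dp_draw(2.0, lambda n, r: r.normal(0.0, 1.0, n), rng=0)
```

    Mixing such a discrete random measure with a continuous kernel yields the Dirichlet process mixture models used for the density estimation tasks listed above.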

  11. Essays in economics of energy efficiency in residential buildings - An empirical analysis [Dissertation 17157]

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, M.

    2007-07-01

    Energy efficiency in the building sector is a key element of cost-effective climate change and energy policies in most countries throughout the world. (...) However, a gap between the cost-effectiveness of energy efficiency measures, their benefits, the necessities from a societal point of view on the one hand and the actual investments in the building stock - particularly in the moment of re-investment and refurbishing - on the other hand became more and more evident. The research questions that arose against this background were whether this gap and the low energy efficiency levels and rates could be confirmed empirically and, if so, how the gap could be explained and how it could be overcome by adequate policy measures. To address these questions, the multi-functional character of buildings (i.e. well conditioned and quiet living rooms and working space) had to be considered. Associated benefits arise on the societal level (ancillary benefits) and on the private level (co-benefits), the latter being increasingly addressed by different building labels such as 'Minergie', 'Passive House', and others. It was assumed that these co-benefits are of economic relevance, but empirical evidence regarding their economic value was missing. Thus, these benefits had to be placed in an appropriate economic appraisal framework so that they could be used in market information and policy instruments, preventing uninformed and biased cost-benefit analyses and decisions at the private and societal levels. The research presented in this PhD thesis aimed to provide a sound empirical basis about costs and benefits of energy efficiency investments in residential buildings, with a special emphasis on the economic valuation of their co-benefits from a building user perspective (owner-occupiers, purchasers and tenants). In view of long time-horizons in the building sector, the techno-economic dynamics should also be addressed. The results should be useful

  12. A Systematic Analysis and Synthesis of the Empirical MOOC Literature Published in 2013-2015

    Science.gov (United States)

    Veletsianos, George; Shepherdson, Peter

    2016-01-01

    A deluge of empirical research on MOOCs became available in 2013-2015, scattered across disparate sources. This paper addresses a number of gaps in the scholarly understanding of MOOCs and presents a comprehensive picture of the literature by examining the geographic distribution, publication outlets, citations, data collection and…

  13. Libor and Swap Market Models for the Pricing of Interest Rate Derivatives : An Empirical Analysis

    NARCIS (Netherlands)

    de Jong, F.C.J.M.; Driessen, J.J.A.G.; Pelsser, A.

    2000-01-01

    In this paper we empirically analyze and compare the Libor and Swap Market Models, developed by Brace, Gatarek, and Musiela (1997) and Jamshidian (1997), using panel data on prices of US caplets and swaptions. A Libor Market Model can directly be calibrated to observed prices of caplets, whereas a

  14. An Empirical Analysis of Differences in GDP per Capita and the Role of Human Capital

    Science.gov (United States)

    Sfakianakis, George; Magoutas, Anastasios I.; Georgopoulos, Demosthenes

    2010-01-01

    Using a generalized production function approach and insights from empirical research on the determinants of growth, this paper assesses the relative importance of specific factors in explaining differences in the levels of per capita GDP. Emphasis is placed on education, physical capital accumulation, the share of the public sector in economic…

  15. The impact of category prices on store price image formation : An empirical analysis

    NARCIS (Netherlands)

    Da Silva Lourenço, C.J.; Gijsbrechts, E.; Paap, R.

    2015-01-01

    The authors empirically explore how consumers update beliefs about a store's overall expensiveness. They estimate a learning model of store price image (SPI) formation with the impact of actual prices linked to category characteristics, on a unique dataset combining store visit and purchase

  16. An Empirical Analysis of the Role of the Trading Intensity in Information Dissemination on the NYSE

    NARCIS (Netherlands)

    Spierdijk, L.

    2002-01-01

    Asymmetric information models predict comovements among trade characteristics such as returns, bid-ask spread, and trade volume on the one hand and the trading intensity on the other. In this paper we investigate empirically the two-sided causality between trade characteristics and trading

  17. Academic Staff Quality in Higher Education: An Empirical Analysis of Portuguese Public Administration Education

    Science.gov (United States)

    Sarrico, Cláudia S.; Alves, André A.

    2016-01-01

    Higher education accreditation frameworks typically consider academic staff quality a key element. This article embarks on an empirical study of what academic staff quality means, how it is measured, and how different aspects of staff quality relate to each other. It draws on the relatively nascent Portuguese experience with study programme…

  18. How Certain are Dutch Households about Future Income? An Empirical Analysis

    NARCIS (Netherlands)

    Das, J.W.M.; Donkers, A.C.D.

    1997-01-01

    The growing literature on precautionary saving clearly indicates the need for measurement of income uncertainty. In this paper we empirically analyze subjective income uncertainty in the Netherlands. Data come from the Dutch VSB panel. We measure income uncertainty directly by asking questions on

  19. Managing Human Resource Capabilities for Sustainable Competitive Advantage: An Empirical Analysis from Indian Global Organisations

    Science.gov (United States)

    Khandekar, Aradhana; Sharma, Anuradha

    2005-01-01

    Purpose: The purpose of this article is to examine the role of human resource capability (HRC) in organisational performance and sustainable competitive advantage (SCA) in Indian global organisations. Design/Methodology/Approach: To carry out the present study, an empirical research on a random sample of 300 line or human resource managers from…

  20. Repatriation Readjustment of International Managers: An Empirical Analysis of HRD Interventions

    Science.gov (United States)

    Osman-Gani, A Ahad M.; Hyder, Akmal S.

    2008-01-01

    Purpose: With increasing interest in overseas business expansion, particularly in the Asia-Pacific region, expatriate management, including repatriation readjustments, has become a critical international human resource development (HRD) issue for multinational enterprises (MNEs). This empirical study therefore aims to investigate the use of HRD…

  1. Empirical analysis of an in-car speed, headway and lane use Advisory system

    NARCIS (Netherlands)

    Schakel, W.J.; Van Arem, B.; Van Lint, J.W.C.

    2014-01-01

    For a recently developed in-car speed, headway and lane use advisory system, this paper investigates empirically advice validity (advice given in correct traffic circumstances), credibility (advice logical to drivers) and frequency. The system has been developed to optimize traffic flow by giving

  2. The value of replicating the data analysis of an empirical evaluation

    African Journals Online (AJOL)

    The aim of this research was to determine whether the results of an empirical evaluation could be confirmed using a different evaluation method. In this investigation the Qualitative Weight and Sum method used by the researchers Graf and List to evaluate several free and open source e-learning software platforms, were ...

  3. SENSITIVITY ANALYSIS IN FLEXIBLE PAVEMENT PERFORMANCE USING MECHANISTIC EMPIRICAL METHOD (CASE STUDY: CIREBON–LOSARI ROAD SEGMENT, WEST JAVA)

    Directory of Open Access Journals (Sweden)

    E. Samad

    2012-02-01

    Full Text Available The Cirebon – Losari flexible pavement, located on the North Coast of Java, Indonesia, is severely damaged by overloaded vehicles passing along the road. Improved pavement design and analysis methods are clearly needed. The effects of increased loads and of material quality can be evaluated through the Mechanistic-Empirical (M-E) method. M-E software such as KENLAYER has been developed to facilitate the transition from empirical to mechanistic design methods. From the KENLAYER analysis, it can be concluded that the effect of overloading on pavement structural performance is difficult to minimize even though the first two layers have relatively high moduli of elasticity. Overloading of 150%, 200%, and 250% has a very significant effect, reducing the pavement design life by 84%, 95%, and 98%, respectively. To increase the pavement service life, it is more effective to manage the allowable load.
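    The scale of these life reductions can be approximated with the classic fourth-power load equivalency rule, a rough rule of thumb rather than KENLAYER's layered-elastic damage analysis:

```python
# Fourth-power load equivalency rule: damage per pass scales roughly with
# (axle load)**4, so design life shrinks by the same factor. This is a rough
# rule of thumb, not KENLAYER's layered-elastic damage analysis.
def life_reduction(overload_factor, exponent=4.0):
    """Fraction of design life lost when axle loads scale by overload_factor."""
    return 1.0 - 1.0 / overload_factor ** exponent

for f in (1.5, 2.0, 2.5):
    print(f"{f:.0%} loading -> ~{life_reduction(f):.0%} of design life lost")
```

    The rule gives roughly 80%, 94% and 97% of life lost at 150%, 200% and 250% loading, broadly in line with the 84%, 95% and 98% reported from the KENLAYER runs.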

  4. Scalable Bayesian nonparametric measures for exploring pairwise dependence via Dirichlet Process Mixtures.

    Science.gov (United States)

    Filippi, Sarah; Holmes, Chris C; Nieto-Barajas, Luis E

    2016-11-16

    In this article we propose novel Bayesian nonparametric methods using Dirichlet Process Mixture (DPM) models for detecting pairwise dependence between random variables while accounting for uncertainty in the form of the underlying distributions. A key criterion is that the procedures should scale to large data sets. In this regard we find that the formal calculation of the Bayes factor for a dependent-vs.-independent DPM joint probability measure is not feasible computationally. To address this we present Bayesian diagnostic measures for characterising evidence against a "null model" of pairwise independence. In simulation studies, as well as for a real data analysis, we show that our approach provides a useful tool for the exploratory nonparametric Bayesian analysis of large multivariate data sets.

  5. Empirical mode decomposition and long-range correlation analysis of sunspot time series

    International Nuclear Information System (INIS)

    Zhou, Yu; Leung, Yee

    2010-01-01

    Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However many cycles of solar activities can be reflected by the sunspot time series. The 11-year cycle is perhaps the most famous cycle of the sunspot activity. These cycles pose problems for the investigation of the scaling behavior of sunspot time series. Using different methods to handle the 11-year cycle generally creates totally different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. In an attempt to get to the bottom of the problem in the present paper, empirical mode decomposition (EMD), a data-driven adaptive method, is applied to first extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, H≈0.72 and H≈1.49. And on removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28. The three cycle-removing methods—Fourier truncation, adaptive detrending and the
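    The monofractal special case of the MF-DFA used in these sunspot studies can be sketched briefly. The following Python DFA-1 toy (illustrative scale choices, no multifractal q-moments and no adaptive detrending) estimates the Hurst exponent H as the log-log slope of the fluctuation function:

```python
import numpy as np

def dfa_hurst(x, scales=(16, 32, 64, 128, 256)):
    """DFA-1 estimate of the Hurst exponent (monofractal case of MF-DFA)."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    flucts = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):               # non-overlapping windows
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # F(s) ~ s**H, so H is the slope in log-log coordinates
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]
```

    For uncorrelated noise H is close to 0.5; H above 0.5 indicates long-range correlation, which is why the treatment of the 11-year cycle changes the estimated exponent so drastically.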

  6. Bioactive conformational generation of small molecules: A comparative analysis between force-field and multiple empirical criteria based methods

    Directory of Open Access Journals (Sweden)

    Jiang Hualiang

    2010-11-01

    Full Text Available Abstract Background Conformational sampling for small molecules plays an essential role in the drug discovery research pipeline. Based on a multi-objective evolution algorithm (MOEA), we have developed a conformational generation method called Cyndi in a previous study. In this work, in addition to the Tripos force field in the previous version, Cyndi was updated by incorporation of the MMFF94 force field to assess the conformational energy more rationally. With two force fields against a larger dataset of 742 bioactive conformations of small ligands extracted from the PDB, a comparative analysis was performed between the pure force field based method (FFBM) and the multiple empirical criteria based method (MECBM) hybridized with different force fields. Results Our analysis reveals that incorporating multiple empirical rules can significantly improve the accuracy of conformational generation. MECBM, which takes both empirical and force field criteria as the objective functions, can reproduce about 54% (within 1 Å RMSD) of the bioactive conformations in the 742-molecule test set, much higher than the pure force field method (FFBM), at about 37%. On the other hand, MECBM achieved a more complete and efficient sampling of the conformational space because the average size of the unique conformations ensemble per molecule is about 6 times larger than that of FFBM, while the time scale for conformational generation is nearly the same as FFBM. Furthermore, as a complementary comparison study between the methods with and without empirical biases, we also tested the performance of the three conformational generation methods in MacroModel in combination with different force fields. Compared with the methods in MacroModel, MECBM is more competitive in retrieving the bioactive conformations in light of accuracy but has much lower computational cost. Conclusions By incorporating different energy terms with several empirical criteria, the MECBM method can produce more reasonable conformational

  7. Transition redshift: new constraints from parametric and nonparametric methods

    Energy Technology Data Exchange (ETDEWEB)

    Rani, Nisha; Mahajan, Shobhit; Mukherjee, Amitabha [Department of Physics and Astrophysics, University of Delhi, New Delhi 110007 (India); Jain, Deepak [Deen Dayal Upadhyaya College, University of Delhi, New Delhi 110015 (India); Pires, Nilza, E-mail: nrani@physics.du.ac.in, E-mail: djain@ddu.du.ac.in, E-mail: shobhit.mahajan@gmail.com, E-mail: amimukh@gmail.com, E-mail: npires@dfte.ufrn.br [Departamento de Física Teórica e Experimental, UFRN, Campus Universitário, Natal, RN 59072-970 (Brazil)

    2015-12-01

    In this paper, we use the cosmokinematics approach to study the accelerated expansion of the Universe. This is a model independent approach and depends only on the assumption that the Universe is homogeneous and isotropic and is described by the FRW metric. We parametrize the deceleration parameter, q(z), to constrain the transition redshift (z_t) at which the expansion of the Universe goes from a decelerating to an accelerating phase. We use three different parametrizations of q(z), namely q_I(z) = q_1 + q_2 z, q_II(z) = q_3 + q_4 ln(1 + z) and q_III(z) = 1/2 + q_5/(1 + z)^2. A joint analysis of the age of galaxies, strong lensing and supernovae Ia data indicates that the transition redshift is less than unity, i.e. z_t < 1. We also use a nonparametric approach (LOESS+SIMEX) to constrain z_t. This too gives z_t < 1, which is consistent with the value obtained by the parametric approach.
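    Setting q(z_t) = 0 gives the transition redshift in closed form for each parametrization. A small Python sketch (the function names and example coefficients are illustrative; the paper's constraints come from joint fits to age, lensing and supernova data):

```python
import math

# Solve q(z_t) = 0 for each parametrization (closed forms).
# The function names and example coefficients are illustrative only.
def zt_linear(q1, q2):        # q_I(z)  = q1 + q2*z
    return -q1 / q2

def zt_log(q3, q4):           # q_II(z) = q3 + q4*ln(1 + z)
    return math.exp(-q3 / q4) - 1.0

def zt_inv_sq(q5):            # q_III(z) = 1/2 + q5/(1 + z)**2
    return math.sqrt(-2.0 * q5) - 1.0      # needs q5 < -1/2 for z_t > 0

print(zt_linear(-0.6, 1.0))   # z_t = 0.6 for these sample coefficients
```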

  8. Nonparametric Integrated Agrometeorological Drought Monitoring: Model Development and Application

    Science.gov (United States)

    Zhang, Qiang; Li, Qin; Singh, Vijay P.; Shi, Peijun; Huang, Qingzhong; Sun, Peng

    2018-01-01

    Drought is a major natural hazard with massive impacts on society, and monitoring drought is critical for its mitigation and early warning. This study proposed a modified version of the multivariate standardized drought index (MSDI) based on precipitation, evapotranspiration, and soil moisture, i.e., the modified multivariate standardized drought index (MMSDI), using nonparametric joint probability distribution analysis. Comparisons were made between the standardized precipitation evapotranspiration index (SPEI), the standardized soil moisture index (SSMI), MSDI, MMSDI, and real-world observed drought regimes. Results indicated that MMSDI detected droughts that SPEI and/or SSMI failed to detect, and MMSDI detected almost all droughts that were identified by SPEI and SSMI. Further, droughts detected by MMSDI were similar to real-world observed droughts in terms of drought intensity and drought-affected area. Compared to MMSDI, MSDI has the potential to overestimate drought intensity and drought-affected area across China, which should be attributed to the exclusion of the evapotranspiration component from the estimation of drought intensity. Therefore, MMSDI is proposed for the monitoring of agrometeorological droughts. The results of this study provide a framework for integrated drought monitoring in other regions of the world and can help to develop drought mitigation.
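    The nonparametric building block behind such standardized indices can be sketched as follows: empirical (Gringorten) plotting positions are mapped through the inverse standard normal CDF. This is a generic illustration of the recipe behind SPEI/SSMI-type indices, not the MMSDI implementation, which couples several variables through a joint probability distribution:

```python
import numpy as np
from statistics import NormalDist

def standardized_index(values):
    """Standardize a series nonparametrically: empirical (Gringorten)
    plotting positions mapped through the inverse standard normal CDF."""
    values = np.asarray(values, dtype=float)
    n = values.size
    ranks = values.argsort().argsort() + 1        # ranks 1..n (ties ignored)
    p = (ranks - 0.44) / (n + 0.12)               # Gringorten positions
    return np.array([NormalDist().inv_cdf(pi) for pi in p])
```

    The resulting index is dimensionless and centred on zero, so strongly negative values flag drought regardless of the variable's native units or distribution.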

  9. Bayesian nonparametric clustering in phylogenetics: modeling antigenic evolution in influenza.

    Science.gov (United States)

    Cybis, Gabriela B; Sinsheimer, Janet S; Bedford, Trevor; Rambaut, Andrew; Lemey, Philippe; Suchard, Marc A

    2018-01-30

    Influenza is responsible for up to 500,000 deaths every year, and antigenic variability represents much of its epidemiological burden. To visualize antigenic differences across many viral strains, antigenic cartography methods use multidimensional scaling on binding assay data to map influenza antigenicity onto a low-dimensional space. Analysis of such assay data ideally leads to natural clustering of influenza strains of similar antigenicity that correlate with sequence evolution. To understand the dynamics of these antigenic groups, we present a framework that jointly models genetic and antigenic evolution by combining multidimensional scaling of binding assay data, Bayesian phylogenetic machinery and nonparametric clustering methods. We propose a phylogenetic Chinese restaurant process that extends the current process to incorporate the phylogenetic dependency structure between strains in the modeling of antigenic clusters. With this method, we are able to use the genetic information to better understand the evolution of antigenicity throughout epidemics, as shown in applications of this model to H1N1 influenza. Copyright © 2017 John Wiley & Sons, Ltd.

  10. [Rationalization and rationing at the bedside. A normative and empirical status quo analysis].

    Science.gov (United States)

    Strech, D

    2014-02-01

    The topic of bedside rationing is increasingly discussed in Germany. Further clarification is needed on the question of how bedside rationing (e.g., in the area of overcare) can be justified despite coexistent inefficiencies. This paper outlines and analyses the relationship between waste avoidance and rationing from an ethical perspective. Empirical findings regarding the status quo of bedside rationing and rationalization are presented. These normative and empirical explorations are then further specified with regard to opportunities for future physician-driven activities to tackle overuse. The self-government partners in Germany should communicate more explicitly, within their communities and to the public, how and with which benchmarks they aim to reduce inefficient health care (overuse) in an appropriate manner. Physician-driven activities such as the "Choosing Wisely®" initiative in the USA could provide a first step towards raising awareness of overuse among physicians as well as in the public.

  11. Space evolution model and empirical analysis of an urban public transport network

    Science.gov (United States)

    Sui, Yi; Shao, Feng-jing; Sun, Ren-cheng; Li, Shu-jing

    2012-07-01

    This study explores the space evolution of an urban public transport network, using empirical evidence and a simulation model validated on those data. Public transport patterns primarily depend on the spatial distribution of traffic, the demands of passengers and the expected utility of investors. Evolution is an iterative process of satisfying the needs of passengers and investors given a traffic spatial distribution. The temporal change of the urban public transport network is evaluated using both topological and spatial measures. The simulation model is validated using empirical data from nine big cities in China. Statistical analyses of topological and spatial attributes suggest that an evolved network whose traffic demands follow a power-law distribution arranged in concentric circles matches these nine cities well.

  12. Ambiguity and Investment Decisions: An Empirical Analysis on Mutual Fund Investor Behaviour

    Directory of Open Access Journals (Sweden)

    Chao Tang

    2017-09-01

    Full Text Available The paper empirically studies the relationship between ambiguity and mutual fund investor behaviour. Theoretical models of investment decisions incorporating ambiguity motivate our analyses. While the models indicate that investors would be less likely to invest in financial markets when ambiguity increases, there is little empirical evidence in naturally occurring financial data with which to examine this hypothesis. In this paper, we test the hypothesis using equity fund flow data as a measure of investment decisions, and measuring ambiguity by the degree of disagreement in equity analysts’ predictions about asset returns. Our results support the hypothesis that increases in ambiguity lead to lower fund flows, and this result remains consistent when adding various control variables affecting fund flows. Besides, we find heterogeneous impacts of ambiguity: equity funds with high yield targets and an active management style are affected more than funds investing in stable stocks, and funds with a larger proportion of institutional investors are more sensitive to ambiguity.

  13. Analysis of consumption behaviour concerning current income and lags consumption: Empirical evidence from Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul Qayyum Khan

    2014-10-01

    Full Text Available As in other economies, consumption expenditure is the largest component of the Gross Domestic Product (GDP) of Pakistan's economy. The figure has been estimated at around 80 percent of GDP and demonstrates that, historically, Pakistan's economic growth is characterized as consumption-led growth. The present paper explores the relationship between income and consumption using annual time series data for the period 1975 to 2012 in Pakistan. For the empirical investigation, a linear regression model estimated by the method of Least Squares is used as the analytical technique. Empirical results support the existence of a significant positive relationship between income and consumption. The finding suggests that long-term committed planning is indispensable to enhance the productive capacity of the economy, expand employment opportunities and reduce poverty levels more effectively.

  14. Endangering of Businesses by the German Inheritance Tax? – An Empirical Analysis

    OpenAIRE

    Houben, Henriette; Maiterth, Ralf

    2011-01-01

    This contribution addresses the substantial tax privilege for businesses introduced by the German Inheritance Tax Act 2009. Advocates of the vast or even entire tax exemption for businesses stress the potential damage of the inheritance tax on businesses, as those often lack liquidity to meet tax liability. This submission tackles this issue empirically based on data of the German Inheritance Tax Statistics and the SOEP. The results indicate that former German inheritance tax law has not enda...

  15. An empirical analysis of gasoline demand in Denmark using cointegration techniques

    International Nuclear Information System (INIS)

    Bentzen, Jan

    1994-01-01

    Danish time-series data covering the period 1948-91 are used in order to estimate short-run and long-run elasticities in gasoline demand. A cointegration test for a stable long-run relationship between the variables in the model proves to be positive, showing a smaller value of the long-run price elasticity than often quoted in empirical studies of gasoline demand. Finally, an error correction model is estimated. (author)
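    The two-step (Engle-Granger) cointegration-plus-error-correction procedure used in such studies can be sketched as follows. The series and coefficients below are simulated stand-ins, not the Danish gasoline data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulate a cointegrated pair: x is a random walk (e.g. log price),
# y follows x in the long run (e.g. log demand) plus stationary noise.
x = np.cumsum(rng.normal(size=n))
y = 2.0 + 0.5 * x + rng.normal(scale=0.5, size=n)

# Step 1: long-run (levels) regression y = a + b*x, keep the residuals.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Step 2: error-correction model on first differences,
# dy_t = c + d*dx_t + gamma*resid_{t-1} + e_t; gamma < 0 means y
# adjusts back toward the long-run relationship after a shock.
dy, dx = np.diff(y), np.diff(x)
Z = np.column_stack([np.ones(n - 1), dx, resid[:-1]])
c, d, gamma = np.linalg.lstsq(Z, dy, rcond=None)[0]
print(round(beta[1], 2), gamma < 0)
```

Here `beta[1]` plays the role of the long-run elasticity and `gamma` the speed of adjustment; in a real application step 1 would be preceded by unit-root tests and followed by a cointegration test on the residuals.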

  16. An Empirical Analysis of the Interrelationship between Motivation and Stress in the Computing Industry

    OpenAIRE

    Ó Cuirrín, Maitiú

    2007-01-01

    Although a great body of literature exists on the concepts of motivation and stress, no such study has examined the interrelationship between them. The objectives of this thesis are thus, to investigate the factors that motivate/demotivate and cause stress among recently employed computing graduates, as well as examining the implications of these factors both individually and interdependently for both the computing graduate and their employing organisation. An empirical quantitative appro...

  17. Inheritance tax-exempt transfer of German businesses: Imperative or unjustified subsidy? An empirical analysis

    OpenAIRE

    Houben, Henriette; Maiterth, Ralf

    2009-01-01

    This contribution addresses the substantial tax subsidies for businesses introduced by the German Inheritance Tax Act 2009. Advocates in favour of the vast or even entire tax exemption for businesses stress the potential damage of the inheritance tax on businesses, as those often lack liquid assets to meet tax liability. This submission tackles this issue empirically based on data of the German Inheritance Tax Statistics and the SOEP. The results indicate that former German inheritance tax la...

  18. Banking Fragility in Colombia: An Empirical Analysis Based on Balance Sheets

    OpenAIRE

    Ignacio Lozano; Alexander Guarín

    2014-01-01

    In this paper, we study the empirical relationship between credit funding sources and the financial vulnerability of the Colombian banking system. We propose a statistical model to measure and predict banking-fragility episodes associated with credit funding sources classified into retail deposits and wholesale funds. We compute the probability of financial fragility for both the aggregated banking system and the individual banks. Our approach performs a Bayesian averaging of estimated logit ...

  19. Empirical analysis of the constituent factors of internal marketing orientation at Spanish hotels

    OpenAIRE

    Robledo, José Luis Ruizalba; Arán, María Vallespín

    2014-01-01

    This study, which draws upon specialized literature and empirical evidence, aims to characterize the underlying structure of the construct ‘Internal Marketing Orientation’ (IMO). As a consequence six underlying factors are identified: exchange of values, segmentation of the internal market, internal communication, management concern, implementation of management concern and training. It also evaluates the degree of Spanish hotels’ IMO, classifying them into three different groups according to...

  20. THE PROVINCIALISM OF GLOBAL BRANDS AN EMPIRICAL ANALYSIS OF BRAND EQUITY DIFFERENCES IN MEXICO AND GERMANY

    OpenAIRE

    Thomas Cleff; Lena Fischer; Nadine Walter

    2010-01-01

    The term “global brand” has become widely used by the media and by consumers. Although media and consumers call these brands “global” and centralized marketing departments manage these brands globally – are these “global brands” really global? Can we talk about truly global brand equity? And if there were brand image differences between countries, which factors would cause them? The authors conducted empirical research during May and June 2009 with similarly aged university students in Germany (...

  1. Exchange rate policy and external debt in emerging economies: an empirical analysis

    OpenAIRE

    Cebir, Bilgen

    2012-01-01

    In this thesis, we empirically analyze the effects of exchange rate policy on external debt accumulation in emerging market economies with a sample of 15 countries over the period 1998-2010. The exchange rate policy is captured by the de facto exchange rate classification of Ilzetzki, Reinhart, and Rogoff (2008). This classification is based on the actual exchange rate behavior rather than the officially declared regimes. Therefore, it is expected to better reflect the exchange rate policies act...

  2. Micro-Credit and Rural Poverty: An Analysis of Empirical Evidence

    OpenAIRE

    Chavan, P.; Ramakumar, R.

    2003-01-01

    This paper reviews empirical evidence on NGO-led micro-credit programmes in several developing countries, and compares them with state-led poverty alleviation schemes in India. It shows that micro-credit programmes have brought about a marginal improvement in the beneficiaries' income, though technological improvements are lacking due to its emphasis on ‘survival skills'. Also, in Bangladesh the practice of repayment of Grameen Bank loans by making fresh loans from moneylenders has resulted ...

  3. An Empirical Analysis of Socio-Demographic Stratification in Sweetened Carbonated Soft-Drink Purchasing

    OpenAIRE

    Rhodes, Charles

    2012-01-01

    Caloric soft drinks are the number one source of added sugars in U.S. diets, and are associated with many health problems. Three recent years of household purchase, household demographic, and industry advertising data allow Heckit estimation to identify how specific demographic groups vary in their purchase response to marketing of sweetened carbonated soft drinks (sCSDs) at the product category level. Empirical results reveal unique non-linear patterns of household purchase response to sCSD-...

  4. Strategic Management Tools and Techniques: A Comparative Analysis of Empirical Studies

    Directory of Open Access Journals (Sweden)

    Albana Berisha Qehaja

    2017-01-01

    Full Text Available There is no doubt that strategic management tools and techniques are important parts of the strategic management process. Their use in organizations should be observed in a practice-based context. This paper analyzes the empirical studies on the usage of strategic management tools and techniques. Hence, the main aim of this study is to investigate and analyze which enterprises, according to their country development level, use more strategic management tools and techniques and which of these are used the most. Also, this paper investigates which strategic management tools and techniques are used globally according to the results of empirical studies. The study presents a summary of empirical studies for the period 1990–2015. The research results indicate that more strategic tools and techniques are used in developed countries, followed by developing countries and fewest in countries in transition. This study is likely to contribute to the field of strategic management because it summarizes the most used strategic tools and techniques at the global level according to varying stages of countries’ economic development. Also, the findings from this study may be utilized to maximize the full potential of enterprises and reduce the cases of entrepreneurship failures, through creating awareness of the importance of using strategic management tools and techniques.

  5. Corporate Social Responsibility Applied for Rural Development: An Empirical Analysis of Firms from the American Continent

    Directory of Open Access Journals (Sweden)

    Miguel Arato

    2016-01-01

    Full Text Available Corporate Social Responsibility has been recognized by policymakers and development specialists as a feasible driver for rural development. The present paper explores both theoretically and empirically how firms involved in CSR provide development opportunities to rural communities. The research first evaluates the applied literature on the implementation of CSR by private firms and policymakers as means to foster sustainable rural development. The empirical research analyses the CSR activities of 100 firms from a variety of industries, sizes, and countries to determine the type of companies who are involved in rural development and the kind of activities they deployed. Results from the empirical research show that although rural development initiatives are not relevant for all types of companies, a significant number of firms from a variety of industries have engaged in CSR programs supporting rural communities. Firms appear to be interested in stimulating rural development and seem to benefit from it. This paper also includes an exploration of the main challenges and constraints that firms encounter when encouraging rural development initiatives.

  6. Investigating complex patterns of blocked intestinal artery blood pressure signals by empirical mode decomposition and linguistic analysis

    International Nuclear Information System (INIS)

    Yeh, J-R; Lin, T-Y; Shieh, J-S; Chen, Y; Huang, N E; Wu, Z; Peng, C-K

    2008-01-01

    In this investigation, surgical occlusions of the intestinal artery were performed on pigs to simulate acute mesenteric arterial occlusion. The empirical mode decomposition method and a linguistic analysis algorithm were applied to analyze the blood pressure signals recorded in this simulated situation. We assumed that some information is hidden in the high-frequency part of the blood pressure signal when an intestinal artery is blocked. The empirical mode decomposition (EMD) method decomposes a complex time series into intrinsic mode functions (IMFs). However, end effects and the phenomenon of intermittence damage the consistency of each IMF. We therefore proposed the complementary ensemble empirical mode decomposition (CEEMD) method to solve the problems of end effects and intermittence. The main wave of the blood pressure signal can be reconstructed from the main components, identified by Monte Carlo verification, and removed from the original signal to derive a riding wave. Furthermore, the concept of linguistic analysis was applied to design a blocking index that characterizes the pattern of the riding wave using measures of dissimilarity. The blocking index reliably identifies the condition under which a sampled blood pressure time series was recorded. Here, these two quite different algorithms are successfully integrated, and the existence of information hidden in the high-frequency part of the blood pressure signal has been demonstrated

  7. Investigating complex patterns of blocked intestinal artery blood pressure signals by empirical mode decomposition and linguistic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, J-R; Lin, T-Y; Shieh, J-S [Department of Mechanical Engineering, Yuan Ze University, 135 Far-East Road, Chung-Li, Taoyuan, Taiwan (China); Chen, Y [Far Eastern Memorial Hospital, Taiwan (China); Huang, N E [Research Center for Adaptive Data Analysis, National Central University, Taiwan (China); Wu, Z [Center for Ocean-Land-Atmosphere Studies (United States); Peng, C-K [Beth Israel Deaconess Medical Center, Harvard Medical School (United States)], E-mail: s939205@ mail.yzu.edu.tw

    2008-02-15

    In this investigation, surgical occlusions of the intestinal artery were performed on pigs to simulate acute mesenteric arterial occlusion. The empirical mode decomposition method and a linguistic analysis algorithm were applied to analyze the blood pressure signals recorded in this simulated situation. We assumed that some information is hidden in the high-frequency part of the blood pressure signal when an intestinal artery is blocked. The empirical mode decomposition (EMD) method decomposes a complex time series into intrinsic mode functions (IMFs). However, end effects and the phenomenon of intermittence damage the consistency of each IMF. We therefore proposed the complementary ensemble empirical mode decomposition (CEEMD) method to solve the problems of end effects and intermittence. The main wave of the blood pressure signal can be reconstructed from the main components, identified by Monte Carlo verification, and removed from the original signal to derive a riding wave. Furthermore, the concept of linguistic analysis was applied to design a blocking index that characterizes the pattern of the riding wave using measures of dissimilarity. The blocking index reliably identifies the condition under which a sampled blood pressure time series was recorded. Here, these two quite different algorithms are successfully integrated, and the existence of information hidden in the high-frequency part of the blood pressure signal has been demonstrated.
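    The "complementary" idea in CEEMD is to decompose the signal twice, once with added noise +w and once with -w, then average the two results so the injected noise cancels. For any linear operation the cancellation is exact, which the following numpy sketch illustrates with a moving average standing in for the (nonlinear) sifting step, where the cancellation is only approximate:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 5 * t)

def smooth(x, k=11):
    # Stand-in linear "decomposition" step (moving average); real EMD
    # sifting is nonlinear, so there the cancellation is approximate.
    return np.convolve(x, np.ones(k) / k, mode="same")

w = 0.3 * rng.normal(size=t.size)          # auxiliary white noise
pair_mean = 0.5 * (smooth(signal + w) + smooth(signal - w))

# For a linear operator the +w/-w contributions cancel exactly.
print(np.allclose(pair_mean, smooth(signal)))  # True
```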

  8. Consistency and Variability in Talk about "Diversity": An Empirical Analysis of Discursive Scope in Swiss Large Scale Enterprises

    Directory of Open Access Journals (Sweden)

    Anja Ostendorp

    2009-02-01

    Full Text Available Traditionally discussions of "diversity" in organizations either refer to an ideal "management" of a diverse workforce or to specific concerns of minorities. The term diversity, however, entails a growing number of translations. Highlighting this diversity of diversity, the concept cannot be merely conceived of as either social-normative or economic-functional. Therefore, the present study empirically scrutinizes the current scope of diversity-talk in Swiss large scale enterprises from a discursive psychological perspective. First, it provides five so-called interpretative repertoires which focus on: image, market, minorities, themes, and difference. Second, it discusses why and how persons oscillate between consistency and variability whenever they draw upon these different repertoires. Finally, it points out possibilities to combine them. This empirical approach to diversity in organizations offers new aspects to the current debate on diversity and introduces crucial concepts of a discursive psychological analysis. URN: urn:nbn:de:0114-fqs090218

  9. Inglorious Empire

    DEFF Research Database (Denmark)

    Khair, Tabish

    2017-01-01

    Review of 'Inglorious Empire: What the British did to India' by Shashi Tharoor, London, Hurst Publishers, 2017, 296 pp., £20.00

  10. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

    Full Text Available Market impact cost is the most significant portion of implicit transaction costs, yet it cannot be measured directly. In this paper, we employ state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data for the US stock market from the Bloomberg Terminal and generated three independent input variables. Most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, on four error measures. Although these models have difficulty separating the permanent and temporary cost components directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.

  11. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Science.gov (United States)

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

    Market impact cost is the most significant portion of implicit transaction costs, yet it cannot be measured directly. In this paper, we employ state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data for the US stock market from the Bloomberg Terminal and generated three independent input variables. Most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, on four error measures. Although these models have difficulty separating the permanent and temporary cost components directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.
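    As a minimal illustration of nonparametric regression of the kind benchmarked above (though not any of the paper's exact models), the following sketch fits a Nadaraya-Watson kernel estimator to simulated features; the variable names are hypothetical stand-ins for impact-cost inputs:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-ins for impact-cost inputs (hypothetical): relative order
# size, volatility and spread; target is an impact cost in bps.
X = rng.uniform(0, 1, size=(300, 3))
y = 10 * np.sqrt(X[:, 0]) + 2 * X[:, 1] + rng.normal(scale=0.5, size=300)

def nw_predict(X_train, y_train, x_query, h=0.15):
    # Nadaraya-Watson estimator with a Gaussian kernel: a weighted
    # average of training targets, weights decaying with distance.
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * h ** 2))
    return np.sum(w * y_train) / np.sum(w)

x0 = np.array([0.25, 0.5, 0.5])
pred = nw_predict(X, y, x0)
true = 10 * np.sqrt(0.25) + 2 * 0.5  # noiseless target at x0: 6.0
print(round(pred, 1))
```

No functional form is assumed for the regression surface; the bandwidth `h` plays the role that architecture and hyperparameters play in the paper's models.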

  12. A nonparametric spatial scan statistic for continuous data.

    Science.gov (United States)

    Jung, Inkyung; Cho, Ho Jin

    2015-10-20

    Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used; however, its performance has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compare its performance with that of parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases considered in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
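    A minimal sketch of a Wilcoxon-based scan over circular windows on simulated spatial data (details of the published statistic, such as the window family and permutation-based inference, are omitted):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy spatial data: 200 sites on the unit square with a continuous
# outcome; sites inside a circle around (0.3, 0.3) have elevated values.
pts = rng.uniform(0, 1, size=(200, 2))
inside_true = np.linalg.norm(pts - [0.3, 0.3], axis=1) < 0.2
values = rng.normal(size=200) + 2.0 * inside_true

def rank_sum_z(values, in_window):
    # Standardized Wilcoxon rank-sum statistic for sites inside the
    # window versus the rest (normal approximation, no tie correction).
    n, n1 = len(values), int(in_window.sum())
    ranks = np.argsort(np.argsort(values)) + 1
    w = ranks[in_window].sum()
    mean = n1 * (n + 1) / 2
    var = n1 * (n - n1) * (n + 1) / 12
    return (w - mean) / np.sqrt(var)

# Scan circular windows centred at each site; keep the largest statistic.
best = max(
    (rank_sum_z(values, np.linalg.norm(pts - c, axis=1) < 0.2), tuple(c))
    for c in pts
)
print(best[1])  # centre of the most unusual window
```

Because only ranks enter the statistic, the same scan works unchanged for skewed or heavy-tailed outcome distributions.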

  13. An empirical analysis of risk-taking in car driving and other aspects of life

    DEFF Research Database (Denmark)

    Abay, Kibrom Araya; Mannering, Fred

    2016-01-01

    The link between risk-taking behavior in various aspects of life has long been an area of debate among economists and psychologists. Using an extensive data set from Denmark, this study provides an empirical investigation of the link between risky driving and risk taking in other aspects of life...... results in this study suggest that risk-taking behavior in various aspects of life can be associated, and our results corroborate previous evidence on the link between individuals’ risk preferences across various aspects of life. This implies that individuals’ driving behavior, which is commonly...

  14. An Empirical Analysis of Romania’s Comovement with the Euro Zone

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2009-11-01

    Full Text Available In light of adopting the euro in the near future, it is important to assess to what extent the Romanian business cycle evolves in a fashion similar to that of the euro zone. The present study is an empirical investigation of the degree of business cycle synchronization between Romania and the euro area, based on macroeconomic series that capture the cyclical features of the two economies. The results indicate that the most recent period, characterized by major economic and financial turmoil, has led to an increase in the degree of comovement between the Romanian economy and that of the euro area.

  15. Estimation of the lifetime distribution of mechatronic systems in the presence of a covariate: A comparison among parametric, semiparametric and nonparametric models

    International Nuclear Information System (INIS)

    Bobrowski, Sebastian; Chen, Hong; Döring, Maik; Jensen, Uwe; Schinköthe, Wolfgang

    2015-01-01

    In practice manufacturers may have lots of failure data of similar products using the same technology basis under different operating conditions. Thus, one can try to derive predictions for the distribution of the lifetime of newly developed components or new application environments through the existing data using regression models based on covariates. Three categories of such regression models are considered: a parametric, a semiparametric and a nonparametric approach. First, we assume that the lifetime is Weibull distributed, where its parameters are modelled as linear functions of the covariate. Second, the Cox proportional hazards model, well-known in Survival Analysis, is applied. Finally, a kernel estimator is used to interpolate between empirical distribution functions. In particular the last case is new in the context of reliability analysis. We propose a goodness of fit measure (GoF), which can be applied to all three types of regression models. Using this GoF measure we discuss a new model selection procedure. To illustrate this method of reliability prediction, the three classes of regression models are applied to real test data of motor experiments. Further the performance of the approaches is investigated by Monte Carlo simulations. - Highlights: • We estimate the lifetime distribution in the presence of a covariate. • Three types of regression models are considered and compared. • A new nonparametric estimator based on our particular data structure is introduced. • We propose a goodness of fit measure and show a new model selection procedure. • A case study with real data and Monte Carlo simulations are performed
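    The parametric and nonparametric routes can be contrasted in a few lines: fit a Weibull model by maximum likelihood and compare it against the empirical distribution function with a simple Kolmogorov-Smirnov-type distance, standing in for the paper's own goodness-of-fit measure (simulated lifetimes, not the motor test data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated lifetimes from a Weibull(shape=1.5, scale=100) law, e.g.
# hours to failure under one operating condition.
lifetimes = stats.weibull_min.rvs(1.5, scale=100.0, size=400,
                                  random_state=rng)

# Parametric route: maximum-likelihood Weibull fit (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)

# Nonparametric route: the empirical distribution function.
x = np.sort(lifetimes)
ecdf = np.arange(1, len(x) + 1) / len(x)

# KS-type distance between the two estimates of the lifetime CDF.
gof = np.max(np.abs(ecdf - stats.weibull_min.cdf(x, shape, loc, scale)))
print(gof < 0.1)
```

The semiparametric Cox model and the kernel interpolation between empirical distribution functions discussed in the paper sit between these two extremes.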

  16. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented here by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function, which identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic for visualizing the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases, on particularly difficult test densities that include discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate that the method has general applicability for high throughput statistical inference.
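    The single-order-statistics ingredient rests on a standard fact: under the true CDF F, the probability-integral transform of the k-th order statistic from a sample of size n follows a Beta(k, n+1-k) law, with mean k/(n+1). A quick Monte Carlo check for uniform data:

```python
import numpy as np

rng = np.random.default_rng(5)

# If F is the true CDF, F(X_(k)) ~ Beta(k, n+1-k), so its mean is
# k/(n+1); for uniform data F is the identity, which makes the check easy.
n, k, reps = 10, 3, 200_000
samples = np.sort(rng.uniform(size=(reps, n)), axis=1)
kth = samples[:, k - 1]          # k-th order statistic of each sample

print(round(kth.mean(), 3), round(k / (n + 1), 3))
```

A scoring function built on this fact flags trial CDFs whose transformed order statistics fluctuate atypically, which is what drives the iterative refinement described above.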

  17. Ancient DNA Analysis Suggests Negligible Impact of the Wari Empire Expansion in Peru's Central Coast during the Middle Horizon.

    Science.gov (United States)

    Valverde, Guido; Barreto Romero, María Inés; Flores Espinoza, Isabel; Cooper, Alan; Fehren-Schmitz, Lars; Llamas, Bastien; Haak, Wolfgang

    2016-01-01

    The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650-1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, which likely played an important role in shaping the region's demographic and cultural profiles. We analyzed individuals from three successive pre-Columbian cultures present at the Huaca Pucllana archaeological site in Lima, Peru: Lima (Early Intermediate Period, 500-700 AD), Wari (Middle Horizon, 800-1000 AD) and Ychsma (Late Intermediate Period, 1000-1450 AD). We sequenced 34 complete mitochondrial genomes to investigate the potential genetic impact of the Wari Empire in the Central Coast of Peru. The results indicate that genetic diversity shifted only slightly through time, ruling out a complete population discontinuity or replacement driven by the Wari imperialist hegemony, at least in the region around present-day Lima. However, we caution that the very subtle genetic contribution of Wari imperialism at the particular Huaca Pucllana archaeological site might not be representative for the entire Wari territory in the Peruvian Central Coast.

  18. Ancient DNA Analysis Suggests Negligible Impact of the Wari Empire Expansion in Peru's Central Coast during the Middle Horizon.

    Directory of Open Access Journals (Sweden)

    Guido Valverde

    Full Text Available The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650-1100 AD represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, which likely played an important role in shaping the region's demographic and cultural profiles. We analyzed individuals from three successive pre-Columbian cultures present at the Huaca Pucllana archaeological site in Lima, Peru: Lima (Early Intermediate Period, 500-700 AD, Wari (Middle Horizon, 800-1000 AD and Ychsma (Late Intermediate Period, 1000-1450 AD. We sequenced 34 complete mitochondrial genomes to investigate the potential genetic impact of the Wari Empire in the Central Coast of Peru. The results indicate that genetic diversity shifted only slightly through time, ruling out a complete population discontinuity or replacement driven by the Wari imperialist hegemony, at least in the region around present-day Lima. However, we caution that the very subtle genetic contribution of Wari imperialism at the particular Huaca Pucllana archaeological site might not be representative for the entire Wari territory in the Peruvian Central Coast.

  19. Nonparametric regression using the concept of minimum energy

    International Nuclear Information System (INIS)

    Williams, Mike

    2011-01-01

    It has recently been shown that an unbinned distance-based statistic, the energy, can be used to construct an extremely powerful nonparametric multivariate two sample goodness-of-fit test. An extension to this method that makes it possible to perform nonparametric regression using multiple multivariate data sets is presented in this paper. The technique, which is based on the concept of minimizing the energy of the system, permits determination of parameters of interest without the need for parametric expressions of the parent distributions of the data sets. The application and performance of this new method is discussed in the context of some simple example analyses.
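    A toy version of energy-minimizing regression: with a one-parameter location model, scanning the parameter and minimizing the two-sample energy statistic recovers the data's location without any parametric likelihood (one-dimensional and illustrative only; the method targets multivariate settings):

```python
import numpy as np

rng = np.random.default_rng(6)

def energy(x, y):
    # Two-sample energy statistic (1-D): zero in expectation exactly
    # when the two samples come from the same distribution.
    a = np.abs(x[:, None] - y[None, :]).mean()
    b = np.abs(x[:, None] - x[None, :]).mean()
    c = np.abs(y[:, None] - y[None, :]).mean()
    return 2 * a - b - c

# "Data" with unknown location 0.7; "model" samples share one base draw
# (common random numbers) so the energy curve over theta is smooth.
data = 0.7 + rng.normal(size=400)
base = rng.normal(size=400)

thetas = np.linspace(-1, 2, 61)
best_theta = min(thetas, key=lambda th: energy(data, th + base))
print(best_theta)
```

In a real analysis the grid search would be replaced by a proper minimizer and the model samples by draws from the candidate parent distribution.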

  20. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
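    The borrowing-strength step can be caricatured as a normal-normal precision-weighted update, where the prior comes from the other scans. The numbers below are made up for illustration; the paper's actual updating accommodates differing marker maps and heterogeneous QTL effects:

```python
import numpy as np

# Local linkage statistic from our scan, with its standard error, and
# statistics for the same region from three other scans (hypothetical).
z_local, se_local = 2.1, 0.8
z_others = np.array([1.5, 2.4, 1.8])
prior_mean = z_others.mean()
prior_var = z_others.var(ddof=1)

# Posterior (normal-normal) mean: precision-weighted average of the
# local estimate and the prior built from the other scans.
w = (1 / se_local**2) / (1 / se_local**2 + 1 / prior_var)
z_updated = w * z_local + (1 - w) * prior_mean
print(round(z_updated, 2))  # 1.95
```

Because the other scans here agree closely (small prior variance), the update pulls the local statistic strongly toward their mean; heterogeneous scans would yield a large prior variance and little shrinkage.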

  1. Factors of economic growth in Palestine: an empirical analysis during the period (1994-2013)

    Directory of Open Access Journals (Sweden)

    Omar Mahmoud Abu-Eideh

    2014-07-01

    Full Text Available This study aimed to analyze the impact of the size of the domestic working labour force, real gross domestic capital formation, real domestic exports and imports of goods and services, and political instability on real gross domestic product (RGDP) in Palestine during the period 1994-2013. To examine the empirical relationship between these explanatory variables and real GDP growth, the study adopted a standardized Cobb-Douglas production function, using the annual official data of the Palestinian Central Bureau of Statistics (PCBS) and applying the Ordinary Least Squares (OLS) method and second-order autocorrelation techniques. The empirical results of the model indicated a positive relationship between real gross domestic product (RGDP) and the size of the domestic working labour force, real gross domestic capital formation and real domestic exports, and a negative relationship between real GDP growth and both real domestic imports of goods and services and political instability. The study offered several recommendations that could boost the level of growth, the most important of which is the urgent need for more investment in the economy, as investment leads to greater formation of domestic capital, which in turn contributes to economic growth in many ways.
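    The estimation step described above, a log-linearised Cobb-Douglas production function fitted by OLS, can be sketched on simulated data. The series below merely stand in for the PCBS data, and the parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20  # annual observations, as in a 1994-2013 sample

# Simulated series standing in for the PCBS data (illustrative only).
labour = rng.uniform(200.0, 400.0, n)     # working labour force
capital = rng.uniform(500.0, 1500.0, n)   # real gross capital formation
# Cobb-Douglas data-generating process: RGDP = A * L^0.6 * K^0.3 * noise
rgdp = 2.0 * labour**0.6 * capital**0.3 * np.exp(rng.normal(0.0, 0.02, n))

# Log-linearise, ln(RGDP) = ln(A) + a*ln(L) + b*ln(K), then fit by OLS.
X = np.column_stack([np.ones(n), np.log(labour), np.log(capital)])
coef, *_ = np.linalg.lstsq(X, np.log(rgdp), rcond=None)
ln_A, a, b = coef
print(f"estimated elasticities: labour={a:.2f}, capital={b:.2f}")
```

The recovered coefficients are the output elasticities of labour and capital; with low noise they land close to the true 0.6 and 0.3 used to generate the data.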

  2. An empirical analysis of price expectations formation: Evidence from the crude oil reserves acquisitions market

    International Nuclear Information System (INIS)

    Vielhaber, L.M.

    1991-01-01

    Reasons for the recent scant empirical attention to price expectations theory are twofold. First, except for futures markets and the occasional expectations survey, price expectations are rarely documented. Second, results of empirical tests of rational expectations are fundamentally flawed by the subjective input of the researcher. Subjectivity taints the results of the test, first, in the form of model specification and, second, in the form of the identification of the relevant information set. This study addresses each of these shortcomings. First, crude oil price expectations are recovered in the market for reserves by using a standard engineering model commonly employed in reserves evaluation. Second, the crude oil futures market is used to estimate an index of information. This index circumvents the need to subjectively identify the elements of the information set, removing a key source of subjective input. The results show that agents involved in the crude oil reserves acquisitions market form expectations of future prices in a way that does not conform to the adaptive expectations model.

  3. Measuring health lifestyles in a comparative analysis: theoretical issues and empirical findings.

    Science.gov (United States)

    Abel, T

    1991-01-01

    The concept of lifestyle bears great potential for research in medical sociology. Yet, weaknesses in current methods have restrained lifestyle research from realizing its full potential. The present focus is on the links between theoretical conceptions and their empirical application. The paper is divided into two parts. The first part provides a discussion of basic theoretical and methodological issues. In particular, selected lines of thought from Max Weber are presented, and their usefulness in providing a theoretical frame of reference for health lifestyle research is outlined. Next, a theory-guided definition of the subject matter is introduced, and basic problems in empirical applications of theoretical lifestyle concepts are discussed. In its second part, the paper presents findings from comparative lifestyle analyses. Data from the U.S. and West Germany are utilized to explore issues of measurement equivalence and theoretical validity. Factor analyses indicate high conceptual equivalence for new measures of health lifestyle dimensions in both the U.S. and West Germany. Divisive cluster analyses detect three distinct lifestyle groups in both nations. Implications for future lifestyle research are discussed.

  4. A new powerful non-parametric two-stage approach for testing multiple phenotypes in family-based association studies

    NARCIS (Netherlands)

    Lange, C; Lyon, H; DeMeo, D; Raby, B; Silverman, EK; Weiss, ST

    2003-01-01

    We introduce a new powerful nonparametric testing strategy for family-based association studies in which multiple quantitative traits are recorded and the phenotype with the strongest genetic component is not known prior to the analysis. In the first stage, using a population-based test based on the

  5. Electronic contributions to the transport properties and specific heat of solid UO2: an empirical, self-consistent analysis

    International Nuclear Information System (INIS)

    Hyland, G.J.; Ralph, J.

    1982-07-01

    From an empirical, self-consistent analysis of new high temperature data on the thermo-electric Seebeck coefficient and d.c. electrical conductivity, the value of the free energy controlling the equilibrium of the thermally induced reaction 2U⁴⁺ ⇌ U³⁺ + U⁵⁺ is determined (treating the U³⁺ and U⁵⁺ as small polarons) and used to calculate the contribution of the process to the high temperature thermal conductivity and specific heat of UO₂. It is found that the transport properties can be completely accounted for in this way, but not the anomalous rise in specific heat - the origin of which remains obscure. (U.K.)

  6. Manager’s decision-making in organizations – empirical analysis of bureaucratic vs. learning approach

    OpenAIRE

    Jana Frenová; Daniela Hrehová; Eva Bolfíková

    2010-01-01

    The paper is focused on the study of manager’s decision-making with respect to the basic model of learning organization, presented by P. Senge as a system model of management. On one hand, the empirical research was conducted in connection with key dimensions of organizational learning such as: 1. system thinking, 2. personal mastery, 3. mental models, 4. team learning, 5. building shared vision and 6. dynamics causes. On the other hand, the research was connected with the analysis of the bur...

  7. X-ray spectrum analysis of multi-component samples by a method of fundamental parameters using empirical ratios

    International Nuclear Information System (INIS)

    Karmanov, V.I.

    1986-01-01

    A variant of the fundamental parameter method, based on empirical relations linking the corrections for absorption and additional excitation to the absorbing characteristics of the samples, is suggested. The method is used for X-ray fluorescence analysis of multi-component samples of charges of welded electrodes. It is shown that application of the method is justified only for determination of titanium, calcium and silicon content in charges, taking into account only corrections for absorption. Iron and manganese content can be calculated by the simple external standard method.

  8. Comparative empirical analysis of flow-weighted transit route networks in R-space and evolution modeling

    Science.gov (United States)

    Huang, Ailing; Zang, Guangzhi; He, Zhengbing; Guan, Wei

    2017-05-01

    Urban public transit systems are typical mixed complex networks with dynamic flow, and their evolution should be a process coupling topological structure with flow dynamics, which has received little attention. This paper uses R-space to make a comparative empirical analysis of Beijing’s flow-weighted transit route network (TRN), finding that Beijing’s TRNs in both 2011 and 2015 exhibit scale-free properties. We therefore propose an evolution model driven by flow to simulate the development of TRNs, taking into account the passengers’ dynamical behaviors triggered by topological change. The model treats the evolution of a TRN as an iterative process: at each time step, a certain number of new routes are generated, driven by travel demand, which leads to dynamical evolution of the new routes’ flow and triggers perturbations in nearby routes that in turn affect the next round of route openings. We present a theoretical analysis based on mean-field theory, as well as numerical simulations of this model. The simulation results agree well with our empirical findings, indicating that the model can reproduce TRN evolution with scale-free distributions of node strength and degree. The purpose of this paper is to illustrate the global evolutionary mechanism of transit networks, which can be used to develop planning and design strategies for real TRNs.
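    The scale-free property reported above is the classic signature of network growth with preferential attachment. The paper's model is flow-driven, but a generic Barabási-Albert-style sketch (illustrative only, not the authors' model) shows how attaching new nodes in proportion to degree produces a heavy-tailed degree distribution:

```python
import random

def preferential_attachment(n_nodes, m=2, seed=42):
    """Grow a network where each new node links to m existing nodes,
    chosen with probability proportional to current degree.
    Returns the degree of every node."""
    random.seed(seed)
    degree = [m] * (m + 1)              # small fully connected core
    # Node i appears degree[i] times in `targets`, so a uniform draw
    # from `targets` is a degree-proportional draw over nodes.
    targets = [i for i in range(m + 1) for _ in range(m)]
    for new in range(m + 1, n_nodes):
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choice(targets))
        degree.append(m)
        for t in chosen:
            degree[t] += 1
            targets.append(t)
        targets.extend([new] * m)
    return degree

deg = preferential_attachment(2000)
print(f"max degree: {max(deg)}, min degree: {min(deg)}")
```

Most nodes keep a degree near m while a few hubs grow far larger, the heavy tail that scale-free tests look for in real TRN data.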

  9. Endangering of Businesses by the German Inheritance Tax? – An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Henriette Houben

    2011-04-01

    Full Text Available This contribution addresses the substantial tax privilege for businesses introduced by the German Inheritance Tax Act 2009. Advocates of a vast or even complete tax exemption for businesses stress the potential damage of the inheritance tax to businesses, as these often lack the liquidity to meet the tax liability. This submission tackles the issue empirically, based on data from the German Inheritance Tax Statistics and the SOEP. The results indicate that former German inheritance tax law did not endanger transferred businesses. Hence, there is no need for the tremendous tax privilege for businesses in current German inheritance tax law. An alternative flat inheritance tax without tax privileges, which achieves revenue neutrality per tax class relative to current law, would in some cases produce relatively high tax loads that might trouble businesses.

  10. Competition policies and environmental quality: Empirical analysis of the electricity sector in OECD countries

    International Nuclear Information System (INIS)

    Asane-Otoo, Emmanuel

    2016-01-01

    Over the last decades, electricity markets across OECD countries have been subjected to profound structural changes with far-reaching implications for the economy and the environment. This paper investigates the effect of restructuring – changes in entry regulations, the degree of vertical integration and ownership structure – on GHG emissions. The findings show that competition policies – particularly reducing the degree of vertical integration and increasing privatization – correlate negatively with emission intensity. However, the environmental effect of reducing market entry barriers is generally insignificant. Integration of competition and stringent environmental policies is required to reduce GHG emissions and improve environmental quality. - Highlights: •Empirical study on competition policies and GHG emissions from the electricity sector. •Product market regulation scores for OECD countries are used to measure the extent of competition. •Evidence of a positive relationship between competition policies and environmental quality. •Integration of competition and stringent environmental policies is recommended.

  11. The impact of e-ticketing technique on customer satisfaction: an empirical analysis

    Directory of Open Access Journals (Sweden)

    Mazen Kamal Qteishat

    2015-09-01

    Full Text Available In recent years, internet technology has become the information and communication technology most used by organizations: it can ease transactions and strengthen the relationship between companies and customers. This investigation empirically examines the impact of the e-ticketing technique on customer satisfaction; a convenience sample of Jordanian airline passengers who had booked flights in the last 12 months through companies offering e-ticketing services was acquired. The findings indicate that customer satisfaction with e-ticketing services was influenced by all of the independent variables measured: data security, customer and technical support, and user-friendliness each had a significant impact on customer satisfaction with e-ticketing services.

  12. Empirical correction of crosstalk in a low-background germanium γ-γ analysis system

    International Nuclear Information System (INIS)

    Keillor, M.E.; Erikson, L.E.; Aalseth, C.E.; Day, A.R.; Fuller, E.S.; Glasgow, B.D.; Hoppe, E.W.; Hossbach, T.W.; Mizouni, L.K.; Myers, A.W.

    2013-01-01

    The Pacific Northwest National Laboratory (PNNL) is currently developing a custom software suite capable of automating many of the tasks required to accurately analyze coincident signals within gamma spectrometer arrays. During the course of this work, significant crosstalk was identified in the energy determination for spectra collected with a new low-background intrinsic germanium (HPGe) array at PNNL. The HPGe array is designed for high detection efficiency, ultra-low-background performance, and sensitive γ-γ coincidence detection. The first half of the array, a single cryostat containing seven HPGe crystals, was recently installed into a new shallow underground laboratory facility. This update will present a brief review of the germanium array, describe the observed crosstalk, and present a straight-forward empirical correction that significantly reduces the impact of this crosstalk on the spectroscopic performance of the system. (author)

  13. DOES ENERGY CONSUMPTION VOLATILITY AFFECT REAL GDP VOLATILITY? AN EMPIRICAL ANALYSIS FOR THE UK

    Directory of Open Access Journals (Sweden)

    Abdul Rashid

    2013-10-01

    Full Text Available This paper empirically examines the relation between energy consumption volatility and unpredictable variations in real gross domestic product (GDP) in the UK. Estimating a Markov switching ARCH model, we find significant regime switching in the behavior of both energy consumption and GDP volatility. The results from the Markov regime-switching model show that the variability of energy consumption plays a significant role in determining the behavior of GDP volatility. Moreover, the results suggest that the impact of unpredictable variations in energy consumption on GDP volatility is asymmetric, depending on the intensity of the volatility. In particular, we find that while there is no significant contemporaneous relationship between energy consumption volatility and GDP volatility in the first (low-volatility) regime, GDP volatility is significantly positively related to the volatility of energy utilization in the second (high-volatility) regime.
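    The regime-switching idea can be illustrated with a toy two-state Markov chain for volatility: shocks drawn in the high-volatility state have markedly larger variance than those in the low-volatility state. This is only a simulation sketch, not the Markov switching ARCH estimation used in the paper; the transition probabilities and regime volatilities are invented:

```python
import random

random.seed(3)
trans = {0: 0.95, 1: 0.10}   # probability of being in state 0 next period
sigma = {0: 0.5, 1: 2.0}     # low- and high-volatility regimes

state, states, shocks = 0, [], []
for _ in range(5000):
    states.append(state)
    shocks.append(sigma[state] * random.gauss(0.0, 1.0))
    state = 0 if random.random() < trans[state] else 1

def regime_variance(s):
    """Sample variance of the shocks observed while in regime s."""
    xs = [x for x, st in zip(shocks, states) if st == s]
    return sum(x * x for x in xs) / len(xs)

print(f"low-regime variance: {regime_variance(0):.2f}, "
      f"high-regime variance: {regime_variance(1):.2f}")
```

An estimator such as MS-ARCH works in the reverse direction: given only the shocks, it infers the hidden regime sequence and the regime-specific parameters.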

  14. Beyond the use of food supplements: An empirical analysis in Italy

    Directory of Open Access Journals (Sweden)

    A. LOMBARDI

    2016-03-01

    Full Text Available This paper aims to profile Italian food supplements used by consumers based upon their psychometric patterns and demographic characteristics. The FTNS scale is used to assess empirically and evaluate the role of technophobic/technophilic consumer traits in determining the decision whether or not to consume supplements and vitamins and the frequency of their consumption.An ad-hoc survey was carried out in 2012 involving 400 residents of a metropolitan area in southern Italy. Our results show that women have a higher consumption frequency of dietary supplements, while age, BMI and education influence the propensity to consume. As regards food habits, the propensity to use dietary supplements is positively associated to the consumption of bread and pasta, red meat and pulses, and negatively with the consumption of fruit and cheese.Finally, the research supports the role of technophobic traits as consistent and significant determinants of the consumption frequency of dietary supplements.

  15. An empirical analysis of gasoline price convergence for 20 OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Bentzen, J.

    2003-07-01

    Two decades have now passed since the oil price shocks of the 1970s, and energy prices have since - apart from short periods of price instability - evolved relatively smoothly in the industrialized countries. Energy taxes in many countries differ markedly, thereby causing differences in final energy prices, but as similar tax levels become more common, e.g. in the European Union, convergence of energy prices might be expected to appear. In the present paper, national gasoline price data covering the period since the 1970s for a sample of OECD countries are used to test this often-addressed convergence hypothesis. The empirical part of the paper applies different time-series-based tests of convergence; gasoline prices exhibit convergence for most OECD-Europe countries when prices are measured in US$, indicating that a convergence or tax harmonization process is taking place for these countries. (au)
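    One common time-series test of convergence is a unit-root (Dickey-Fuller) test on the price differential between two countries: if the differential is stationary, prices converge. The abstract does not specify the exact tests used, so the following is a minimal no-constant Dickey-Fuller sketch applied to a simulated, mean-reverting log-price gap:

```python
import numpy as np

def df_tstat(series):
    """t-statistic of rho in the no-constant Dickey-Fuller regression
    diff(d)[t] = rho * d[t-1] + e[t]. Large negative values reject a
    unit root, i.e. support convergence of the price differential."""
    d = np.asarray(series, dtype=float)
    y, x = np.diff(d), d[:-1]
    rho = (x @ y) / (x @ x)
    resid = y - rho * x
    sigma2 = (resid @ resid) / (len(y) - 1)
    return rho / np.sqrt(sigma2 / (x @ x))

rng = np.random.default_rng(1)
T = 360  # roughly monthly data since the 1970s
gap = np.zeros(T)  # mean-reverting log-price gap between two countries
for t in range(1, T):
    gap[t] = 0.8 * gap[t - 1] + rng.normal(0.0, 0.05)

print(f"DF t-statistic: {df_tstat(gap):.2f}")  # strongly negative here
```

In practice one would reach for a full implementation such as `adfuller` in statsmodels, which also handles lag augmentation, deterministic terms and proper critical values.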

  16. Factors Affecting the Adoption of Mobile Payment Systems: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    İkram Daştan

    2016-02-01

    Full Text Available The world has witnessed rapid growth in e-commerce in recent years, and the widespread use of mobile devices in e-commerce has played a role in this growth. Alongside the growing trading volume and the introduction of new devices, new products and solutions for online payments continue to emerge and diversify. Consumer attitudes and behaviors may change in response to these developments. The purpose of this study is to investigate the factors affecting consumers' adoption of mobile payment systems (MPS). 225 individuals were surveyed online through a convenience sampling method. A research model was developed, and the proposed relationships were tested using structural equation modeling. The empirical findings indicate that perceived trust, perceived mobility and attitudes positively affect the adoption of MPS, while perceived usefulness and perceived ease of use have no effect on adoption of MPS. Furthermore, perceived reputation is positively related to perceived trust, and environmental risk is negatively related to perceived trust.

  17. Empirical Analysis of Stochastic Volatility Model by Hybrid Monte Carlo Algorithm

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2013-01-01

    The stochastic volatility model is one of several volatility models used to infer the latent volatility of asset returns. Bayesian inference for the stochastic volatility (SV) model is performed with the hybrid Monte Carlo (HMC) algorithm, which is superior to other Markov chain Monte Carlo methods in sampling volatility variables. We perform HMC simulations of the SV model for two liquid stocks traded on the Tokyo Stock Exchange and measure the volatilities of their returns. We then calculate the accuracy of the volatility measurement using the realized volatility as a proxy for the true volatility, and compare the SV model with the GARCH model, another common volatility model. Using the accuracy calculated with the realized volatility, we find that the SV model empirically performs better than the GARCH model.
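    For context on the GARCH comparator mentioned above, the GARCH(1,1) conditional-variance recursion sigma2[t] = omega + alpha*r[t-1]^2 + beta*sigma2[t-1] can be filtered in a few lines. The parameter values below are invented for illustration and are not the paper's estimates:

```python
import math
import random

def garch_filter(returns, omega, alpha, beta):
    """Conditional variance recursion of a GARCH(1,1) model:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = [omega / (1.0 - alpha - beta)]  # start at unconditional variance
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

# Simulate returns from the same process (parameters are illustrative).
random.seed(7)
omega, alpha, beta = 0.05, 0.10, 0.85
returns, s2 = [], omega / (1.0 - alpha - beta)
for _ in range(1000):
    returns.append(math.sqrt(s2) * random.gauss(0.0, 1.0))
    s2 = omega + alpha * returns[-1] ** 2 + beta * s2

vol = garch_filter(returns, omega, alpha, beta)
print(f"mean conditional variance: {sum(vol) / len(vol):.2f}")
```

Unlike the SV model, whose latent volatility must be sampled (here via HMC), the GARCH variance is a deterministic function of past returns, which is why its likelihood can be evaluated by this simple filter.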

  18. An empirical analysis on the adoption of alternative fuel vehicles: The case of natural gas vehicles

    International Nuclear Information System (INIS)

    Yeh, Sonia

    2007-01-01

    The adoption of alternative fuel vehicles (AFVs) has been regarded as one of the most important strategies for addressing the issues of energy dependence, air quality and, more recently, climate change. Despite decades of effort, we still face daunting challenges in promoting wider acceptance of AFVs by the general public. More empirical analyses are needed to understand the technology adoption process associated with different market structures, the effectiveness of regulations and incentives, and the density of infrastructure adequate to reach sustainable commercial application. This paper compares the adoption of natural gas vehicles (NGVs) in eight countries: Argentina, Brazil, China, India, Italy, New Zealand, Pakistan, and the US. It examines the major policies aimed at promoting the use of NGVs, the instruments for implementing those policies and targeting likely stakeholders, and a range of factors that influence the adoption of NGVs. The findings of this paper should be applicable to other AFVs.

  19. International Direct Investment and Transboundary Pollution: An Empirical Analysis of Complex Networks

    Directory of Open Access Journals (Sweden)

    Yuping Deng

    2015-04-01

    Full Text Available Using complex networks and spatial econometric methods, we empirically test the extent to which a country’s influence and its position in the international investment network affect environmental quality, as well as the country’s role in transboundary pollution transfer. The estimated results show that the ties connecting nodes in the international investment network have significant impacts on global environmental pollution. Additionally, node linkages between developing countries have stronger negative effects on environmental quality than node linkages between developed countries. Moreover, greater node importance and node centrality accelerate the speed and scale of the growth of polluting industries, which allows developed countries to more easily transfer their pollution-intensive industries to developing countries with higher node dependency. We also find that the factor endowment effect coexists with the pollution haven effect, and that the effects of environmental regulation advantage in the international investment network are greater than those of factor endowment advantage.
